Monday, January 6th 2020

NVIDIA Announces New G-SYNC Esports Displays with 360 Hz Refresh-rate

In the world of competitive gaming, where mere milliseconds can make the difference between victory and defeat, NVIDIA today unveiled new G-SYNC displays with a 360 Hz refresh rate, providing esports enthusiasts and competitive gamers with the fastest gaming displays ever made. At 360 Hz, game frames are displayed once every 2.8 ms—up to 6X faster than traditional gaming displays and TVs.
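As a quick sanity check on those numbers, here is a minimal Python sketch of the frame-interval arithmetic (treating "traditional gaming displays and TVs" as 60 Hz panels is an assumption):

```python
# Frame interval at a given refresh rate, and the speed-up versus an assumed 60 Hz baseline.
def frame_interval_ms(refresh_hz: float) -> float:
    return 1000.0 / refresh_hz

for hz in (60, 144, 240, 360):
    print(f"{hz:>3} Hz -> {frame_interval_ms(hz):5.2f} ms "
          f"({frame_interval_ms(60) / frame_interval_ms(hz):.1f}x faster than 60 Hz)")
# 360 Hz -> 2.78 ms per frame, i.e. the quoted ~2.8 ms and 6x versus a 60 Hz display.
```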

NVIDIA and ASUS are showcasing the world's first G-SYNC 360Hz display--the ASUS ROG Swift 360--at this week's Consumer Electronics Show (CES) in Las Vegas. The display, which will be available later this year, pairs perfectly with GeForce RTX, the world's fastest gaming GPUs, to deliver the absolute best competitive gaming experience.
The esports and competitive gaming communities continue to expand at a phenomenal rate. More than 60% of GeForce gamers worldwide play competitive games every month and $211M was awarded as esports prize money in 2019, representing a 29% jump from the previous year. Esports viewership has skyrocketed as well, with more than 450M gamers tuning in to watch competitive tournaments played in real time.

NVIDIA G-SYNC--Driving Innovation and Advancing Gaming
First introduced in 2013, NVIDIA G-SYNC is best known for its innovative Variable Refresh Rate (VRR) technology that eliminates tearing by synchronizing the refresh rate of the display with the GPU's frame rate. Since then, G-SYNC processors have added new display technologies to accelerate esports panels, including dynamic overdrive to enhance player perception, custom-tuned firmware to improve image quality for better target acquisition, and now 360 Hz refresh rates for rapid reaction time. While esports pros previously accepted frame tearing on their display to avoid waiting for the next frame to update, with G-SYNC's 360 Hz and VRR technologies, frames now refresh in less than 3 ms, so esports pros can get both tear-free frames and incredibly low latency.
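As a rough illustration of the VRR idea described above (a toy model only, not NVIDIA's actual G-SYNC implementation; the 30-360 Hz refresh window is an assumed example), the display starts its next refresh when the GPU delivers a frame, clamped to the panel's supported range:

```python
# Toy model of variable refresh rate: the display refreshes when a frame is ready,
# clamped between the panel's fastest and slowest supported refresh intervals.
MIN_INTERVAL_MS = 1000 / 360  # fastest the panel can refresh (360 Hz), assumed example
MAX_INTERVAL_MS = 1000 / 30   # slowest before the panel must refresh anyway, assumed example

def next_refresh_delay(frame_render_time_ms: float) -> float:
    """Delay until the next refresh, given how long the GPU took on this frame."""
    return min(max(frame_render_time_ms, MIN_INTERVAL_MS), MAX_INTERVAL_MS)

for render_ms in (2.0, 4.2, 9.7, 50.0):
    print(f"GPU frame time {render_ms:5.1f} ms -> refresh after "
          f"{next_refresh_delay(render_ms):5.2f} ms (no tearing, no fixed-clock wait)")
```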

Designed for Esports Pros—the ASUS ROG Swift 360
Developed in conjunction with NVIDIA, the ASUS ROG Swift 360 is specifically designed for esports competitions and features a 24.5 inch form factor to keep every pixel of action in the field of view. The ASUS ROG Swift 360 features ASUS's new sleek ROG finish that feels at home on the grand stage at any esports event as well as:
  • Blistering Fast Refresh Rate: 360Hz delivers crystal clear visuals, extremely low system latency for faster reaction times, and the smoothest motion to keep pros on target.
  • Designed for Esports: Play how the pros do and never miss a critical moment with the highest performance 24.5 inch Full HD display.
  • Superior Clarity: No distracting tearing, stuttering, flicker, or artifacts with NVIDIA G-SYNC VRR technology.
  • Quality Certified by NVIDIA: All G-SYNC certified displays undergo a rigorous validation process and are subjected to 300 image quality tests to ensure they deliver consistent quality and maximum performance.
NVIDIA will be demonstrating the new 360 Hz G-SYNC display at its press suite at the Wynn hotel during CES this week, featuring a set of esports demos that show how 360 Hz improves gamers' performance in target acquisition, aiming, and perception.

34 Comments on NVIDIA Announces New G-SYNC Esports Displays with 360 Hz Refresh-rate

#1
XL-R8R
No one cares.

If you DO care - well done to you - you are in the 0.002% of potential users and should buy this (albeit innovative in some way or form) trash product.

FreeSync and its open-source nature is where the future is for me and many others; not because nVidia are baddies, but, simply because the price of G-Sync/the implementation of the tech is killing off the value of monitors with this 'feature' included.
#2
EzioAs
XL-R8R: No one cares.
I'm sure some people do.
#3
64K
XL-R8R: you are in the 0.002% of potential users.......
I don't know how many gamers are involved in competitive esports, but I doubt it's much as a percentage of the PC gamers buying monitors. That's why I think these monitors will be very expensive and probably won't drop much over the years: ASUS can't manufacture them in quantity to reduce costs.
#4
XL-R8R
EzioAs: I'm sure some people do.
Are you sure? :roll:

It sounds all well and good to own a monitor with a refresh rate of 285729hz.... but driving that fps/hz range is another story altogether, and even the best GPU today will come nowhere near the frame rates this monitor is capable of refreshing at.

So based on this, I can only deduce that idiots, dumb kids OR the ill-informed will care.... anyone who knows anything - or is remotely sane/sensible - will avoid this product like the trash it actually is. :fear:
#5
ZoneDymo
yeah this won't make a difference for anyone sooo yeah.

also: NVIDIA G-SYNC is best known for its innovative Variable Refresh Rate (VRR) technology that eliminates tearing by synchronizing the refresh rate of the display with the GPU's frame rate.

I'm pretty sure Nvidia G-Sync is... ONLY... known for that, because that is what it does... it's like saying Firefox is best known for browsing the web.

Lastly, would you not love to interview the ASUS rep and ask, "soo this is 360hz huh? makes that new "flagship" monitor you are going to release soon with a measly 144hz look pretty bad, doesn't it?"
#6
EzioAs
XL-R8R: Are you sure? :roll:
Yes
#7
GlacierNine
64K: I don't know how many gamers are involved in competitive esports, but I doubt it's much as a percentage of the PC gamers buying monitors. That's why I think these monitors will be very expensive and probably won't drop much over the years: ASUS can't manufacture them in quantity to reduce costs.
If gaming wasn't a *huge* and growing market, there wouldn't be so many gaming products. Sure, it's saturated, but the people are out there to buy the products.

The real issue with this monitor is that the hardware doesn't exist to be able to leverage it. Hell, the **software** doesn't exist to be able to leverage it - Doom 2016 has a hard cap of 200FPS. Overwatch has a hard cap of 300FPS, in the engine, and it can't be surpassed. 360Hz is pointless in those titles. The frames don't exist to justify the Hz.

We're now at the point where not only can these framerates not be achieved with present-day hardware in a lot of games, they also can't be achieved with future hardware because the games themselves are limited. That's the real reason this Hz race bullshit is a waste of time - not only does it provide a tiny, dubiously useful benefit, but also it *cannot be utilised*.
#8
XL-R8R
GlacierNine: We're now at the point where not only can these framerates not be achieved with present-day hardware in a lot of games, they also can't be achieved with future hardware because the games themselves are limited. That's the real reason this Hz race bullshit is a waste of time - not only does it provide a tiny, dubiously useful benefit, but also it *cannot be utilised*.
Entirely this.

I also suspect the above will end up being the most precise/best post of this entire thread.
#9
oobymach
I've gone from 60 Hz to 144 Hz to 240 Hz and yes, there is a NOTICEABLE difference between them; those claiming it's bull are probably among those who haven't tried higher refresh rate monitors. I'm more looking forward to 240 Hz 4K, but to each his own. G-Sync and FreeSync are the respective GPU vendors' vsync without using vsync, meaning compatibility with that GPU vendor out of the box: basically tear-free video and gaming without using vsync. Also, there is a big difference between 144 Hz and 240 Hz, so I assume 360 will be that much faster.

Just because your GPU doesn't get 150+ fps doesn't mean you won't immediately notice a difference with higher refresh monitors in both gaming and video (you will, unless you're blind).
#10
Vya Domus
ZoneDymo: yeah this won't make a difference for anyone sooo yeah.
Not to you but it does to the masses of delusional "pro gamers" who are convinced it's the difference between 290 fps and 300 fps which makes them miss the shot and not their skill. You know, the jet fighter pilots who missed their career and are now playing Fortnite.

Truth is, people are going to gobble up these many-hundred-frames-per-second displays in no time. What matters is whether you think it makes a difference, not whether it really does.
#11
Vayra86
XL-R8R: No one cares.

If you DO care - well done to you - you are in the 0.002% of potential users and should buy this (albeit innovative in some way or form) trash product.

FreeSync and its open-source nature is where the future is for me and many others; not because nVidia are baddies, but, simply because the price of G-Sync/the implementation of the tech is killing off the value of monitors with this 'feature' included.
:toast:
GlacierNine: If gaming wasn't a *huge* and growing market, there wouldn't be so many gaming products. Sure, it's saturated, but the people are out there to buy the products.

The real issue with this monitor is that the hardware doesn't exist to be able to leverage it. Hell, the **software** doesn't exist to be able to leverage it - Doom 2016 has a hard cap of 200FPS. Overwatch has a hard cap of 300FPS, in the engine, and it can't be surpassed. 360Hz is pointless in those titles. The frames don't exist to justify the Hz.

We're now at the point where not only can these framerates not be achieved with present-day hardware in a lot of games, they also can't be achieved with future hardware because the games themselves are limited. That's the real reason this Hz race bullshit is a waste of time - not only does it provide a tiny, dubiously useful benefit, but also it *cannot be utilised*.
:toast:

Next!
#12
jabbadap
After that NVIDIA announcement opening G-Sync up to DP VRR, does this monitor work with cards using VESA's Adaptive-Sync standard, like AMD's Radeons with FreeSync or upcoming Intel Adaptive Sync?

Edit: still, though, not really interested in this one unless it's IPS or even VA. The perfect monitor for my use case would be something like a 24" 1080p ~120 Hz flat IPS monitor with some VRR tech.
#13
Sybaris_Caesar
After Nvidia embraced FreeSync and claimed to open up G-Sync, I think there's viable competition in the market now. Instead of choosing among a select few, people can buy according to their budget, just like AMD's resurgence made the CPU market more exciting.

There's still some place for G-Sync modules in the market. I won't make a brain-dead statement like "G-Sync is better/faster than FreeSync," but it has one feature everyone should want: variable overdrive.

As for this monitor, I don't particularly dig ASUS's overpriced ROG lineup. And I feel like they're overclocking the hell out of AUO's native 240 Hz TN panel, so I'll have to look at reviews of its performance.
#14
Xaled
oobymach: I've gone from 60 Hz to 144 Hz to 240 Hz and yes, there is a NOTICEABLE difference between them; those claiming it's bull are probably among those who haven't tried higher refresh rate monitors. I'm more looking forward to 240 Hz 4K, but to each his own. G-Sync and FreeSync are the respective GPU vendors' vsync without using vsync, meaning compatibility with that GPU vendor out of the box: basically tear-free video and gaming without using vsync. Also, there is a big difference between 144 Hz and 240 Hz, so I assume 360 will be that much faster.

Just because your GPU doesn't get 150+ fps doesn't mean you won't immediately notice a difference with higher refresh monitors in both gaming and video (you will, unless you're blind).
There's a huge difference between going from 60 to 120 and going from 120 to 240.
The first is obvious and clearly noticeable when switching between 60 and 120.
Between 120 and 240, however, you need to spend a little time at 240 and use the mouse to feel the difference.
#15
IceShroom
Do game servers update every 2.8 ms?
#16
GlacierNine
IceShroom: Do game servers update every 2.8 ms?
Sometimes it's not about the server. Overwatch, for example, has 64-tick servers, so the updates come every 15.625 ms.

Thing is, Overwatch's input lag is affected by the render thread, so if you're getting 300 FPS, you'll have noticeably lower input lag than if you were capped at 120 FPS. You can see this in-game by hitting CTRL+SHIFT+N to view statistics. The SIM value at 300 FPS will be around 7 ms, while at 120 FPS it'll be more like 14 ms.

SIM isn't a measure of input lag in any raw sense, but if that value is higher your input lag is also higher, so the fact FPS affects it directly means we can also verify that nobody is imagining things when they say Overwatch feels more responsive at extremely high FPS, even on monitors that cannot reproduce that many frames.

Again though, this means jack shit in terms of buying a 360Hz monitor - the Overwatch engine is hard capped at 300FPS, and even if it weren't, the input lag reductions I just described don't care about whether you can actually see these frames you're rendering - they only care that the frames have been rendered.
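For the numbers discussed above, a minimal sketch of the tick-interval and frame-interval arithmetic (the SIM figures are simply the rough values quoted in this post, not derived or measured here):

```python
# Server tick interval vs. client frame interval (illustrative arithmetic only).
def interval_ms(rate_hz: float) -> float:
    return 1000.0 / rate_hz

print(f"64-tick server: one update every {interval_ms(64):.3f} ms")  # 15.625 ms
print(f"120 fps client: one frame every {interval_ms(120):.2f} ms")  # ~8.33 ms
print(f"300 fps client: one frame every {interval_ms(300):.2f} ms")  # ~3.33 ms

# Rough SIM values quoted above (not computed from anything here): ~14 ms at 120 fps
# vs. ~7 ms at 300 fps, i.e. a higher client frame rate cuts the simulation/input part
# of latency even when the monitor can't display every rendered frame.
SIM_QUOTED_MS = {120: 14.0, 300: 7.0}
```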
#17
Th3pwn3r
XL-R8R: No one cares.

If you DO care - well done to you - you are in the 0.002% of potential users and should buy this (albeit innovative in some way or form) trash product.

FreeSync and its open-source nature is where the future is for me and many others; not because nVidia are baddies, but, simply because the price of G-Sync/the implementation of the tech is killing off the value of monitors with this 'feature' included.
Says the guy who opened the thread and then posted in it.
#18
AlienIsGOD
Vanguard Beta Tester
Th3pwn3r: Says the guy who opened the thread and then posted in it.
People will look for a reason to complain/bitch about anything :P
#19
GlacierNine
AlienIsGOD: People will look for a reason to complain/bitch about anything :p
There are valid reasons to consider this race for more Hz a pointless exercise in marketing wank. This isn't simply "I don't want it, therefore it's worthless". It's "People will buy this believing that it's better when it isn't, and those people are being taken advantage of".

Voicing displeasure at the existence of predatory marketing tactics like this is something that should be encouraged, because when consumers are empowered to demand real improvements in products, rather than being told that some meaningless number increase is a big deal, that translates into a marketplace where user experience improves more quickly and in bigger steps, rather than a marketplace where prices increase every year despite performance remaining equivalent.

You of all people should know this, as someone clearly quite proud of supporting AMD - you know better than most that consumers didn't shout loudly enough at Intel for improved products and real technological advances, and as a result the consumer got stiffed on performance improvements and core-count increases for 10 years. This really isn't that different - the consumer should demand a real improvement instead of a box-ticking exercise with a marginally higher number than last year.
#20
Unregistered
Need 4 × 144 Hz for a total of 576 Hz refresh minimum; anything under that is just a stepping stone. That will be 4 display refreshes per frame at 144 FPS, still not ideal but better.

Now... people may need a few generations for our brains to evolve to reduce our response time, but that's outside the scope of this exercise.
#21
Slizzo
XL-R8R: No one cares.

If you DO care - well done to you - you are in the 0.002% of potential users and should buy this (albeit innovative in some way or form) trash product.

FreeSync and its open-source nature is where the future is for me and many others; not because nVidia are baddies, but, simply because the price of G-Sync/the implementation of the tech is killing off the value of monitors with this 'feature' included.
Supposedly, since NVIDIA announced that G-Sync monitors would be compatible with VESA VRR, all new monitors coming to market that have G-Sync will be open to all. If this monitor is the first of those, then you're pissing into the wind.

If it's not, no big loss. The panel isn't made by NVIDIA, just certified by them. It will make it to non-G-Sync monitors in time.
#22
seccentral
Dudes, in all seriousness, what combo of CPU/GPU and esports game can do 360 fps constantly?
No big esports streamer, or casual streamer with a 2080 Ti and 9900KS in their description who enables some sort of metrics on stream, can pull that. Most of them can, however, pull 144 constantly in titles like PUBG/Fortnite, and please don't tell me this monitor is built for LoL; potato PCs can play that at decent fps.
This is just marketing bravado.
"We can do 360 Hz, that's how awesome we are."
It doesn't even make sense anymore; there's that principle called diminishing returns.
Why not re-enable quad SLI then? Somebody will want to quad-pair RTX Titans for the lulz. (Linus, you know you would.)
#23
efikkan
GlacierNine: The real issue with this monitor is that the hardware doesn't exist to be able to leverage it. Hell, the **software** doesn't exist to be able to leverage it - Doom 2016 has a hard cap of 200FPS. Overwatch has a hard cap of 300FPS, in the engine, and it can't be surpassed. 360Hz is pointless in those titles. The frames don't exist to justify the Hz.

We're now at the point where not only can these framerates not be achieved with present-day hardware in a lot of games, they also can't be achieved with future hardware because the games themselves are limited. That's the real reason this Hz race bullshit is a waste of time - not only does it provide a tiny, dubiously useful benefit, but also it *cannot be utilised*.
At some point it does become pointless, but many people keep forgetting that the entire chain of latency looks like this:
- OS/hardware input (~2 ms to 30 ms)
- Engine - game loop: fixed tick rate, anywhere from 30 Hz to 200 Hz or more, plus any overhead in the engine
- Engine - rendering (60 Hz => 16.7 ms, 144 Hz => 6.9 ms, 360 Hz => 2.8 ms), plus some driver overhead
- Optional Vsync
- Monitor input lag (~1 ms to 30 ms)
- Monitor pixel response
In total, this usually adds up to 30-50 ms or worse; a rough sketch of the arithmetic follows below.
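A minimal sketch of that sum, using midpoints picked from the ranges above (the specific values are assumed examples, not measurements):

```python
# Rough end-to-end latency budget, summing the stages listed above.
# All numbers are illustrative midpoints chosen from the quoted ranges, not measurements.
stages_ms = {
    "OS/hardware input":       8.0,            # ~2-30 ms
    "game loop (64 Hz tick)":  1000 / 64 / 2,  # on average, half a tick of waiting
    "rendering (144 fps)":     1000 / 144,     # one frame time
    "monitor input lag":       5.0,            # ~1-30 ms
    "pixel response":          3.0,
}
for stage, ms in stages_ms.items():
    print(f"{stage:<24} {ms:5.1f} ms")
print(f"{'total':<24} {sum(stages_ms.values()):5.1f} ms")  # ~31 ms, the low end of the 30-50 ms figure
```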

But latency is only one of two important aspects of this; the other is smoothness of animation. Quite often, the main bottleneck of a game engine is the game simulation (game loop), which can run at a really low tick rate. If you run a higher frame rate than tick rate (e.g. a 60 Hz tick rate at 120 FPS), your GPU will just produce multiple identical frames, so you're basically wasting a lot of power on rendering the same thing only to gain a tiny bit of shorter latency. Since there are diminishing returns for latency both for rendering and for game loops, it will get to a point where advancing it becomes pointless.
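A minimal sketch of that duplicate-frame point, assuming a fixed tick rate and no interpolation between ticks (many engines do interpolate, in which case frames are not literally identical):

```python
# Fraction of rendered frames that show no new simulation state when the frame
# rate exceeds the game-loop tick rate (no interpolation assumed).
def duplicate_frame_share(tick_rate_hz: float, fps: float) -> float:
    if fps <= tick_rate_hz:
        return 0.0
    return 1.0 - tick_rate_hz / fps

for tick, fps in ((60, 120), (64, 300), (200, 360)):
    print(f"tick {tick} Hz @ {fps} fps -> {duplicate_frame_share(tick, fps):.0%} duplicate frames")
```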

I think the appropriate cut-off point for fast-paced games would be around 200 Hz; anything beyond that just gets negligible. Then it's better to focus the remaining resources on lowering the latency of the remaining parts of the chain. The first step, the OS and hardware input, is often overlooked; polling events from the OS can be quite unpredictable in terms of latency.

Also, what's up with all these "stupid" refresh rates? Why can't they standardize on round numbers? I really dislike 60 Hz, 120 Hz, 144 Hz, 165 Hz, 240 Hz, and 360 Hz; none of these are easy to time precisely from software. I really wish they used refresh rates like 50, 100, 200, etc.
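To illustrate that timing point, a small sketch that checks which refresh rates give a whole number of milliseconds per frame (using whole milliseconds as the criterion for "easy to time" is an assumption here):

```python
from fractions import Fraction

# Frame period for common refresh rates; only rates that divide 1000 evenly
# (50, 100, 125, 200, 250, 500...) land on a whole number of milliseconds.
for hz in (50, 60, 100, 120, 144, 165, 200, 240, 360):
    period_ms = Fraction(1000, hz)
    note = "whole ms" if period_ms.denominator == 1 else "fractional ms"
    print(f"{hz:>3} Hz -> {float(period_ms):8.4f} ms per frame [{note}]")
```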
IceShroom: Do game servers update every 2.8 ms?
Depends on the game, but in most cases no.
#24
semantics
People really hate progress? If someone wants a higher refresh rate, why hate? Personally, I don't like this monitor, not for its 360 Hz but because it's 1080p and a TN panel. I have my tradeoffs, you have yours; I want higher resolution and better color accuracy across wide viewing angles, and refresh rate only matters up to about 120 Hz for me because I don't play competitive games that much.

Zeesh, you might as well berate anyone that sells DACs that do more than 24-bit, 48 kHz audio, which is everyone, even though it's really pointless.
#25
Camper7
More than 144 Hz refresh rate is just wasted. Why do they produce this at all?