Thursday, October 15th 2015

Acer Intros the Predator X34 Curved G-SYNC Monitor

Acer announced its latest premium gaming display, the Predator X34. This 34-inch curved display offers a resolution of 3440 x 1440 pixels and features NVIDIA G-SYNC adaptive refresh-rate technology. The panel supports a native refresh rate of 60 Hz, with up to 100 Hz via overclocking. The monitor features ZeroFrame, an Acer innovation for multi-monitor setups that makes bezel compensation easier. The display takes input via DisplayPort 1.2 (required for G-SYNC, alongside an NVIDIA GPU) and HDMI 1.4. Its tripod stand allows tilt adjustment. Other features include 7 W stereo speakers with DTS surround, and a 4-port USB 3.0 hub. Acer is expected to price this display at US $1,299.
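As background on how the adaptive refresh works: instead of scanning out at a fixed cadence, a G-SYNC panel refreshes when the GPU delivers a frame, clamped to the panel's supported refresh window. A rough, purely illustrative sketch in plain Python (the 30 Hz floor is an assumption for the example, not a published spec for this panel):

```python
# Illustrative sketch of variable-refresh clamping (not vendor code).
# A VRR panel refreshes when the GPU delivers a frame, as long as the
# interval falls inside the panel's supported refresh window.

def refresh_interval_ms(gpu_fps, min_hz=30, max_hz=100):
    """Return the interval at which the panel actually refreshes.

    min_hz/max_hz are assumed example limits (100 Hz = the X34's
    overclocked ceiling; 30 Hz floor is hypothetical).
    """
    frame_ms = 1000.0 / gpu_fps          # GPU frame delivery interval
    fastest = 1000.0 / max_hz            # e.g. 10 ms at the 100 Hz OC
    slowest = 1000.0 / min_hz            # below this, frames re-shown
    return min(max(frame_ms, fastest), slowest)

for fps in (45, 75, 120):
    print(fps, "fps ->", round(refresh_interval_ms(fps), 2), "ms")
```

At 45 or 75 fps the panel simply tracks the GPU; at 120 fps the interval clamps to the 10 ms (100 Hz) ceiling.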

38 Comments on Acer Intros the Predator X34 Curved G-SYNC Monitor

#1
MxPhenom 216
ASIC Engineer
My friend just got the freesync version of this monitor and it is beautiful. Watched him play Battlefront beta on it, and I was in awe the whole time. Sooooooooo silky smooth. Freesync and gsync are something else.
#2
PLAfiller
That is SO cool. I wonder if curved monitors are as awesome as they look to me, or just plain uncomfortable.
#3
FreedomEclipse
~Technological Technocrat~
Do want.... But that price though....
#4
Tannhäuser
"ZeroFrame".

Marketers these days can't even count to zero, it seems.
#5
rooivalk
lZKoce: That is SO cool. I wonder if curved monitors are as awesome as they look to me, or just plain uncomfortable.
My opinion:

if you're using it for gaming = yes
if you have a 70"+ screen = yes, for gaming/movies
if you don't have a gigantic screen but sit 2 feet away from it = yes, somehow it's more immersive
if you sit alone at dead center of the axis = yes
if you sit together 3-4.2 m from the screen (depends on curvature) = yes, that's how it's supposed to work according to the manufacturers, but who uses a monitor at that range?
if you use it for graphic design = no
if you have a reflection problem = no, the curve will make reflections even wider
if you mount it in a much higher position = no

Outside those scenarios, the advantages are pretty negligible; a complete waste of money.
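On the "3-4.2 m depending on curvature" point: curvature ratings such as "3800R" (commonly cited for the X34, though not stated in the article above) give the radius of the screen's arc in millimetres, and sitting at roughly that radius puts every point of the panel equidistant from the eye. A quick illustrative conversion (the helper name is made up for this example):

```python
# Curvature ratings like "3800R" encode the circle radius in millimetres.
# Sitting at that radius means every point of the screen is roughly
# equidistant from the eye -- the geometry the rating is designed around.

def ideal_distance_m(curvature_rating):
    """Hypothetical helper: '3800R' -> 3.8 (metres)."""
    return int(curvature_rating.rstrip("Rr")) / 1000.0

print(ideal_distance_m("3800R"))  # 3.8
```

A 3000R panel would correspondingly "want" a ~3 m viewing distance, which is why the commenter's 3-4.2 m range varies with curvature.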
#6
erixx
Ultrawide: great for office work and movies. And CERTAIN games. Many older (or modern "indie") games do not scale too well... I had to fiddle too much to play SABOW and gave my UW to my wife :)

Curve: marketing gimmick. Gimme flat anytime.
#7
deemon
I hope G-Sync DIES a horrible death, and the sooner the better.
Curve is cool.
#8
jumpman
This is OLED?!?! That's something new.
#9
Tannhäuser
deemon: I hope g-sync DIES a horrible death and sooner the better.
Why exactly? Why is G-Sync bad?
#10
Uplink10
Tannhäuser: Why exactly? Why is G-Sync bad?
It is not an open standard, and it requires NVIDIA hardware.
#11
Solidstate89
jumpman: This is OLED?!?! That's something new.
It's most certainly not OLED.
#12
natr0n
The FreeSync version is currently $1,099.
#14
Solidstate89
Uplink10: It is not an open standard and you require NVIDIA hardware.
And yet G-Sync has the distinct advantage at working better at lower framerates than Freesync.

If you don't want to use G-Sync, don't buy an nVidia GPU, and don't buy monitors that explicitly support the standard. There, you're done, and you haven't ruined it for anyone else that might actually want to use it.
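For what it's worth, the low-framerate difference being debated comes down to frame re-display: when the GPU's frame rate falls below the panel's minimum refresh, the same frame can be scanned out multiple times so the effective refresh stays inside the panel's window (the G-Sync module does this in hardware; FreeSync later addressed it with Low Framerate Compensation). A rough sketch of the idea, with assumed 30-100 Hz limits:

```python
# Illustrative low-framerate-compensation (frame doubling) sketch --
# not either vendor's actual implementation. When gpu_fps < min_hz,
# repeat each frame n times so the panel refreshes at n * gpu_fps,
# which lands back inside the [min_hz, max_hz] window.

def repeats_needed(gpu_fps, min_hz=30, max_hz=100):
    """min_hz/max_hz are assumed example panel limits."""
    n = 1
    while gpu_fps * n < min_hz and gpu_fps * (n + 1) <= max_hz:
        n += 1
    return n  # each frame is scanned out n times

for fps in (24, 40, 12):
    n = repeats_needed(fps)
    print(fps, "fps ->", n, "repeats, panel at", fps * n, "Hz")
```

So 24 fps content would be shown with each frame doubled (panel at 48 Hz), while 40 fps needs no repetition at all.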
#15
Patriot
Solidstate89: And yet G-Sync has the distinct advantage at working better at lower framerates than Freesync.

If you don't want to use G-Sync, don't buy an nVidia GPU, and don't buy monitors that explicitly support the standard. There, you're done, and you haven't ruined it for anyone else that might actually want to use it.
'Tis not true... NVIDIA is just giving the panel manufacturers enough of the margin to keep G-Sync monitors' official specs looking better.
#16
nickbaldwin86
But 144 Hz? lol, 3440x1440 @ 144 Hz is a ways out, I know, but that is what I want... I think it is crazy this can be OCed to 100 Hz...

Would OCing to 100 Hz hurt the life of the panel? I've never OCed a panel and don't even know what's involved.

I have the ROG Swift and I think I might have to sell it for this monitor :)
#17
nickbaldwin86
Oh, and it's sad they have to jump the price because of G-Sync :( It should be $1,000 for both versions.
#18
Solidstate89
Patriot: Tis not true... nvidia is just giving the panel manufactures enough of the margin to keep gsync monitors having better official specs.
That's a beautiful citation you've got there.
#20
jumpman
Solidstate89: It's most certainly not OLED.
jabbadap: Ehh, isn't this thing on the market already? At least tftcentral reviewed it half a month ago:
www.tftcentral.co.uk/reviews/acer_predator_x34.htm

And it does not have hdmi2.0, it's hdmi1.4.

It has an IPS panel with a W-LED backlight.
Had me excited for a sec. Good thing TPU corrected it in the article.
#21
dwade
$1,299 for a tiny 34" with whatever picture quality. This is why I avoid PC-branded displays such as Acer and Asus.
#22
silapakorn
Acer used to beat Asus on price, but apparently not anymore.

BTW, I'm not impressed with Acer's quality control so far. I got 3 dead pixels and horrible IPS glow on my XB270HU. I bought it from a local store, so their return policy is pretty much non-existent.
#23
deemon
Solidstate89: And yet G-Sync has the distinct advantage at working better at lower framerates than Freesync.

If you don't want to use G-Sync, don't buy an nVidia GPU, and don't buy monitors that explicitly support the standard. There, you're done, and you haven't ruined it for anyone else that might actually want to use it.
That's not it! We are strongly against the corporate "ecosystem" bullshit, with all the proprietary ports, protocols, hardware, software and other crap that works against interoperability and common free standards in tech.
Every once in a while another company tries something like this. IBM with their notorious add-on cards that wouldn't work with other OEMs' PCs; Siemens with phones that wouldn't work with other companies' phone stations; Cisco/HP with SFP ports that just don't work with 3rd-party SFP modules (and the incompatibility is even added in new firmware updates whenever 3rd parties release new models, so you have to downgrade firmware for the modules to work); Apple with all their incompatible ports and plugs over the years; now NVIDIA with their HairWorks and G-Sync; Americans with their measurement system (use the metric system like the rest of the world!); even exclusivist religious systems that refuse to interoperate with anyone outside the group.

Sure, not everything can be interoperable, but everything that can should be pushed in that direction, not against it, like NVIDIA is doing with G-Sync. AND EVEN MORE SO BY NOT ADDING DRIVER SUPPORT FOR WHAT IS NOW THE VESA ADAPTIVE-SYNC STANDARD... sure, they can build their better G-Sync chips and whatnot, but there is no reason not to also enable Adaptive-Sync in their drivers!

What next? NVIDIA invents some new N-slot that works just like PCIe, but only NVIDIA cards can go into it, and you need to buy a new motherboard with an N-slot to add an NVIDIA GPU... and the N-slot adds 300 €/$ to the motherboard's cost. Oh... wait... that's already IBM's patented way of doing things!

And what is still going on with power sockets today:
en.wikipedia.org/wiki/AC_power_plugs_and_sockets
Can't we all use one standard? Why do I need like 10 different adapters and crap when traveling?
#24
Solidstate89
"We" who? I sure hope you don't think you're speaking for everyone. G-Sync doesn't prevent a monitor from being a monitor. You can plug an AMD card in and it'll still work fine. You won't have the frame rate sync like you would with a Freesync monitor, but the proprietary protocol doesn't stop you from using your monitor as a monitor.

Until and unless Freesync catches up to G-Sync, I'll stick to getting a G-Sync monitor as my next monitor. You can vote with your wallet and not get one. Pretty simple, really.
#25
deemon
Solidstate89: "We" who? I sure hope you don't think you're speaking for everyone. G-Sync doesn't prevent a monitor from being a monitor. You can plug an AMD card in and it'll still work fine. You won't have the frame rate sync like you would with a Freesync monitor, but the proprietary protocol doesn't stop you from using your monitor as a monitor.

Until if and when Freesync catches up to G-Sync, I'll stick to getting a G-Sync monitor as my next monitor. You can vote with your wallet and not get one. Pretty simple really.
"We" = definitely not EVERYone, but a strong majority of the internet/techie community.

I don't want to sound too morbid, but supporting G-Sync is a slippery slope and in essence you give thumbs up to ISIS with this. I wouldn't!