CES is about to get underway, but news is already starting to trickle out of Las Vegas. Acer has taken the wraps off of two new monitors aimed at gamers. The first features NVIDIA G-SYNC technology for smoother frame delivery; for more background on G-SYNC, please refer to this article. The basic idea is that instead of the monitor refreshing at a fixed rate, it waits for a new frame from the GPU before refreshing.
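As a rough illustration of that difference, here is a hypothetical sketch (not real driver or display code; both loops and all names are invented for this example) of a fixed-rate refresh versus G-SYNC-style wait-for-frame behavior:

```python
# Illustrative sketch only: neither loop reflects a real display API.
# A fixed-rate monitor refreshes on a timer and shows whatever frame it
# has; a G-SYNC monitor refreshes when (and only when) a new frame arrives.

from collections import deque

def fixed_refresh(frames, refresh_ticks):
    """Scan out on every tick, repeating the last frame if no new one arrived."""
    pending = deque(frames)        # (arrival_ms, frame_id) pairs from the GPU
    shown, last = [], None
    for tick in refresh_ticks:
        while pending and pending[0][0] <= tick:
            last = pending.popleft()[1]   # take the newest frame available
        shown.append((tick, last))
    return shown

def gsync_refresh(frames):
    """Refresh immediately when each new frame arrives from the GPU."""
    return [(arrival, frame_id) for arrival, frame_id in frames]

# GPU delivers frames at uneven times (ms); the monitor ticks every ~16.7 ms.
frames = [(5, "A"), (30, "B"), (41, "C")]
print(fixed_refresh(frames, [16.7, 33.3, 50.0]))  # frames wait for the next tick
print(gsync_refresh(frames))                      # frames appear on arrival
```

With the fixed schedule each frame is held until the next refresh tick; with the G-SYNC-style loop each frame is displayed the moment it arrives.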

Acer XB270HU

The first monitor is the Acer XB270HU, which, according to the Acer press release, is the world’s first IPS monitor with G-SYNC capability. It is a 27” 2560x1440 IPS panel with a maximum refresh rate of 144 Hz. The IPS panel should give much better viewing angles (up to 178°) and generally better color accuracy as well, although that will have to be tested. The XB270HU also comes on a height-adjustable stand which offers tilt and swivel. The specifics of the panel are not mentioned, so at this time we cannot say whether it is a 6-bit or 8-bit panel. Availability is March 2015.

Acer XG270HU

The second gaming monitor is the XG270HU, which has a 27” edge-to-edge frameless display according to the press release. It is not completely frameless of course, but the side bezels are much smaller than normal. The bottom of the monitor still has a large bezel though, so if you are looking for something frameless to use in portrait mode, these are not the monitors for you. The XG model is a TN panel, but it features the same 2560x1440 resolution and 144 Hz refresh rate as the XB model, along with HDMI 2.0, DVI, and DisplayPort 1.2 connections. Acer is claiming a 1 ms response time for this model. As with the XB model, availability will be in March 2015.

Prices were not announced at this time.

Source: Acer



Comments

  • mobutu - Sunday, January 4, 2015 - link

    "IPS panel ... will give ... generally a better color accuracy as well, although that will have to be tested"
    You know, the worst IPS has better colors than the best TN. "will have to be tested" lol
  • QuantumPion - Monday, January 5, 2015 - link

    That's not true at all. There are plenty of low-end IPS panels with crap color quality, and there are a few 8-bit TN panels with color quality nearly as good as high-end IPS.
  • hpglow - Sunday, January 4, 2015 - link

    Your comment clearly shows you know nothing of the tech you are talking about. They can't measure total input lag because they don't know what your system is. They can, however, use standard tests for response time, which is what is listed: the latency from a received signal until the panel changes. Input latency is the whole chain from the button press to the monitor's response. That will vary by device (PC, console) and is therefore almost impossible for a manufacturer to quantify. What they will do in many cases is attach a simple device with one button that changes the screen from black to white and measure that difference. When the button is pressed an internal timer starts, and when a photosensor detects any change the timer stops; the difference is then displayed. Your controllers, mice, and keyboard all have ICs that add latency to this equation; asking a monitor maker to account for your given setup is inane.
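The black-to-white rig described in this comment can be sketched in simulation. Everything here is hypothetical (`FakePanel`, the timings, the busy-wait loop); a real rig would use a microcontroller and a photodiode rather than Python, so this only illustrates the timing logic:

```python
# Hypothetical sketch of a display-lag measurement rig: a button press
# starts a timer, and a photosensor watching the screen stops it when
# the panel changes from black to white. No real hardware API is assumed.

import time

class FakePanel:
    """Stand-in for a monitor that takes `lag_s` seconds to show a change."""
    def __init__(self, lag_s):
        self.lag_s = lag_s
        self._change_at = None

    def press_button(self):
        # The change becomes visible only after the panel's lag elapses.
        self._change_at = time.monotonic() + self.lag_s

    def photosensor_sees_white(self):
        return self._change_at is not None and time.monotonic() >= self._change_at

def measure_lag(panel):
    start = time.monotonic()
    panel.press_button()
    while not panel.photosensor_sees_white():
        pass  # busy-wait, like a simple microcontroller loop would
    return time.monotonic() - start

lag = measure_lag(FakePanel(lag_s=0.03))  # simulate ~30 ms of display lag
print(f"measured lag: {lag * 1000:.1f} ms")
```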
  • Kutark - Sunday, January 4, 2015 - link

    From my basic understanding of G-Sync, it also removes any input lag. Something to consider:


    Q: How is G-SYNC different than Adaptive V-SYNC?

    A: Adaptive V-Sync is a solution to V-Sync’s issue of stuttering, but input lag issues still persist. NVIDIA G-SYNC solves both the stutter and input lag issue.
  • DanNeely - Sunday, January 4, 2015 - link

    No, what's normally referred to as input lag when talking about a monitor is the delay from when a monitor receives a new frame to when it displays it on the screen. No one other than people who're confused or trying to obfuscate the issue tries to combine in processing times in the computer. This is why input lag can be measured and specified by monitor vendors; everything in the number is under their control.

    This lag is due to processing that the monitor does: scalers to show non-native-resolution content full screen; tricks to improve response time scores (starting transitions that take longer to become apparent a frame or two ahead of faster ones; this was much more of a problem with *VA-type displays from a half dozen years ago); and possibly FRC dithering to make a cheap 6-bit panel appear to be 8-bit (or occasionally 8-bit appear 10-bit). (I'm not sure if FRC implementations add latency, but being able to look a frame or two ahead would allow for algorithms that reproduce colors more accurately.)

    What G-Sync/Adaptive V-Sync do is different. Normally your video card starts sending a new frame out as soon as it renders it; this results in tearing (although rarely at an apparent level) because the rendering pipeline isn't running at exactly 16.7 ms/frame (60 Hz). V-Sync causes your GPU to hold the frame it just completed rendering until the next 16.7 ms clock interval passes, so that it starts outputting at the top of the screen instead of the middle, avoiding tearing. The problem with doing so is that if your GPU can't complete a frame in 16.7 ms, it has to wait until the start of the next interval (another 16.7 ms later) to begin sending it to the monitor. This means that if your GPU is taking 20 ms/frame, instead of getting 50 FPS and a bit of tearing you get no tearing but only 30 FPS, because the GPU ends up sitting for 13 ms after finishing each frame before outputting it. In the real world your games won't render all frames in the same amount of time, and Adaptive V-Sync just turns V-Sync on/off depending on whether its algorithm thinks tearing or delaying a frame is the greater evil.

    Both G-Sync and FreeSync give the monitor the ability to receive frames of output at something other than a strict 60 Hz rate, so the GPU can (to reuse my example from above) send out fresh frames every 20 ms as they're rendered instead of having to add the synchronization delay for V-Sync.

    The V-Sync delay that's being removed is not what's normally called input lag. My gut reaction is that NVIDIA marketing is playing fast and loose with definitions to make a deceptive claim, although it's possible that NVIDIA only allows its G-Sync modules to be used in monitors that don't do a frame or two of pre-processing before displaying frames.
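The frame-timing arithmetic in this explanation (20 ms renders giving 30 FPS with v-sync but 50 FPS with variable refresh) can be checked with a short sketch. The model is deliberately simplified: it assumes a double-buffered pipeline where the GPU stalls until each swap, and all names are invented for the example:

```python
# Simplified model of v-sync quantization vs. variable refresh.
import math

REFRESH_MS = 1000 / 60   # fixed 60 Hz refresh: a tick every ~16.7 ms
RENDER_MS = 20.0         # the example's hypothetical GPU render time

def vsync_display_times(n):
    """With v-sync, a finished frame waits for the next refresh tick,
    and (double-buffered) the GPU stalls until the swap happens."""
    t, times = 0.0, []
    for _ in range(n):
        t += RENDER_MS                               # render the frame
        t = math.ceil(t / REFRESH_MS) * REFRESH_MS   # wait for the next tick
        times.append(t)
    return times

def variable_refresh_display_times(n):
    """With G-Sync/FreeSync, the monitor refreshes when the frame arrives."""
    return [i * RENDER_MS for i in range(1, n + 1)]

def fps(times):
    return round(len(times) / (times[-1] / 1000), 6)

print(fps(vsync_display_times(60)))             # 30.0: v-sync quantizes to 30 FPS
print(fps(variable_refresh_display_times(60)))  # 50.0: variable refresh keeps 50 FPS
```

Each 20 ms frame misses the 16.7 ms tick and waits for the one at 33.3 ms, which is exactly the 13 ms stall described above.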
  • Guspaz - Sunday, January 4, 2015 - link

    While you're right in theory, enabling v-sync in many games can add a huge amount of latency, even when rendering at a solid 60 FPS. Far more latency than the 17 ms you'd normally expect. Perhaps it's a buffering issue.
  • theuglyman0war - Friday, January 9, 2015 - link

    3D Vision?
