After I published the initial grayscale dE2000 results for the PQ321Q, there was a lot of feedback over the quality of the numbers. The simple fact in the end is that only two of the numbers rose above the visual threshold of a dE2000 of 3.0. Yes, I would really like to see an average dE2000 of 1.0 or less, but the point of dE2000 is to tell us if we can see an error or not. With the exception of pure white, and even then just barely, the PQ321Q should look incredibly neutral without any calibration done to the grayscale. The gamma also tracks the 2.2 target very well, which will help to make up for the lack of depth in the black level.
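For anyone who wants to sanity-check these charts themselves, dE2000 is just a perceptual distance between two CIE Lab colors: the target value and what the meter reads back. Here is a minimal sketch in Python, assuming the colour-science package is installed; the Lab values are invented for illustration and are not our measurements.

    # pip install colour-science
    import colour

    # Hypothetical example: a neutral gray target vs. a slightly blue-tinted reading.
    # These Lab values are placeholders, not the PQ321Q's data.
    target_lab   = [83.0,  0.0,  0.0]   # L*, a*, b* of the reference
    measured_lab = [83.4, -0.6, -2.1]   # L*, a*, b* reported by the meter

    dE = colour.delta_E(target_lab, measured_lab, method='CIE 2000')
    print(f"dE2000 = {dE:.2f}")         # errors above ~3.0 are visible at a glance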

Of course we want to calibrate the PQ321Q to see what it can really do. If you are buying a $3,500 display, you can likely buy, or rent, calibration gear to get it dialed in if you really care about image quality. And if you do, you are in for a treat.

Post-calibration, our dE2000 average is now 0.56. Every error is below 2.0 except for 0% (black), which really doesn’t matter anyway. There is no color shift, and the gamma tracks along 2.2 almost perfectly. Our contrast ratio has taken a small hit, down to 704:1, as the calibration has to manipulate the peak white level to make it more accurate. What we see now is a grayscale image that is basically flawless.
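For reference, both the gamma tracking and the contrast number fall out of the same grayscale sweep: point gamma compares each step's luminance, relative to peak white, against the stimulus level raised to 2.2, and contrast is simply peak white over black. A rough sketch of that math, using invented luminance readings rather than the PQ321Q's data:

    import math

    # Hypothetical luminance readings (cd/m^2) per stimulus level -- placeholders only.
    readings = {0.0: 0.27, 0.2: 5.5, 0.4: 26.5, 0.6: 65.0, 0.8: 123.0, 1.0: 200.0}

    white, black = readings[1.0], readings[0.0]
    print(f"contrast ratio: {white / black:.0f}:1")

    # Point gamma at each step: relative luminance should follow level ** 2.2
    # if the display tracks a 2.2 power curve.
    for level, lum in readings.items():
        if 0.0 < level < 1.0:
            gamma = math.log(lum / white) / math.log(level)
            print(f"{level:.0%} stimulus -> gamma {gamma:.2f}")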

With color, the PQ321Q has a pair of issues. The first is undersaturation in red and blue. This also causes magenta to be undersaturated and produces some high dE values for 100% saturations of those colors. We also see that green and red fall outside the sRGB gamut triangle. This is also unfortunate, as it pushes yellows and oranges outside the gamut and causes errors there. Our cyan value is almost perfect, but every other color has some issue on the PQ321Q, at least at 100%.
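Whether a primary lands inside or outside the sRGB triangle is just a point-in-triangle test on the CIE 1931 xy chromaticity diagram. A quick sketch of that test, using the standard sRGB primary coordinates; the measured point is a made-up example, not our meter data.

    # Standard sRGB primaries as CIE 1931 xy chromaticity coordinates.
    SRGB_RED, SRGB_GREEN, SRGB_BLUE = (0.640, 0.330), (0.300, 0.600), (0.150, 0.060)

    def cross(o, a, b):
        """2D cross product of the vectors o->a and o->b."""
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def inside_srgb(p):
        """True if chromaticity point p lies inside (or on) the sRGB triangle."""
        signs = [cross(SRGB_RED, SRGB_GREEN, p),
                 cross(SRGB_GREEN, SRGB_BLUE, p),
                 cross(SRGB_BLUE, SRGB_RED, p)]
        return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

    # Hypothetical measured green primary sitting just past the sRGB gamut edge.
    print(inside_srgb((0.295, 0.625)))   # False -> oversaturated relative to sRGB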

The reason we don’t just stop with the gamut charts is that they only represent six colors out of millions. Those six might have a large error, but it is rare to see 100% saturated colors in real content like photos or movies. To test this we use the ColorChecker chart in CalMAN. We’ve always used the Gretag Macbeth chart, which has 24 common colors on it. SpectraCal has added a 96-sample chart to the newest version of CalMAN; it adds more skin tones and other colors, and will provide a more accurate indication of monitor performance. The color samples for it are taken from an X-Rite chart. I’m going to run both for a few reviews to see how it goes, but more samples lead to better accuracy and let us pinpoint exactly where things go wrong.
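The payoff from the extra patches is mostly in the bookkeeping: average the per-patch dE2000, flag anything above the visibility threshold, and see whether the failures cluster in one hue region. A sketch of that, again assuming the colour-science package; the patch names and Lab values below are invented for illustration, not our measured data.

    import colour

    # Hypothetical (target, measured) CIE Lab pairs for a handful of chart patches.
    patches = {
        "dark skin":  ([37.5,  12.0,  13.5], [37.9,  12.4,  14.9]),
        "light skin": ([65.6,  13.0,  17.0], [65.1,  13.5,  18.2]),
        "orange":     ([62.7,  29.5,  57.0], [60.2,  34.0,  66.5]),
        "yellow":     ([81.8,  -0.5,  80.0], [79.0,   3.5,  92.0]),
        "cyan":       ([51.0, -28.0, -28.5], [51.2, -28.3, -28.9]),
    }

    errors = {name: float(colour.delta_E(t, m, method='CIE 2000'))
              for name, (t, m) in patches.items()}

    print(f"average dE2000: {sum(errors.values()) / len(errors):.2f}")
    for name, dE in sorted(errors.items(), key=lambda kv: -kv[1]):
        flag = "  <-- above the visible threshold" if dE > 3.0 else ""
        print(f"{name:10s} {dE:5.2f}{flag}")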

On the PQ321Q the issues all come back to red, orange, and yellow shades. Skin tones are good, blues, purples, and cyans are all decent, and blue-greens are nice as well. The tones that fall outside of the triangle push the dE2000 average way up and show us where the issues are. Reds certainly have a good amount of pop on the ASUS, and there really isn’t a good way to bring them back in. This is the largest issue with the ASUS PQ321Q, as that extra gamut can’t really be addressed.

Traditionally, if you look at the standard gamut dE2000 chart and see a large error in red, that indicates you will have issues with skin tones and people looking sunburned. With the larger ColorChecker chart, we see that this is not the case: skin tones are, for the most part, under control, while oranges and yellows are not. This extra data helps us see where the ASUS will look correct and where it will fall short.

With the saturation charts, we see that the errors start small, as usual, and get larger as you move toward the edge of the gamut. Blue has the largest error, though we are less sensitive to blue than to other colors. Cyan again remains great, and it would be nice if every color performed as well.
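For context, a saturation sweep steps each color from the white point out to the fully saturated primary and measures every step along the way. One common way to generate the target chromaticities, sketched below, is to blend the primary with display white in linear RGB and convert through the sRGB matrix; this is an assumed convention for illustration and may not match CalMAN's exact targets.

    # sRGB (D65) linear RGB -> CIE XYZ matrix, rows are X, Y, Z.
    M = [[0.4124, 0.3576, 0.1805],
         [0.2126, 0.7152, 0.0722],
         [0.0193, 0.1192, 0.9505]]

    def rgb_to_xy(rgb):
        X, Y, Z = (sum(row[i] * rgb[i] for i in range(3)) for row in M)
        return X / (X + Y + Z), Y / (X + Y + Z)

    # Blend pure red with display white (1, 1, 1) in linear RGB at each step.
    for sat in (0.25, 0.50, 0.75, 1.00):
        rgb = [1.0, 1.0 - sat, 1.0 - sat]   # red stays full, green/blue fade out
        x, y = rgb_to_xy(rgb)
        print(f"{sat:.0%} red target -> x = {x:.3f}, y = {y:.3f}")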

Post-calibration the ASUS is a bit of a mixed bag. We get great grayscale and gamma performance, but most people don’t just deal in black-and-white content. The color gamut presents a problem, being both under-saturated and out-of-gamut at the same time. Looking through images, the issue that pops out to me is the yellows. They really pop off the screen and look too bright. Reds look slightly off, but yellow is the biggest issue I can see with my eyes.

166 Comments

  • cremefilled - Wednesday, July 24, 2013

    Even putting aside your syntax issues, almost all audio equipment depreciates in value quickly. The rare exceptions would be (the minority of) tube based amplification, and maybe a very few speakers (Quad ESL 63's).
  • Calista - Thursday, July 25, 2013

    What's even worse, judging from the pictures of audiophiles' homes, the same person spending 10-20k or more on his stereo gives little to no thought to how the room affects the sound, something we all know is incredibly important for the way sound waves behave.
  • cheinonen - Thursday, July 25, 2013

    As an audiophile as well as a video guy, I don't think the problem is that audiophiles are the worst hobbyists when it comes to paying for diminishing returns. I think the problem is that the press around audio focuses far too much on those diminishing return pieces. I'm considering writing a piece on budget phono amps, as more and more people buy turntables, but it's going to be hard. You can find 100 reviews of a $2,500 phono stage, but none of a $130 one that most people might buy. I think audio has a bad, bad marketing problem that the press reinforces.
  • vgroo - Monday, July 29, 2013

    Diminishing returns is an understatement. Whenever I hear the word "audiophile", it always reminds me of those numerous sound-clarifying snakeoil products (e.g. the magnificent Bedini Clarifier, http://www.bedini.com/clarifier.htm) and the praising reviews they get around the web.
  • Shadowself - Wednesday, July 24, 2013

    The low end of what anyone with any knowledge of the subject says is *easily* discernible by a person is one arc minute per pixel. (There are things like vernier acuity, rotational acuity and such that can push that by a factor of 10 or more -- to 0.1 arc minute per pixel or less.)

    A commonly accepted, comfortable viewing angle is 60 degrees. (Some "experts" put it at 90 degrees or more.)

    Combining the minimum 1 arc minute per pixel with the minimum 60 degrees gives a horizontal pixel count of 3,600 as the minimum that the average person can easily discern given an optimum viewing distance. (If you take that to the other extreme of 0.1 arc minute and 90 degrees this becomes 54,000 horizontal pixels at the optimum viewing distance.)

    So is 2160 (v) x 3840 (h) worth it to the average person with good eyesight? Absolutely. It just barely crosses the 3600 horizontal pixel count (a quick check of that arithmetic follows below).

    If you can't tell the difference between UHD (2160p) and HD (1080p) then I humbly submit you need to get your eyes checked. If you can't tell the difference between 720p and 1080p then you REALLY need to get your eyes checked.
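    A quick check of that arithmetic, as a minimal sketch (the function name and layout are just for illustration):

        def pixels_needed(view_angle_deg, arcmin_per_pixel):
            # 60 arc minutes per degree, divided by the acuity per pixel.
            return view_angle_deg * 60 / arcmin_per_pixel

        print(pixels_needed(60, 1.0))   # 3600  -> UHD's 3840 columns just clear it
        print(pixels_needed(90, 0.1))   # 54000 -> the extreme acuity/angle case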
  • 1Angelreloaded - Wednesday, July 24, 2013

    No, believe me, playing games for long enough on low-end and high-end screens definitely makes you more aware of PPI. In fact, it is funny that most HDTVs look horrible to me, even at the optimal distance. There are things you just notice; if anything, I think being around PCs for too long makes us somewhat sensitive, just like the difference between 30fps and 60fps: typically you shouldn't be able to tell a difference, but as so many have said, yes you can.
  • cremefilled - Wednesday, July 24, 2013

    Anyone with knowledge of the subject knows that 60 Hz versus 30 or 24 Hz is easily discernible, and 120 vs 60 is also easily discernible. The confusion here stems from 24 fps being the standard for film, but the difference is that film has built-in artifacts like motion blur that make 24 Hz the bare MINIMUM for smooth motion.

    If you've seen IMAX presentations, you know that even for true film, 60 vs. 24 is a huge difference.
  • entrigant - Friday, May 30, 2014

    Sound is as quantifiable as video. The accuracy of each can be measured and known beyond question. It's just that nobody does it because they don't want to admit their $2000 stereo is measurably terrible despite how good they've convinced themselves it sounds.
  • ImSpartacus - Tuesday, July 23, 2013

    They sit too far away ON AVERAGE.

    Sitting distance is a random variable and it has non-trivial variance.

    I made a spreadsheet to measure this effect: http://goo.gl/dNkj6
  • n13L5 - Thursday, July 25, 2013

    I'm glad I don't have to worry about yanking $3.5k from somewhere, because years of computer use have caused my 20/20 vision to weaken to the point where 1080p on a 27" screen works just fine...
