Catalyst 10.6 does not provide any support for VLC's GPU acceleration methodology, but AMD seems to suggest that an update to fix this is coming soon. Knowing ATI's lethargy in fixing drivers for anything not related to gaming, we decided to ignore their GPUs for the time being.

[UPDATE 1: June 26, 2010: We heard back from AMD PR as well as the concerned VLC developer, and we are told that Catalyst 10.7 (expected by mid-July) and VLC 1.1.1 (expected in 3 - 4 days) will enable acceleration on ATI GPUs as well. Apparently, they have verified in the lab that the acceleration works, and are waiting on final QA. I am now willing to reconsider my earlier opinion on ATI's lethargy and hope that this sort of response is a sign of good things to come for AMD/ATI HTPC users.]

[UPDATE 2: July 2, 2010: AMD provided us with the pre-release Catalyst 10.7, and Jean-Baptiste gave us the VLC 1.1.1 build. On one of AMD's recent Radeon chipsets, GPU acceleration works even better than on Nvidia's. It also looks likely that Radeon 3xxx users will be unable to take advantage of VLC's acceleration. More details will follow once Catalyst 10.7 is officially released.]

The VLC developers couldn't test their acceleration methodology on Intel IGPs at all. As end users, we decided to test it for them.

We utilized three test beds for our evaluation:

1. Intel IGP - Arrandale ClearVideo: Gateway NV5935u
2. Nvidia - Quadro FX2700M PureVideo VP2: Customised HP 8730w [ Core 2 Duo T9400 / 4GB RAM ]
3. Nvidia - GeForce G210M PureVideo VP4: Sony Vaio VPCCW13FX/R [ PDF ]

The DXVA capabilities of each platform are evident in the screenshots below.

[Screenshot: Intel i5-430M DXVA Capabilities]

[Screenshot: Quadro FX2700M DXVA Capabilities]

[Screenshot: GeForce G210M DXVA Capabilities]

All machines were tested using VLC 1.1.0 on Microsoft Windows 7, using a 37" Toshiba Regza HDTV connected via HDMI through an Onkyo TX-SR606, at 1920x1080p resolution in Extend mode (with the primary screen running at 1366x768). One set of tests was run with GPU acceleration disabled, and another with GPU acceleration enabled. CPU usage was tracked for both runs and the maximum values over the course of playback compared.
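
The measurement itself is easy to automate. Below is a minimal sketch of the approach (not the actual harness used for this article), assuming the third-party psutil package is installed; the VLC executable and clip paths are placeholders:

```python
# Sample system-wide CPU utilization while VLC plays a clip and report the
# peak value, mirroring the "maximum CPU usage over playback" metric above.
# Assumes the psutil package; the paths below are placeholders.
import subprocess
import psutil

VLC_EXE = r"C:\Program Files\VideoLAN\VLC\vlc.exe"   # placeholder path
CLIP = r"D:\clips\h264_L41_1080p30.m2ts"             # placeholder path

def peak_cpu_during_playback(interval=0.5):
    """Launch VLC on a clip and return the peak CPU % seen while it runs."""
    player = subprocess.Popen([VLC_EXE, "--play-and-exit", CLIP])
    psutil.cpu_percent(interval=None)        # prime the counter
    peak = 0.0
    while player.poll() is None:             # sample until VLC exits
        peak = max(peak, psutil.cpu_percent(interval=interval))
    return peak

if __name__ == "__main__":
    print("Peak CPU during playback: %.1f%%" % peak_cpu_during_playback())
```

One run is taken with GPU acceleration enabled in VLC's preferences and one with it disabled, and the two peaks are compared.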

VLC provides GPU acceleration for MPEG-2, H.264 and VC-1. Since MPEG-2 is easily handled by even low-performance processors, we decided to cover only H.264 and VC-1 in our test suite. Eight different streams were tested, with the following characteristics (a short verification sketch follows the list):

1. L4.1 H.264 1080p30 @ 8.3 Mbps (M2TS)
2. L4.1 H.264 1080p24 @ 10.2 Mbps (MKV)
3. L5.1 H.264 1080p60 @ 10 Mbps - 8 reference frames (MKV)
4. L5.1 H.264 1080p24 @ 19 Mbps - 16 reference frames (MKV)
5. VC-1 Main Profile 1080p24 @ 8 Mbps (WMV9)
6. VC-1 Advanced Profile 1080p24 @ 18 Mbps (MKV)
7. VC-1 Advanced Profile 1440 x 576 @ 6 Mbps (WMV)
8. VC-1 Advanced Profile 720p60 @ 15 Mbps (WMV)
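
For readers assembling a similar suite, the parameters listed above (codec, profile, level, resolution, frame rate, bit rate) can be confirmed with a tool such as ffprobe from the FFmpeg project. The sketch below shows one possible way to do so; it assumes ffprobe is on the PATH and uses placeholder clip paths:

```python
# Print codec, profile, level, resolution and frame rate for the first video
# stream of each test clip using ffprobe. Assumes ffprobe (FFmpeg) is on the
# PATH; the clip paths are placeholders.
import json
import subprocess

CLIPS = [
    r"D:\clips\h264_L41_1080p30.m2ts",   # placeholder paths
    r"D:\clips\vc1_ap_1080p24.mkv",
]

def video_stream_info(path):
    """Return the first video stream's metadata as a dict."""
    cmd = [
        "ffprobe", "-v", "error",
        "-select_streams", "v:0",
        "-show_entries",
        "stream=codec_name,profile,level,width,height,avg_frame_rate,bit_rate",
        "-of", "json", path,
    ]
    return json.loads(subprocess.check_output(cmd))["streams"][0]

if __name__ == "__main__":
    for clip in CLIPS:
        info = video_stream_info(clip)
        print("%s: %s profile=%s level=%s %sx%s fps=%s" % (
            clip, info["codec_name"], info.get("profile", "?"),
            info.get("level", "?"), info["width"], info["height"],
            info["avg_frame_rate"]))
```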

We decided not to use any interlaced media in the test suite since VLC does the deinterlacing on the CPU using SSE2 instructions even if GPU acceleration is enabled. This ensures that deinterlaced media playback remains consistent across different cards and driver versions.

The GPU acceleration support provided by VLC on Windows has a very different architecture compared to the one used by programs such as MPC-HC and Windows Media Player. As explained by one of the developers here, VLC prefers a slower method of GPU acceleration in order to preserve its framework design: it decodes on the GPU but copies the decoded frames back to system memory for further processing. As a result, CPU usage is higher than it would be with MPC-HC or Windows Media Player. For this reason, the only comparisons we make further down in this piece are within VLC (acceleration on vs. acceleration off), and not against other media playback programs.
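
A back-of-the-envelope calculation shows why that copy-back step matters. Assuming the decoded surfaces are NV12 (12 bits per pixel) and ignoring any row padding, each 1080p frame is roughly 3 MB, so just reading frames back to system memory costs on the order of 75-190 MB/s depending on frame rate, before any CPU-side processing begins:

```python
# Rough estimate of the read-back traffic implied by VLC's copy-back design,
# assuming NV12 decoded surfaces (12 bits per pixel) and ignoring row padding.
WIDTH, HEIGHT = 1920, 1080
BYTES_PER_PIXEL = 1.5   # NV12: full-res 8-bit luma plane + half-res chroma plane

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL      # about 3.1 MB per frame

for fps in (24, 30, 60):
    print("%2d fps: %.0f MB/s copied back to system memory"
          % (fps, frame_bytes * fps / 1e6))
```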


74 Comments


  • MGSsancho - Friday, June 25, 2010 - link

    What software did you use to test DXVA compatibility? Also if possible where can we get a hold of it? :)
  • Per Hansson - Friday, June 25, 2010 - link

    It is "DXVA Checker"
    You can download it here:
    http://bluesky23.hp.infoseek.co.jp/en/index.html
  • barniebg - Friday, June 25, 2010 - link

    Come on, the most important benefit of using a GPU to decode video is the fact that you can apply hardware deinterlacing. VLC deinterlacing is nowhere near even remotely comparable to any GPU.
  • MGSsancho - Friday, June 25, 2010 - link

    I am disappointed as well but applaud VLC for being another competitor in this very important arena.
  • CSMR - Friday, June 25, 2010 - link

    Deinterlacing is a legacy concept relating to content produced with CRTs in mind. It is not important in the modern world. If you have content that is interlaced, your encoding software should deal with it, or else download a better version.
  • probedb - Friday, June 25, 2010 - link

    What about those of us that don't want to reencode video? Or that play DVDs back from the drive.

    De-interlacing is still very much required.
  • mckirkus - Friday, June 25, 2010 - link

    DVD content is stored as progressive (480p). On an old CRT/tube TV, the DVD player interlaces the content (480i) so it is compatible with the TV.

    I can't think of any digital content that is stored in interlaced format these days.
  • mckirkus - Friday, June 25, 2010 - link

    (OK, no edit button: non-HD TV is interlaced, as is 1080i ATSC broadcast.) I should have said DVDs and Blu-rays are not interlaced.
  • flanger216 - Sunday, June 27, 2010 - link

    Swing and a miss, #2. Try again, please.

    TONS of DVDs are interlaced, both from PAL and NTSC regions. Plenty of content --- HDV and tape sources, for starters --- has never been anything other than interlaced, right from camera acquisition, and is directly encoded from the interlaced source to an interlaced DVD... for obvious reasons.

    Many film-based DVDs released prior to 2000 or so are also interlaced, because they were encoded from old cable, laserdisc and VHS masters. Also, heaps of low-budget and foreign (especially Asian) DVDs are made from interlaced masters, due to old or subpar equipment, or simply because interlaced workflows are often cheaper.

    And NO, your "encoding software" should NOT "deal with it." Deinterlacing prior to encoding gives you the following options: you can deinterlace to half-resolution and encode w/ a good bitrate but with poor quality, or you can deinterlace to full-resolution, but that'll require a doubled frame-rate and, obviously, doubled file-sizes. Interlaced sources should always be encoded to interlaced targets and deinterlaced during playback, preferably by a high-quality temporal/spatial filter @ a doubled frame-rate. Realistically speaking, you're only going to get that from a GPU (or a smokingly fast CPU running one of the newer software deinterlacers).

    WHY do people write things that are flatly untrue?
  • electroju - Monday, June 28, 2010 - link

    I agree, but all the DVD movies that I have are interlaced. Yes, even the latest movie from 2007 is interlaced. I am sure that Blu-ray and HD-DVD are interlaced as well. Like you said, this interlaced content has to be played back at double the frame rate and deinterlaced to view correctly on a progressive screen. A 3:2 pull-up also has to be used to keep within the movie's 24 fps frame rate, but this adds distortion.

    Not all codecs are compatible with interlaced content, so the video has to be deinterlaced. This means doubling the frame rate and adding 3:2 pull-up if it needs it.

    There are dozens of deinterlacing algorithms. No single one suits every kind of content. The only programs that I know of that include most of the deinterlacing algorithms are dscaler and tvtime.

    None of the GPUs that I know of actually increase the frame rate, so you are back where you started. To do it right, the post-processing has to be done on the CPU. There is a compromise, though, between loading the CPU to 100% and not using the GPU, or using the GPU and having only a fraction of the CPU utilized.
