[vlc-devel] [PATCH] DxVA2: allow NV12 pixel format all the way to the D3D texture

Steve Lhomme robux4 at videolabs.io
Wed Apr 1 15:35:35 CEST 2015


On Wed, Apr 1, 2015 at 3:16 PM, Jean-Baptiste Kempf <jb at videolan.org> wrote:
> On 01 Apr, Steve Lhomme wrote:
>> j-b mentioned that a custom VLC_CODEC_DXVA pixel format could be used
>> for hardware acceleration, like it's done on Android. But if we want
>> to use filters, that means the DXVA format cannot be opaque.
>
> Well, one idea is that the DxVA decoder always outputs
> VLC_CODEC_D3D9_OPAQUE or VLC_CODEC_D3D11_OPAQUE, and we create a filter
> that can do VLC_CODEC_D3D9_OPAQUE -> NV12 and one that does
> VLC_CODEC_D3D9_OPAQUE -> YV12.

As NV12 is a requirement of DxVA, there's a good chance that any GPU
that supports DxVA decoding and outputs NV12 will also accept NV12 as
an input format.
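
For illustration, such a converter could look roughly like the sketch
below. This is hypothetical: VLC_CODEC_D3D9_OPAQUE doesn't exist yet,
the "video filter2" capability and priority are placeholders, and the
actual surface read-back (StretchRect to a staging surface plus a lock)
is left out.

#include <vlc_common.h>
#include <vlc_plugin.h>
#include <vlc_filter.h>

static int Open(vlc_object_t *);
static picture_t *Filter(filter_t *, picture_t *);

vlc_module_begin()
    set_description("D3D9 opaque surface to NV12 converter")
    set_capability("video filter2", 10)
    set_callbacks(Open, NULL)
vlc_module_end()

static int Open(vlc_object_t *obj)
{
    filter_t *filter = (filter_t *)obj;

    /* Only handle the (proposed) opaque D3D9 chroma going to NV12 */
    if (filter->fmt_in.video.i_chroma != VLC_CODEC_D3D9_OPAQUE ||
        filter->fmt_out.video.i_chroma != VLC_CODEC_NV12)
        return VLC_EGENERIC;

    filter->pf_video_filter = Filter;
    return VLC_SUCCESS;
}

static picture_t *Filter(filter_t *filter, picture_t *src)
{
    picture_t *dst = filter_NewPicture(filter);
    if (dst == NULL) {
        picture_Release(src);
        return NULL;
    }
    /* The IDirect3DSurface9 carried by the opaque picture would be
     * copied to a staging surface, locked, and its NV12 planes copied
     * into dst here. Left out of this sketch. */
    picture_CopyProperties(dst, src);
    picture_Release(src);
    return dst;
}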

> Therefore we can do all the format negotiation in the vfilter chain.
>
> And of course, we can have the D3D or D3D11 video output render
> VLC_CODEC_D3D9_OPAQUE or VLC_CODEC_D3D11_OPAQUE directly
> (respectively).

In the end, what's the difference between having VLC_CODEC_DXVA_OPAQUE
and just using VLC_CODEC_NV12? It leaves us with the same problem:
filters that only handle I420.
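
For reference, NV12 -> I420 is just a chroma deinterleave, so a
fallback conversion for those filters is cheap. A minimal sketch
(hypothetical helper, assuming tightly packed planes with no padding):

#include <stdint.h>

/* Split NV12's single interleaved UV plane into I420's two separate
 * U and V planes. Chroma planes are (width/2) x (height/2); the luma
 * plane is identical in both formats and can be copied as-is. */
static void SplitChromaNV12ToI420(uint8_t *u, uint8_t *v,
                                  const uint8_t *uv,
                                  unsigned width, unsigned height)
{
    for (unsigned y = 0; y < height / 2; y++) {
        for (unsigned x = 0; x < width / 2; x++) {
            u[x] = uv[2 * x];     /* even bytes carry U */
            v[x] = uv[2 * x + 1]; /* odd bytes carry V */
        }
        u  += width / 2;
        v  += width / 2;
        uv += width;
    }
}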

It's not that opaque if we want to be able to convert it, so let's not
use an opaque format at all.

A real opaque format would make sense when the user doesn't want to
bother with filters; it could be handled directly by the matching vout.
It might also be useful for formats that cannot be accessed by the CPU
at all (DRM).


