[vlc-devel] [PATCH] avcodec: only assume DXVA2 is used if it's forced

Steve Lhomme robux4 at videolabs.io
Tue Mar 31 12:06:02 CEST 2015


On Mon, Mar 30, 2015 at 7:06 PM, Rémi Denis-Courmont <remi at remlab.net> wrote:
> On Monday, 30 March 2015 at 18:44:31, Steve Lhomme wrote:
>> I see,
>> https://mailman.videolan.org/pipermail/vlc-devel/2014-August/099420.html
>>
>> So what you're saying is that it should be done differently ?
>
> I mean that:
> 1) This patch is wrong.
> 2) The problem was brought up a long time ago but ignored.
>
>> I agree on that.
>>
>> Since at this point we know the input format, we could check if dxva2
>> will actually be able to handle it, using the regular module loading,
>> and decide if we want FF_THREAD_FRAME or not.
>
> No, the format is not *fully* known at this point, at least not
> systematically. libavcodec provides the final format only upon get_format() or
> get_buffer() callback invocation. (In the latter case, it means that hardware
> acceleration is not supported.)
>
> Hardware capability checking can only happen within ffmpeg_GetFormat(),
> specifically vlc_va_New().

Looking at the FFmpeg code, get_format() is only called when decoding
of a frame starts. At that stage the threading model has already been
picked.

Our ffmpeg_GetFormat() is called with a list of the possible output
pixel formats the codec can handle. But when we do the FF_THREAD_FRAME
toggling, we only know which codec is going to be used. Unfortunately
codecs don't always provide a hardcoded list of the output formats
they support (at least not H264). So we cannot use that codec object
to find out in advance whether DXVA2 is going to be used or not.
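For illustration, the selection that happens once get_format() finally
runs looks roughly like the sketch below. This is a self-contained mock,
not VLC's actual code: the enum values and the pick_format() helper are
stand-ins for libavcodec's AVPixelFormat list (AV_PIX_FMT_DXVA2_VLD,
etc.) and for the callback's scan of the AV_PIX_FMT_NONE-terminated
candidate list.

```c
#include <stddef.h>

/* Illustrative stand-ins for libavcodec's AVPixelFormat values; real
 * code would use AV_PIX_FMT_NONE, AV_PIX_FMT_DXVA2_VLD, etc. from
 * <libavcodec/avcodec.h>. */
enum pix_fmt { PIX_FMT_NONE = -1, PIX_FMT_YUV420P, PIX_FMT_DXVA2_VLD };

/* Mimics what a get_format() callback does: scan the terminated list
 * of candidate output formats the codec offers, pick the hardware
 * format if present, otherwise fall back to the first software one.
 * Only at this point, not when the threading model is chosen, is it
 * known whether DXVA2 will actually be used. */
static enum pix_fmt pick_format(const enum pix_fmt *fmts)
{
    for (size_t i = 0; fmts[i] != PIX_FMT_NONE; i++)
        if (fmts[i] == PIX_FMT_DXVA2_VLD)
            return fmts[i]; /* hardware acceleration available */
    return fmts[0]; /* software fallback */
}
```

The point is that the candidate list only exists inside the callback,
which fires after the decoder has been opened and threading configured.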

That leaves us with only guessing whether DXVA2 is going to be used.
I'll submit a better patch.

Thanks for the hints.
