[vlc-devel] Decoder/Display implementation selection

Steve Lhomme robux4 at ycbcr.xyz
Thu Jul 4 10:38:17 CEST 2019


On 2019-07-04 10:14, Thomas Guillem wrote:
> 
> 
> On Thu, Jul 4, 2019, at 09:51, Steve Lhomme wrote:
>> On 2019-07-04 9:23, Thomas Guillem wrote:
>>> I'm in favor of removing avcodec-hw and only using "dec-dev" to select the hw decoder.
>>>
>>> By default it's automatic. For example on linux, it will probe in the following order:
>>>    - VAAPI
>>>    - NVDEC
>>>    - VDPAU
>>
>> I don't think this order can work. NVDEC is a standalone decoder, while
>> VAAPI and VDPAU are loaded by lavc. So either NVDEC is before or after
>> them but it can't be picked in the middle.
> 
> I don't see any problems with the current order. This is the dec-dev order, not the lavc one.

The order of object creation is (going to be):

decoder init -> decoder device -> decoder creation with possible HW -> 
video context -> display module

Your order means:
lavc init -> vaapi decoder device -> vaapi init -> vaapi context
nvdec init -> nvdec decoder device -> nvdec creation -> nvdec context
lavc init -> vdpau decoder device -> vdpau init -> vdpau context

That means you start lavc, stop it, start nvdec with the same data, stop 
it, start lavc again.
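
To make that concrete, here is a rough standalone simulation (plain C, 
nothing VLC-specific, the backend table is just the proposed order) of 
which decoder module would have to be started and stopped while probing:

    /* Simulate probing dec-dev backends in the order VAAPI, NVDEC, VDPAU:
     * the owning decoder module has to be restarted every time it changes
     * between two consecutive candidates. */
    #include <stdio.h>
    #include <string.h>

    struct candidate { const char *backend; const char *decoder; };

    int main(void)
    {
        const struct candidate order[] = {   /* hypothetical probe order */
            { "vaapi", "lavc"  },
            { "nvdec", "nvdec" },
            { "vdpau", "lavc"  },
        };
        const char *running = NULL;

        for (size_t i = 0; i < sizeof(order)/sizeof(order[0]); i++) {
            if (running == NULL || strcmp(running, order[i].decoder) != 0) {
                if (running != NULL)
                    printf("stop %s (it already ate data)\n", running);
                printf("start %s\n", order[i].decoder);
                running = order[i].decoder;
            }
            printf("  probe dec-dev %s\n", order[i].backend);
        }
        return 0;
    }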

>>
>> Also I suppose NVDec and VDPAU are not mutually exclusive, but NVDec will
>> probably be better than VDPAU ? So it should come before (which you already
>> have in your list).
> 
> Yes
> 
>>
>>> If the dec-dev succeeds in loading one of these backends, it is very likely that the hw decoder will succeed too. So we don't have your problem on Linux.
>>
>> The HW may be loadable but the codec/profile not supported. For example,
>> as things are now, NVDec only handles H264. So if you pick that
>> automatically all the time, without knowing the codec, you'd never load
>> the VDPAU dec-dev even though it can do HEVC (you can extend that to 10
>> bits, 4:4:4, etc. instead of just the codec).
>>
>> We still need to be able to iterate between all the HW decoders before
>> we pick one.
>>
>> NVDec being a decoder means that once it's started, it might eat data
>> before it selects a profile or realizes it cannot handle it. We should
>> fall back to lavc then. But from what I understand, restarting the decoder
>> will lose the bits it has already eaten ?
>>
>> (we could also support QSV decoders in the future, which might support
>> more codecs/profiles than DXVA, so the issue is not just with NVDec)
>>
>>> If the user forces one backend, it will try that backend and fall back to SW if the backend can't be loaded.
>>
>> OK with that. There might still be a problem when falling back from
>> NVDec/QSV to lavc.
>>
>>> On Windows, you have
>>> - D3D11
>>> - D3D9
>>> - NVDEC
>>>
>>> But you need to load lavc to know the correct backend, is that right ? What I would do:
>>
>> It's the same as with VAAPI/VDPAU. lavc starts processing data and then
>> asks for frames to decode to, with various VLD (including cuda, which we
>> don't support for now). NVDec/QSV would be separate from that.
>>
>>> Only one decoder device module for D3D11/D3D9, linked with avcodec. This module should first load the hw decoder via lavc to know which backend will be used, in order to set the enum vlc_decoder_device_type.
>>
>> lavc requests a "decoder device", not the other way around. So we should
>> pass the list of VLD (translated to VLC FourCC) to the "decoder device"
>> constructor so it can decide there which flavor to use. There's
>> still no guarantee the decoder picked will be usable unless we have the
>> codec profile and dimensions (at least in DXVA we need all that to know
>> if it's decodable or not).
> 
> That is not what we decided in "[PATCH 1/7] decoder: pass the chroma to decode to the decoder device". The decoder device must be loaded before the hw decoder module. Maybe you are talking about the decoder context ?

In the case of lavc as we have it now, the "decoder" is lavc and it is the 
first thing loaded to handle the ES. The "decoder device" is (will be) 
loaded after that. At that point the VA used by lavc is not loaded yet.
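
That ordering comes from the way the avcodec get_format() callback works; 
a rough sketch below, where decoder_device_Request() is only a made-up 
placeholder (the libavcodec types and functions are real):

    /* lavc is opened first for the ES; the hardware format, and therefore
     * the VA / decoder device, is only chosen from inside get_format(),
     * i.e. once lavc has already parsed some of the stream. */
    #include <libavcodec/avcodec.h>

    static enum AVPixelFormat hw_get_format(AVCodecContext *ctx,
                                            const enum AVPixelFormat *fmts)
    {
        for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++)
            if (*p == AV_PIX_FMT_VAAPI || *p == AV_PIX_FMT_D3D11) {
                /* only here could a decoder device be requested, so only
                 * here do we know which flavors lavc actually offers */
                /* decoder_device_Request(ctx->opaque, *p);  <- placeholder */
                return *p;
            }
        return avcodec_default_get_format(ctx, fmts); /* software fallback */
    }

    /* wired up when the decoder module opens, before any data is fed:
     *     avctx->get_format = hw_get_format;                            */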

> I see three important components:
>   - dec device
>   - hw decoder (that require ^^)
>   - dec context (that require ^^)

But who creates the dec device ? Is it a (currently loading) decoder, as 
I assumed, or do we create it for the ES, before we try to load a decoder ?
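
To put that question in pseudo-code (none of these names exist in VLC, they 
only mirror the dependency chain you listed):

    #include <stdint.h>

    /* dec device -> hw decoder -> dec context, as listed above */
    typedef struct dec_device  dec_device;
    typedef struct hw_decoder  hw_decoder;
    typedef struct dec_context dec_context;

    dec_device  *dec_device_New(void *owner, uint32_t chroma /* VLC FourCC */);
    hw_decoder  *hw_decoder_New(dec_device *dev);   /* requires the device  */
    dec_context *dec_context_New(hw_decoder *dec);  /* requires the decoder */

    /* (a) the loading decoder module owns the device:
     *       decoder open -> dec_device_New(decoder, chroma) -> hw decoder
     * (b) the ES owns it, before any decoder module is probed:
     *       es -> dec_device_New(es, chroma) -> decoder open -> hw decoder */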


> The dec device is very important on Linux. It's way simpler to handle since there is generally only one working backend (on Intel with an NVIDIA GPU, the vaapi backend won't work because the driver won't be loaded, for example). This module will set up everything that is needed by the hw decoder.

Are you sure that if you plug a screen into each GPU the drivers are not 
both loaded ? On Windows, if only one GPU has a screen plugged in, the other 
is not listed for displaying/decoding. But once it's plugged in it becomes 
available, even if the window is displayed on the screen of the other GPU. 
I would assume it's the same on Linux.

It allows decoding with a strong GPU even if you're displaying on a 
different screen.

> Maybe you would need only a generic (dummy?) dec device on Windows, and use a specific dec context (which is not yet well defined)
> 
>>
>>> On Thu, Jul 4, 2019, at 08:49, Steve Lhomme wrote:
>>>> On 2019-07-04 8:23, Steve Lhomme wrote:
>>>>> The default behavior should not be an issue. It could select the decoder
>>>>> device based on the OS and whether the implementation is available. It's a
>>>>> bit heavier than the current ffmpeg VLD flavor selection as we need to open
>>>>> system resources to know if the HW acceleration is available (and yet we
>>>>> don't know exactly the codec/profile needed). On the other hand, right
>>>>> now we open a whole display module and then check that the VLD flavor works.
>>>>> So it would probably be done faster anyway.
>>>>
>>>> There's a slight difference with the current behaviour. Currently lavc
>>>> proposes various VLD flavors and we go through each to pick the right
>>>> decoder, and end up with software decoding if none is found. If we always
>>>> select the same "default" decoder device no matter the VLD flavor, we
>>>> lose the possibility of trying different decoders. So should the VLD flavor
>>>> (or more likely the VLC FourCC) be passed to the decoder device creation so
>>>> it can better decide whether it's a good match or not ?
>>>>
>>>> In the case of NVDec, it's a full-on decoder. So it will be driving the
>>>> decoder device creation; it can also iterate through possibilities like
>>>> lavc until it finds a match.
> 
> I would say keep it simple and don't automatically use nvdec on Windows since it's already working with D3D11. NVDEC is really necessary on Linux.

If it ever decodes MVC, it will at least need to be picked in that case 
rather than lavc.
