[vlc-devel] [PATCH 04/31] display: don't store the dummy context in the display anymore

Rémi Denis-Courmont remi at remlab.net
Tue Sep 24 14:41:35 CEST 2019


Hi,

Once you go back to CPU, there should be no decoder device anymore. More generally, if you change regime (to CPU or to another GPU API family), you can't expect the decoder device to keep working. It won't match anyway.

Of course, you should avoid that situation anyway, because performance will fundamentally suck no matter how the VLC core is designed.
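
To make the regime point concrete, here is a minimal C sketch of the
matching rule. Every name in it (dec_device_t, the family enum,
dec_device_matches) is a hypothetical illustration, not the actual VLC
API:

#include <stdbool.h>
#include <stddef.h>

/* Hypothetical decoder device families; DEC_DEVICE_NONE stands for the
 * pure CPU path, where no decoder device should exist at all. */
enum dec_device_family {
    DEC_DEVICE_NONE,
    DEC_DEVICE_VAAPI,
    DEC_DEVICE_D3D11,
};

typedef struct {
    enum dec_device_family family;
    void *opaque; /* VADisplay, ID3D11Device, ... */
} dec_device_t;

/* A device created for one regime is useless in another: on a regime
 * change the old device must be dropped and, if the new regime is a
 * GPU one, a new device created for that family. */
static bool dec_device_matches(const dec_device_t *dev,
                               enum dec_device_family wanted)
{
    if (wanted == DEC_DEVICE_NONE)
        return dev == NULL; /* back to CPU: no device expected */
    return dev != NULL && dev->family == wanted;
}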

On 24 September 2019 at 14:55:04 GMT+03:00, Steve Lhomme <robux4 at ycbcr.xyz> wrote:
>On 2019-09-24 11:41, Steve Lhomme wrote:
>> On 2019-09-23 19:11, Rémi Denis-Courmont wrote:
>>> On Monday, 23 September 2019 at 18.01.09 EEST, Steve Lhomme wrote:
>>>> Use the one from the vout thread if it exists.
>>>>
>>>> Later the video context will come from the decoder (if any).
>>>
>>> I still don't get why the VD should care/know about the decoder
>>> device at all.
>>> If even the VD knows about the decoder device, then what's the
>>> point of the distinct video context?
>> 
>> The hint has to work both ways. If a VAAPI decoder uses a VADisplay
>> and the VD uses another one, it may not work at all (I'm not sure
>> the same value would be used if the default display were used in
>> both cases).
>> 
>> With D3D it's the same thing: with external rendering we want the
>> VD to use the D3D device provided by the host (otherwise it just
>> cannot work). That means the decoder should also use this device,
>> and it knows about this device/hint through the "decoder device". I
>> agree that the VD could read that external D3D device by itself
>> without using the "decoder device" at all. Given that the "decoder
>> device" may not be created at all (it's created only on demand,
>> which won't happen for AV1 playback, for example), maybe I should
>> not rely on it at all and get the host D3D device by other means.
>> 
>> (It is also possible to use different D3D devices for decoding and
>> rendering, which I have local support for, but that's beside the
>> point.)
>> 
>> So IMO it all comes down to VAAPI and whether the VADisplay value
>> must be created once and used in both the decoder and the VD.
>
>I forgot an important case: a GPU decoder, then a CPU filter, then a
>GPU filter, and then the display. If we don't store the "decoder
>device" (in the filter chain owner), the CPU-to-GPU filter will need
>to create a new "decoder device" when we could use a cached one,
>matching the one used to initialize the VD (via the video context). We
>will then likely have to recreate a VD when we shouldn't have to.
>
>> I think in all cases the VADisplay created for the decoder will be
>> pushed into the video context when creating the display, so in all
>> cases it should match on both sides.
>> 
>> I'll modify my patch (and working branches) accordingly.
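
To illustrate the VADisplay point quoted above, here is a sketch of the
decoder publishing its display through the video context so the VD can
pick it up. It assumes the libva headers for the real VADisplay type;
everything else (video_context_t, vctx_set_va_display,
vd_pick_va_display) is hypothetical, not the actual VLC API:

#include <stddef.h>
#include <va/va.h>

typedef struct video_context_t {
    VADisplay va_display; /* set by whoever created the decoder device */
} video_context_t;

/* Decoder side: publish the display it decodes with. */
static void vctx_set_va_display(video_context_t *vctx, VADisplay dpy)
{
    vctx->va_display = dpy;
}

/* Display (VD) side: reuse the published display rather than opening
 * another one; only fall back when no video context was provided. */
static VADisplay vd_pick_va_display(const video_context_t *vctx)
{
    if (vctx != NULL && vctx->va_display != NULL)
        return vctx->va_display;
    return NULL; /* caller must open its own display */
}

With this, the value pushed into the video context when creating the
display is by construction the one the decoder decodes with, so the two
sides match.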
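
And for the GPU > CPU > GPU filter case quoted above, a sketch of the
caching idea: the filter chain owner keeps the decoder device it handed
out, so a later CPU-to-GPU filter reuses the cached device (the one the
VD was initialized with) instead of creating a fresh one. Again, all
the names here are hypothetical:

#include <stddef.h>

typedef struct dec_device dec_device_t; /* opaque device handle */

typedef struct {
    dec_device_t *cached_device; /* NULL until first requested */
} filter_chain_owner_t;

/* First GPU user creates the device; later users (e.g. the CPU-to-GPU
 * upload filter) get the cached one, so the VD never has to be
 * recreated because of a device mismatch. */
static dec_device_t *owner_hold_device(filter_chain_owner_t *owner,
                                       dec_device_t *(*create)(void))
{
    if (owner->cached_device == NULL)
        owner->cached_device = create();
    return owner->cached_device;
}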

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.