[vlc-devel] [PATCH 00/17] Restart the display module on format changes

Steve Lhomme robux4 at ycbcr.xyz
Mon Nov 23 13:50:16 CET 2020


On 2020-11-22 11:27, Rémi Denis-Courmont wrote:
>>> It also avoids breaking the fine
>>> and simple format negotiation and chain building that we have.
>>
>> It's not fine, because it causes issues when a filter produces pictures
>> with a chroma or size different from that of its input pictures.
> 
> A filter that gratuitously changes format is broken either way, as that
> would cause unnecessary conversions *between* filters. Building a proper
> chain is never going to work if filters force GPU<->CPU transfers or
> RGB<->YUV conversions.

This patchset does not allow filters to *gratuitously* change their output
format once it is settled, at Open.
> 
> And then, changing the size is not permissible at all in any case, as it
> would break the windowing code that handles window size and positioning.

This case already happens. You can use a deinterlace filter that outputs at 
half the height ("--deinterlace-mode=discard"). As you know, deinterlacing 
filters are added/removed dynamically during playback without any change to 
the decoder. In 3.0 and the current master, an extra filter is artificially 
added to match the format the display was originally opened with. So even 
though the filter output is half-size, the display never knows, and we waste 
CPU/GPU resources rescaling the filter output to match the previous size.

This patchset changes that behavior: instead of adding the extra filter, it 
restarts the display with the new format.
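
To make the intended behavior concrete, here is a minimal sketch of the 
decision point, in C. Everything below is a hypothetical stand-in, not the 
actual VLC API:

/* Minimal sketch only: hypothetical stand-ins, not the actual VLC API. */
#include <stdbool.h>

struct vfmt {                   /* simplified stand-in for a video format */
    unsigned chroma;            /* fourcc code, e.g. I420 vs NV12 */
    unsigned width, height;
};

struct vout_state {
    struct vfmt display_fmt;    /* format the display was opened with */
};

static bool vfmt_equals(const struct vfmt *a, const struct vfmt *b)
{
    return a->chroma == b->chroma && a->width == b->width
        && a->height == b->height;
}

/* Hypothetical helpers standing in for the real display open/close paths. */
static void display_close(struct vout_state *s) { (void)s; }
static void display_open(struct vout_state *s, const struct vfmt *fmt)
{
    s->display_fmt = *fmt;
}

/* Called once the filter chain has settled on its output format, e.g.
 * after "--deinterlace-mode=discard" halves the height. */
static void on_filters_settled(struct vout_state *s, const struct vfmt *out)
{
    if (vfmt_equals(out, &s->display_fmt))
        return;                 /* format unchanged, keep the display */

    /* 3.0/current master would append a scaler here so the display keeps
     * seeing its original format. With this patchset the display is
     * instead restarted with the format the last filter produces: */
    display_close(s);
    display_open(s, out);
}

The rescaling step disappears entirely: the display is simply reopened 
against the format that actually reaches it.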

> Whatever change of resolution is necessary has to be known up-front, e.g.
> to handle reorientation, *or* an inverse SAR adjustment is necessary to
> compensate.
> 
> So those are both bogus scenarios.
> 
>> It would be more
>> straightforward to just create the vout display with the format the last
>> filter (if any) produces.
> 
> Of course not. That's exactly the wrong thing to do, as it requires each
> and every display to ship its own conversion/import for however many
> formats,

This conversion is already done, in the form of the "upload to GPU texture" 
code. It's just internal to each display module.

> *and* it does not play nice with negotiation and filter chain building.

A negotiation which is way too tied to CPU formats.

>>>> The display sometimes does not have a "most convenient format". For
>>>> example, the OpenGL display doesn't care whether it receives I420, NV12
>>>> or RGBA. Adding a converter to match the decoder format (which is
>>>> meaningless) is a pure waste.
>>>
>>> Lol. Native I420 in OpenGL? And you call me stupid?
>>
>> I never said "native in OpenGL"; I was talking about the vout display.
>> The vout display does not care about the input format, because it always
>> uses an interop to import the picture.
>>
>> Btw, currently, if the decoder produces I420 pictures and the filter
>> chain outputs RGBA pictures, then a converter from RGBA to I420 must be
>> added so that OpenGL receives I420 (while its "native" format is RGBA).
> 
> Self-contradiction much? In your previous mail, I420 was one of the most
> convenient formats for OpenGL to receive, alongside RGBA, but now it no
> longer is.
> 
> In all my stupidity, I'd think that the OpenGL display wants RGBA GL
> textures, much like the VDPAU display wants VDPAU output surfaces,
> regardless of what comes in.
> 
> -- 
> Реми Дёни-Курмон
> http://www.remlab.net/