[vlc-devel] [PATCH v2 17/18] video_output: restart display after filter change

Rémi Denis-Courmont remi at remlab.net
Wed Nov 25 16:55:55 CET 2020


On Wednesday, 25 November 2020 at 09:17:53 EET, Steve Lhomme wrote:
> On 2020-11-24 16:31, Rémi Denis-Courmont wrote:
> > This seems to have the exact same problem as the previous version. It is
> > very much intended that we try to adapt the display conversion chain to
> > changes in the chroma or video format properties that are not allowed to
> > change for the display lifetime.
> > 
> > This allows swapping one conversion filter for another without messing
> > up the display. Restarting the display should only be done as an
> > absolute last resort.
> I agree. But it is not the job of the converter (be it in
> osys->converter or filter.chain_interactive) to *cancel* format changes
> done by a filter. Some simple examples:
> 
> * a deinterlacer outputs half the frame size; a converter will
> artificially double that frame size back for no reason, when the display,
> with a better resizing algorithm (and using fewer resources), could do
> the same.

On a scaled display, the old scaler from video resolution to window resolution 
is dropped, and a new scaler from the halved resolution is added. No problem 
there.

On a non-scaled display, in the pull model, we would have needed a new pool. 
We couldn't handle that, so we ended up keeping the display as it was and 
inserting a scaler. That's why, in 3.0, the transform filter causes a scaler 
to be (incorrectly) added.

But now, in the push model, shrinking the resolution is not a problem at all. 
This is just a change of both the source A/R and the source crop. Increasing 
the resolution might need small fixes in a few displays if they make 
assumptions about i_width and i_height, but that's pretty much it - the 
"physical" format size has no real meaning in the push model.
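
To make that concrete, here is a minimal sketch of what such an update could 
look like with VLC's video_format_t and vlc_ureduce(); the helper name and 
the exact policy are hypothetical, not code from this patch set:

    #include <vlc_common.h>
    #include <vlc_es.h>

    /* Hypothetical helper: apply a new output resolution from a filter
     * without restarting the display. Only the visible area and the
     * sample aspect ratio change; i_width/i_height (the "physical"
     * buffer size) are left alone, since they have no real meaning
     * for the display in the push model. */
    static void UpdateSourceSize(video_format_t *fmt, unsigned w, unsigned h)
    {
        /* Keep the display aspect ratio constant:
         * DAR = visible_width * sar_num / (visible_height * sar_den) */
        uint64_t num = (uint64_t)fmt->i_sar_num * fmt->i_visible_width * h;
        uint64_t den = (uint64_t)fmt->i_sar_den * fmt->i_visible_height * w;
        vlc_ureduce(&fmt->i_sar_num, &fmt->i_sar_den, num, den, 0);

        fmt->i_x_offset = fmt->i_y_offset = 0;
        fmt->i_visible_width = w;
        fmt->i_visible_height = h;
    }

For the deinterlacer case above, a 1920x1080 1:1 source halved to 1920x540 
ends up with a 1:2 SAR, and the display scales it back with whatever 
algorithm it has.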

> * adding a GPU filter to a CPU-based pipeline will force a GPU-to-CPU
> conversion at the end of the filter chain to match the CPU format that
> was originally given to the display, when in fact it could do a lot
> better with the GPU source.

First, CPU filters explicitly refuse to start when the input is in GPU memory, 
for obvious performance reasons. This case is impossible.
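
For the record, that refusal is just a matter of looking at the chroma 
description at open time. A minimal sketch (VLC 3.x-style module callback; 
the body is illustrative, not taken from an actual filter):

    #include <vlc_common.h>
    #include <vlc_filter.h>
    #include <vlc_fourcc.h>

    static int Open(vlc_object_t *obj)
    {
        filter_t *filter = (filter_t *)obj;
        const vlc_chroma_description_t *desc =
            vlc_fourcc_GetChromaDescription(filter->fmt_in.video.i_chroma);

        /* Opaque chromas (VAAPI, VDPAU, D3D11...) have no CPU-mappable
         * planes, so a CPU filter has nothing useful it can do with them. */
        if (desc == NULL || desc->plane_count == 0)
            return VLC_EGENERIC; /* refuse GPU input */

        /* ...normal CPU filter initialisation would follow here... */
        return VLC_SUCCESS;
    }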

And second, even if a filter did accept this, you would need two conversions 
(GPU->CPU and CPU->GPU), and that is exactly what you get. The first one will 
be done by the filter chain in front of the SW filter, and the second one by 
the display converter in front of the display.
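
Schematically (decoder to display, a sketch of that scenario assuming a GPU 
source and a GPU-capable display):

    GPU pictures --[GPU->CPU]--> SW filter --[CPU->GPU]--> display
                  (filter chain)            (display converter)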

-- 
Rémi Denis-Courmont