[vlc-devel] nvdec GPU to CPU converter

Steve Lhomme robux4 at ycbcr.xyz
Tue Mar 3 08:30:14 CET 2020


Hi Quentin,

On 2020-03-02 18:56, Quentin Chateau wrote:
> Hi,
> 
> You recently fixed vout_DisableWindow to not release the decoder device 
> on failure. Thanks to this patch, the NVDEC decoder can now correctly 

You're the one who did all the work ;)

> fallback to a non-opaque chroma (CPU buffer) if the video initialization 
> using an opaque chroma fails.
> 
>      for (chroma_idx = 0; output_chromas[chroma_idx] != 0; chroma_idx++)
>      {
>          p_dec->fmt_out.i_codec = p_dec->fmt_out.video.i_chroma = output_chromas[chroma_idx];
>          result = decoder_UpdateVideoOutput(p_dec, p_sys->vctx_out);
>          if (result == VLC_SUCCESS)
>          {
>              msg_Dbg(p_dec, "using chroma %4.4s", (char*)&p_dec->fmt_out.video.i_chroma);
>              break;
>          }
>          msg_Warn(p_dec, "Failed to use output chroma %4.4s", (char*)&p_dec->fmt_out.video.i_chroma);
>      }
> 
> Previously, only the first chroma of the output_chromas array could 
> succeed due to the vout_DisableWindow bug.
> 
> It seems to me the GPU-to-CPU video filter (the chroma.c file in the 
> same directory as nvdec.c) is now useless: the decoder can already 
> output either CPU or GPU buffers. Worse than that, its very existence 
> degrades performance in the case where a CPU buffer is used as output.
> 
> Currently, even if the pipeline requires a CPU buffer as output, the 
> decoder will output GPU buffers and the GPU-to-CPU video filter will be 
> used: one GPU-to-GPU copy (decoder) plus one GPU-to-CPU copy (filter) 
> are made.
> 
> With the video filter disabled, the behavior becomes: the first call to 
> decoder_UpdateVideoOutput fails, then the decoder retries with a 
> non-opaque chroma and succeeds. In this configuration, only one 
> GPU-to-CPU copy is performed, by the decoder itself. The pipeline 
> performance is objectively better (one less GPU-to-GPU copy), but a lot 
> of error logs are printed during the decoder's first attempt to use the 
> opaque chromas.
> 
>  1. Do you see any reason to keep the GPU-to-CPU video filter ?

Yes. Even if playback initially uses the GPU output, the user may add a 
CPU-based filter during playback and a conversion will then be required. 
The decoder knows nothing about that and will not change its output.
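
For reference, the expensive part of that converter is basically one 
pitched device-to-host copy per plane. A rough sketch with the CUDA 
driver API (this is not the actual chroma.c code; the function and 
parameter names are illustrative):

    #include <cuda.h>

    /* Copy one plane of a decoded picture from GPU memory to a CPU
     * buffer. Source and destination pitches may differ, hence the
     * 2D copy. */
    static CUresult copy_plane_to_host(CUdeviceptr src, size_t src_pitch,
                                       void *dst, size_t dst_pitch,
                                       size_t width_bytes, size_t height)
    {
        CUDA_MEMCPY2D cfg = {
            .srcMemoryType = CU_MEMORYTYPE_DEVICE,
            .srcDevice     = src,
            .srcPitch      = src_pitch,
            .dstMemoryType = CU_MEMORYTYPE_HOST,
            .dstHost       = dst,
            .dstPitch      = dst_pitch,
            .WidthInBytes  = width_bytes,
            .Height        = height,
        };
        return cuMemcpy2D(&cfg);
    }

That copy is unavoidable as soon as something on the CPU side needs to 
touch the pixels; the question is only whether the decoder or a separate 
filter performs it.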

>  2. I think a --nvdec-prefer-opaque/--no-nvdec-prefer-opaque could be a
>     good addition if one knows in advance which chroma type will be
>     used. This would avoid the wall of error logs in certain
>     circumstances. What do you think ?

The software output from the decoder only exists as a convenience. In an 
ideal world the decoder would only output NVDEC chromas and the 
conversion would be done elsewhere. Maybe we can remove that part now 
that the push model is (mostly) done.
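
That said, if the CPU output stays and such an option is added, the 
decoder side would only need to consult it before trying the opaque 
chromas. A minimal sketch (the option name is hypothetical; only the 
standard var_InheritBool() call is assumed):

    bool prefer_opaque = var_InheritBool(p_dec, "nvdec-prefer-opaque");

    for (chroma_idx = 0; output_chromas[chroma_idx] != 0; chroma_idx++)
    {
        /* when the user asked for CPU output, skip the opaque (GPU)
         * chromas instead of letting them fail with error logs */
        if (!prefer_opaque &&
            output_chromas[chroma_idx] == VLC_CODEC_NVDEC_OPAQUE)
            continue;
        /* ... same decoder_UpdateVideoOutput() loop as above ... */
    }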

One of the issues to take care of is that the CPU fallback shouldn't use 
I420 first: the core will find converters to make it work, but with 
extra filters and copies. For most hardware decoders the best fallback 
to try is NV12 (and P010/P016). I don't think the core is ready for that 
yet. Same thing with 444 output.
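
To illustrate, a CPU fallback list matching the native decoder output 
for an 8-bit 4:2:0 stream would look something like this (illustrative 
only; the actual array built in nvdec.c may differ):

    static const vlc_fourcc_t output_chromas[] = {
        VLC_CODEC_NVDEC_OPAQUE, /* opaque GPU buffers, no copy to host */
        VLC_CODEC_NV12,         /* same layout as the decoder output */
        VLC_CODEC_I420,         /* last resort, needs an extra conversion */
        0
    };

The point is that NV12 (or P010/P016 for higher bit depths) should be 
tried before I420, otherwise the core inserts converters that bring back 
the extra filters and copies mentioned above.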

