[vlc-devel] VLC decoder flushing?

John Michael Zorko j.zorko at att.net
Mon Aug 11 20:10:58 CEST 2003


Hello, all ...

The current VLC has an issue I'm trying to work on: if you set one 
VLC instance up as a server and stream to another VLC instance (on the 
same or a different machine), then seeking on the server VLC makes the 
client VLC's video get jumbled up for a bit before it recovers.  The 
same behavior doesn't happen when VLC reads the media from a file 
rather than the network, and it doesn't happen if the server VLC is 
paused before the seek.
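For context on why the video gets jumbled: in a predictive codec, P-frames are decoded as deltas against previously decoded frames, so after an unannounced seek the client decoder keeps applying deltas to reference data from the old position until the next keyframe arrives. The toy C model below illustrates that mechanism only; none of these names are VLC API.

```c
#include <assert.h>

/* Toy model of a predictive decoder (illustrative only, not VLC API):
 * an I-frame carries an absolute value, a P-frame carries a delta
 * against the previously decoded frame. */
typedef struct { int is_keyframe; int value; } frame_t;
typedef struct { int last; } decoder_t;

static int decode(decoder_t *d, const frame_t *f) {
    if (f->is_keyframe)
        d->last = f->value;      /* clean restart point */
    else
        d->last += f->value;     /* depends on prior decoder state */
    return d->last;
}

/* Decode two frames from one position, "seek", then feed a P-frame
 * from the new position without flushing.  Returns the wrong value
 * the decoder produces from its stale state. */
static int demo_seek_without_flush(void) {
    decoder_t d = { 0 };
    frame_t i1 = { 1, 10 }, p1 = { 0, 1 };   /* old position */
    frame_t p2 = { 0, 1 };   /* new position, mid-GOP; the correct
                                decoded value there would be 51 */
    decode(&d, &i1);
    decode(&d, &p1);
    /* seek happens here; no flush, no wait for a keyframe */
    return decode(&d, &p2);  /* 12, not 51: the "jumbled" output */
}
```

The decoder only resynchronizes when the next I-frame arrives, which matches the observed behavior of the artifact clearing up after a moment.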

Since we use input_Seek() for other things as well (fast-forward / 
rewind), we would like to avoid these visual artifacts.  Pausing the 
stream around each seek is not really an option for us, because 
unpausing takes too long (even with --udp-caching set low (100) on the 
client VLC and --udp-sout-caching set low (100) on the server VLC, the 
client sometimes takes 2-3 seconds to respond after the server VLC is 
unpaused), and we seek a lot.

So, I'm exploring the following:

1. Are there certain points in the media we can seek to that don't 
exhibit this artifact?  Even if I seek to a sequence header, or to the 
pack header before the sequence header, the artifact is still there 
when streaming.

2. Is there a function I can call after input_Seek() that will flush 
the decoder, or otherwise make these artifacts less likely?
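Regarding question 2, whatever function (if any) exposes this, the flush itself would need to do two things: discard the decoder's prediction state, and drop incoming frames until the next keyframe gives a clean starting point. The sketch below shows that pattern; the names are hypothetical, not real VLC API.

```c
#include <assert.h>

/* Sketch of a post-seek decoder flush.  All names are hypothetical;
 * this only illustrates what such a flush would need to do. */
typedef struct {
    int have_reference;   /* do we hold a valid reference frame? */
    int waiting_for_key;  /* drop everything until next keyframe */
} dec_state_t;

/* Throw away prediction state and refuse non-key frames until the
 * stream provides a clean restart point. */
static void decoder_flush(dec_state_t *d) {
    d->have_reference = 0;
    d->waiting_for_key = 1;
}

/* Returns 1 if the frame may be decoded, 0 if it must be dropped
 * because it would predict from stale reference data. */
static int decoder_accept(dec_state_t *d, int is_keyframe) {
    if (d->waiting_for_key && !is_keyframe)
        return 0;
    d->waiting_for_key = 0;
    d->have_reference = 1;
    return 1;
}
```

Calling something like decoder_flush() right after input_Seek() would trade the jumbled frames for a brief freeze until the next keyframe, which may be the better artifact.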

Regards,

John

Falling You - exploring the beauty of voice and sound
New EP, "hope thrown down," available now at
http://www.mp3.com/fallingyou

-- 
This is the vlc-devel mailing-list, see http://www.videolan.org/vlc/
To unsubscribe, please read http://developers.videolan.org/lists.html
If you are in trouble, please contact <postmaster at videolan.org>