[vlc-devel] Inference from RTP Packets
tejasa at tataelxsi.co.in
Mon Dec 8 11:34:41 CET 2008
We have a server that streams video in several formats (no audio) over RTP, and we chose VLC as the client for playback. When data is streamed from the server, I observe inconsistent behaviour in VLC while playing the video: it sometimes crashes and sometimes plays correctly. My understanding is that VLC uses RTP header fields (timestamp, payload type, etc.) to derive the frame rate and buffer sizes and to configure the decoder (FFmpeg/x264).
I have gone through the list archives but didn't find any clue. Can someone tell me how VLC allocates its buffers to hold frames, and how it derives the presentation timestamp?
To my knowledge, the RTP timestamp is derived from the basic formula:

    timestamp = presentation_time * clock_rate;
I have calculated the timestamp accordingly, since I know exactly what frame rate I am pumping into the network.
This holds good for low-resolution video, but for HD, VLC seems to be inconsistent, so I suspect the buffer handling in VLC.
Can someone please tell me which key parameters help VLC gracefully handle video in the different formats (H.264, MJPEG, MPEG-4), and where they are derived from: the RTP headers, the packetizer headers, or the coded video itself? And how should I take care of each case?