[x264-devel] x264 encoding delay

Bachhuber, Christoph christoph.bachhuber at tum.de
Thu Sep 1 09:33:51 CEST 2016


Hi,
we are using the x264 encoder for low-latency video streaming. To the best of our knowledge, we have set all parameters to latency-optimized values:

x264_param_default_preset(&x264params, "ultrafast", "zerolatency,fastdecode");
x264_param_apply_profile(&x264params, "baseline");
x264params.i_threads=1;
x264params.i_keyint_max = 1;
// Fixed QP: clamp the rate control to a single QP value
x264params.rc.i_qp_max = m_qp;
x264params.rc.i_qp_min = m_qp;
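
As an aside, if we understand x264's rate-control options correctly, clamping i_qp_min/i_qp_max only pins the QP while the default rate control keeps running; the explicit way to request constant QP is X264_RC_CQP. A minimal sketch of what we mean (setup_cqp is just an illustrative helper, not part of our code):

#include <x264.h>

static void setup_cqp(x264_param_t *p, int qp)
{
    x264_param_default_preset(p, "ultrafast", "zerolatency,fastdecode");
    p->rc.i_rc_method   = X264_RC_CQP;  /* constant-QP rate control */
    p->rc.i_qp_constant = qp;           /* QP used for every frame */
    x264_param_apply_profile(p, "baseline");
}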

We encode the video, stream it via UDP over a gigabit network that is almost free of cross-traffic to another PC, where it is decoded using libavcodec. The problem is that the individual delays do not add up to the total we observe. Encoding time, measured via

auto start_time = std::chrono::high_resolution_clock::now();
frame_size = x264_encoder_encode(encoder, &nals, &num_nals, &pic_in, &pic_out);
auto end_time = std::chrono::high_resolution_clock::now();
// elapsed wall-clock time of the encode call
auto encode_time = std::chrono::duration_cast<std::chrono::microseconds>(end_time - start_time);

gives an encoding delay of 1 to 2 milliseconds for our VGA-resolution color frames at 240 Hz, depending on the image content. Decoding takes less than 1 ms, transmission of the encoded data less than 0.1 ms, and ping times are around 0.2 ms. However, we see roughly 10 ms of delay from the start of encoding until decoding is finished. This raises the question of where the extra delay comes from: does the encoder or decoder buffer frames without our knowledge? Is our way of measuring the delays, shown above, correct? Or is the OS network stack responsible for the additional delay?
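
To rule out buffering inside the codecs ourselves, our current plan is to query the encoder's internal queue and to force low-delay decoding; a sketch of what we have in mind (report_encoder_queue and configure_low_delay are names we made up for illustration):

#include <stdio.h>
#include <x264.h>
#include <libavcodec/avcodec.h>

/* Sender side: with tune=zerolatency the encoder should queue no
 * frames; a nonzero count here would explain hidden latency. */
static void report_encoder_queue(x264_t *encoder)
{
    printf("frames buffered in encoder: %d\n",
           x264_encoder_delayed_frames(encoder));
}

/* Receiver side: frame threading in libavcodec adds roughly one frame
 * of delay per extra thread, so request the low-delay flag and restrict
 * the decoder to slice threading. */
static void configure_low_delay(AVCodecContext *ctx)
{
    ctx->flags       |= AV_CODEC_FLAG_LOW_DELAY;
    ctx->thread_type  = FF_THREAD_SLICE;
}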

Thank you in advance for your answers!

Cheers,
Christoph Bachhuber


