Hi --

I'm trying to figure out how to use rate control properly in x264. I'm encoding a stream whose frame rate changes periodically (it's a webcam that drops its frame rate while re-focusing). How does x264 apply rate control when it doesn't know in advance how many frames it will receive over time? When I set rc.i_bitrate I do see the stream's bitrate change, but it always overshoots by some amount -- e.g., if I ask for 2500 kbps my stream comes out around 3500 kbps, and if I ask for 500 kbps it comes out around 1000 kbps. It gets closer, but never quite hits the target. I assume this comes down to not setting up the x264_param_t struct correctly, but I'm not sure which fields I need to set.
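For reference, here's a stripped-down version of how I'm opening the encoder today. The numbers are just examples, and the VBV lines are my guess at what ABR might need to actually hold the target -- I haven't confirmed that:

    #include <x264.h>

    x264_param_t param;
    x264_param_default_preset(&param, "veryfast", "zerolatency");
    param.i_width  = 640;
    param.i_height = 480;
    param.i_csp    = X264_CSP_I420;

    /* Target an average bitrate; do I also need VBV to keep it honest? */
    param.rc.i_rc_method       = X264_RC_ABR;
    param.rc.i_bitrate         = 2500;  /* kbit/s */
    param.rc.i_vbv_max_bitrate = 2500;  /* guess: cap the peaks */
    param.rc.i_vbv_buffer_size = 2500;  /* guess: ~1 second of buffer */

    x264_t *h = x264_encoder_open(&param);

Is plain ABR without VBV expected to overshoot like this, or is the variable frame rate the whole story?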
I am already using low-latency mode -- is there anything specific I should be setting? b_vfr_input, i_fps_num/i_fps_den, and i_timebase_num/i_timebase_den look like the relevant fields, but I'm at a loss as to how to set them correctly for my source stream.
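My best guess from reading x264.h is something like the following, set before x264_encoder_open() -- the millisecond timebase is purely my assumption, and I don't know whether i_fps_num/i_fps_den still matter once b_vfr_input is on:

    /* Tell x264 the input is VFR and give it a real timebase.
     * The timestamps I pass in i_pts would then be in these units. */
    param.b_vfr_input    = 1;
    param.i_timebase_num = 1;
    param.i_timebase_den = 1000;  /* assumption: millisecond timestamps */

    /* Nominal rate -- presumably just a hint once b_vfr_input is set? */
    param.i_fps_num = 30;
    param.i_fps_den = 1;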
Also -- do I need to set the picture's i_pts (i_dts looks like it's filled in by the encoder on output) to account for the variable frame rate? Currently I simply start at 0 and increment by 1 for each picture; I assume I need to set it differently to let the encoder know the frame rate has changed.
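In other words, I'm guessing the per-frame code should look roughly like this, where capture_time_ms() is a stand-in for however I end up reading the webcam's capture clock:

    x264_picture_t pic, pic_out;
    x264_nal_t *nal;
    int i_nal;

    x264_picture_alloc(&pic, X264_CSP_I420, param.i_width, param.i_height);
    /* ... copy the webcam frame into pic.img.plane[0..2] ... */

    /* Real capture timestamps in timebase units (ms, per the assumption
     * above), instead of the 0, 1, 2, ... counter I use today. */
    pic.i_pts = capture_time_ms();

    int frame_size = x264_encoder_encode(h, &nal, &i_nal, &pic, &pic_out);

Is that the right model, or does x264 derive the frame timing some other way?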
Thanks!

--
dennis