[x264-devel] Degrading video quality over time
Jason Garrett-Glaser
darkshikari at gmail.com
Mon May 31 23:50:56 CEST 2010
On Mon, May 31, 2010 at 1:28 PM, Jan Willamowius <jan at willamowius.de> wrote:
> Jason Garrett-Glaser wrote:
>> > // Rate control set to CBR mode
>> > context.rc.i_rc_method = X264_RC_ABR;
>> > context.rc.f_rate_tolerance = 1;
>> > context.rc.i_vbv_max_bitrate = 0;
>> > context.rc.i_vbv_buffer_size = 0;
>> > context.rc.i_lookahead = 0;
>> > context.rc.i_bitrate = 768;
>>
>> That's not CBR mode, that's ABR mode! To set CBR, you need to set
>> bufsize and maxrate. *THAT* is why you're not getting the results you
>> expect.
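(For reference, roughly what a true CBR setup looks like with the same
x264_param_t fields; the one-second VBV buffer here is an assumption,
size it for your latency budget:)

  // CBR in x264: ABR plus a VBV cap, with maxrate equal to the target bitrate
  context.rc.i_rc_method       = X264_RC_ABR;
  context.rc.i_bitrate         = 768;   // kbit/s target
  context.rc.i_vbv_max_bitrate = 768;   // kbit/s, same as i_bitrate for CBR
  context.rc.i_vbv_buffer_size = 768;   // kbit, roughly one second (assumed)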
>
> The comment is actually the error; ABR is OK.
> The problem seems to be that x264 in ABR mode drops the bitrate to
> well under the set limit when dealing with my easily compressible
> still images.
>
> As an experiment, I set x264 to CRF mode instead
> _context.rc.i_rc_method = X264_RC_CRF;
> _context.rc.f_rf_constant = 16.0;
>
> This gives great video quality, as expected. When I transmit still
> images, the bitrate in CRF mode is even a bit below the bitrate of ABR
> (both well under the set limit in the stats from the receiver). When I
> feed it moving pictures, the bitrate goes through the roof with this
> CRF setting.
>
> Could it be that x264's estimate of how much bitrate it is currently
> producing is wrong (buggy?) and it just starts to reduce the bitrate
> too early in my case?
ABR is generally not suitable for streaming. But either way, the most
likely answer is that you are passing x264 invalid timestamps, which
makes any bitrate-based ratecontrol give invalid results.
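For illustration, a sketch of the kind of timestamp setup that keeps
bitrate-based ratecontrol honest (x264 API field names; the 90 kHz
timebase, 30 fps input and frame counter are assumptions):

  // Assumed setup: constant 30 fps input on a 90 kHz timebase.
  context.i_timebase_num = 1;
  context.i_timebase_den = 90000;
  context.i_fps_num      = 30;
  context.i_fps_den      = 1;

  // Per input frame: i_pts must increase monotonically, in timebase units.
  x264_picture_t pic;
  memset(&pic, 0, sizeof(pic));      // plus the usual plane/stride/csp setup
  pic.i_pts = frame_number * 3000;   // 90000 / 30 ticks per frame

If i_pts stands still or jumps around, the measured bitrate does too.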
Dark Shikari