OK... is this an added optimization in the x264 codec, or part of the standard? Can it be avoided in any way?

Thanks.

On Fri, May 14, 2010 at 12:45 PM, Jason Garrett-Glaser <darkshikari@gmail.com> wrote:
> On Fri, May 14, 2010 at 9:05 AM, kumar gokhare <kumar.gokhare@gmail.com> wrote:
>> Hello,
>> I have a small question regarding the coefficients that are fed to the
>> CAVLC encoder. If I change the codewords of the "level_token" table used
>> to encode these coefficients, they seem to change, i.e., I get a
>> different set of coefficients fed to the CAVLC encoder for different
>> CAVLC "level" tables. Is this expected?
>
> Yes, because different codewords change bit costs, which changes the
> modes that RD decides on, which changes the coefficients used.
>
> Dark Shikari
> _______________________________________________
> x264-devel mailing list
> x264-devel@videolan.org
> http://mailman.videolan.org/listinfo/x264-devel
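For anyone following along, here is a minimal sketch of the rate-distortion trade-off Jason describes: the encoder picks the mode minimizing distortion + lambda * bits, so if altered level_token codewords change the bit cost of a mode's coefficients, a different mode can win the comparison and a different set of coefficients reaches the CAVLC stage. The numbers and structures below are made up for illustration and are not x264's actual mode-decision code.

#include <stdio.h>

typedef struct {
    const char *name;
    int distortion;  /* e.g. SSD between source and reconstruction */
    int bits;        /* bits needed to code this mode's coefficients */
} mode_cost;

/* RD cost: distortion plus lambda-weighted rate. */
static int rd_cost(mode_cost m, int lambda)
{
    return m.distortion + lambda * m.bits;
}

int main(void)
{
    const int lambda = 4;  /* hypothetical Lagrange multiplier */

    /* Two candidate modes with made-up costs.  Mode B reconstructs
     * better, but its coefficients cost more bits under the original
     * level tables, so mode A wins the RD comparison. */
    mode_cost a = { "mode A", 120, 20 };  /* cost 120 + 4*20 = 200 */
    mode_cost b = { "mode B", 100, 28 };  /* cost 100 + 4*28 = 212 */

    printf("original tables: %s wins\n",
           rd_cost(a, lambda) <= rd_cost(b, lambda) ? a.name : b.name);

    /* Suppose altered level_token codewords make mode B's coefficients
     * cheaper to code: the RD winner flips, so a different set of
     * coefficients ends up being fed to the CAVLC encoder. */
    b.bits = 22;                          /* cost 100 + 4*22 = 188 */

    printf("altered tables:  %s wins\n",
           rd_cost(a, lambda) <= rd_cost(b, lambda) ? a.name : b.name);

    return 0;
}

With the first set of costs mode A wins (200 vs. 212); after the rate change mode B wins (200 vs. 188), which is the same effect you are observing with different level tables.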