[x264-devel] bug: incorrect 8bit -> 10bit conversion
madshi
madshi at gmail.com
Sat Jul 30 15:35:24 CEST 2011
Hey,
I've just been sent a test pattern that was encoded to 10-bit with x264. The
original 8-bit test pattern has these black and white YCbCr values:
8-bit black: Y: 16, Cb: 128, Cr: 128
8-bit white: Y: 235, Cb: 128, Cr: 128
I've now checked the raw decoded output of the 10-bit video file encoded
with x264 (using the ffmpeg/libav decoder). I'm getting these values:
10-bit black: Y: 64, Cb: 514, Cr: 514
10-bit white: Y: 943, Cb: 514, Cr: 514
The 8-bit -> 10-bit conversion is clearly incorrect. My guess is that the
conversion is done as "/ 255 * 1023", i.e. by rescaling to the full code range.
However, the correct conversion is a simple << 2 shift, since the 10-bit
reference levels are defined as exactly four times their 8-bit counterparts.
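To double-check, here is a small standalone C snippet (my own illustration, not
x264 source; the names are made up) that applies both mappings, with rounding,
to the three reference levels:

#include <stdio.h>

int main(void)
{
    const int levels[3] = { 16, 128, 235 };   /* 8-bit black, achromatic, peak */
    for (int i = 0; i < 3; i++) {
        int v        = levels[i];
        int rescaled = (int)(v * 1023.0 / 255.0 + 0.5);  /* rounded "/ 255 * 1023" */
        int shifted  = v << 2;                           /* simple << 2 shift */
        printf("8-bit %3d -> rescale %4d, shift %4d\n", v, rescaled, shifted);
    }
    return 0;
}

The rescale prints 64/514/943, which matches exactly what I'm seeing from the
decoder, while the shift prints 64/512/940, matching the BT.709 levels.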
As a reference, the BT.709 specification says:
Black level, 8-bit: 16, 10-bit: 64
Achromatic, 8-bit: 128, 10-bit: 512
Nominal peak, 8-bit: 235, 10-bit: 940
The incorrect conversion means that peak white is encoded as 943/514/514
instead of the correct 940/512/512. Furthermore, "/ 255 * 1023" maps the
uniformly spaced 8-bit steps to unevenly spaced 10-bit codes, which introduces
a bit of banding. Neither problem would occur with a simple << 2 shift.
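To put a number on the uneven spacing, here is one more small sketch (again my
own illustration, not x264 code) that counts the step sizes the rescale
produces across the whole 8-bit range:

#include <stdio.h>

int main(void)
{
    int steps4 = 0, steps5 = 0;
    for (int v = 1; v < 256; v++) {
        int prev = (int)((v - 1) * 1023.0 / 255.0 + 0.5);
        int cur  = (int)(v * 1023.0 / 255.0 + 0.5);
        if (cur - prev == 4) steps4++; else steps5++;
    }
    /* With << 2 every one of the 255 steps is exactly 4 codes wide. */
    printf("rescale: %d steps of 4 codes, %d steps of 5 codes\n", steps4, steps5);
    return 0;
}

The rescale yields a handful of 5-code steps among the 4-code ones, so the
8-bit levels are no longer evenly spaced after conversion, whereas the shift
keeps them perfectly uniform.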
Best regards, madshi.