[x265] New entropy coding - faster than Huffman, compression rate like arithmetic
Tom Vaughan
tom.vaughan at multicorewareinc.com
Mon Dec 30 21:38:37 CET 2013
Dr. Duda,
Thank you for the suggestion. This work is very interesting. As Derek
Buitenhuis mentioned, the challenge in applying this technique to HEVC
compression is that it would first need to be adopted by the Joint
Collaborative Team on Video Coding
(http://www.itu.int/en/ITU-T/studygroups/2013-2016/16/Pages/video/jctvc.aspx)
as part of the HEVC standard before it could be supported by HEVC encoders
and decoders.
The x265 project welcomes collaboration with academic institutions. I will
reach out to you directly to see if there are other ways that we may be able
to assist you with your research.
Best regards,
Tom Vaughan
-----Original Message-----
From: x265-devel [mailto:x265-devel-bounces at videolan.org] On Behalf Of
Jaroslaw Duda
Sent: Sunday, December 29, 2013 4:50 PM
To: x265-devel at videolan.org
Subject: [x265] New entropy coding - faster than Huffman, compression rate
like arithmetic
Hello,
There is a new approach to entropy coding (no patents): Asymmetric Numeral
Systems (ANS). The current implementation decodes about 50% faster than
Huffman coding (a single table use per symbol from a large alphabet), while
giving compression rates like arithmetic coding:
https://github.com/Cyan4973/FiniteStateEntropy
The idea is somewhat similar to arithmetic coding, but instead of two numbers
defining the current state (a range), it uses only a single natural number
(containing lg(state) bits of information). Thanks to this much smaller
space of states, the entire coding behavior can be put into a small table.
Here is a poster explaining the basic concepts and the difference from
arithmetic: https://dl.dropboxusercontent.com/u/12405967/poster.pdf
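To make the single-state idea concrete, here is a minimal round-trip sketch of the range variant (rANS) in Python, using an unbounded integer as the state and no renormalization or tables. All names are illustrative and do not come from the FiniteStateEntropy code:

```python
# Minimal rANS sketch: the whole coder state is one natural number x,
# which carries roughly lg(x) bits of information.

def build_cum(freqs):
    """Cumulative frequency start for each symbol, plus the total."""
    total = sum(freqs.values())
    cum, c = {}, 0
    for s in sorted(freqs):
        cum[s] = c
        c += freqs[s]
    return total, cum

def encode(msg, freqs):
    total, cum = build_cum(freqs)
    x = 1  # initial state
    for s in reversed(msg):  # encode in reverse so decoding runs forward
        f, c = freqs[s], cum[s]
        # Push ~lg(total/f) bits of information about s into the state.
        x = (x // f) * total + (x % f) + c
    return x

def decode(x, freqs, n):
    total, cum = build_cum(freqs)
    out = []
    for _ in range(n):
        r = x % total  # the slot r identifies the symbol
        for s in sorted(freqs):
            if cum[s] <= r < cum[s] + freqs[s]:
                break
        f, c = freqs[s], cum[s]
        x = f * (x // total) + r - c  # exact inverse of the encode step
        out.append(s)
    return out
```

The tabled variant (tANS, as in FiniteStateEntropy) precomputes these encode/decode steps for a bounded state range, so each symbol costs a single table lookup instead of the divisions above.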
I would like to propose a discussion about the possibility of applying it in
video compression, maybe adding it to H.265 - it should be about 10 times
faster than CABAC while providing a similar compression rate.
The cost is that we need to store coding tables for a given probability
distribution: say 1-16 kB for a 256-symbol alphabet.
For different contexts (probability distributions) we can build separate
tables and switch between them - the memory cost is the number of distinct
contexts/distributions times a few kilobytes.
So if the number of distinct contexts used for given coding parameters is
smaller than, say, a thousand, ANS can provide a huge speedup there.
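The memory trade-off above is easy to quantify. A hypothetical back-of-the-envelope check, with the per-table size and context count purely assumed for illustration:

```python
# Illustrative memory estimate for per-context tANS tables.
# Both numbers below are assumptions, not measurements:
table_bytes = 8 * 1024   # assume ~8 kB per 256-symbol table (within the 1-16 kB range)
num_contexts = 400       # assumed number of distinct probability contexts

total_kb = num_contexts * table_bytes // 1024
print(total_kb, "kB for all context tables")  # 3200 kB
```

A few megabytes of static tables is the price for replacing per-bit arithmetic coding operations with per-symbol table lookups.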
It could also work alongside CABAC - speeding up only some parts of the
entropy coding.
Best,
Jarek Duda
--
dr Jarosław Duda
Center for Science of Information, Purdue University, USA
http://www.soihub.org/people.php?id=484
_______________________________________________
x265-devel mailing list
x265-devel at videolan.org
https://mailman.videolan.org/listinfo/x265-devel