[x264-devel] x86: Only enable AVX-512 in 8-bit mode

Henrik Gramner git at videolan.org
Mon May 22 00:04:33 CEST 2017


x264 | branch: master | Henrik Gramner <henrik at gramner.com> | Mon May 15 00:18:36 2017 +0200| [d1fe6fd1c0930d88da90f23f6d5fdb6ceaf6b0a9] | committer: Henrik Gramner

x86: Only enable AVX-512 in 8-bit mode

> http://git.videolan.org/gitweb.cgi/x264.git/?a=commit;h=d1fe6fd1c0930d88da90f23f6d5fdb6ceaf6b0a9
---

 encoder/encoder.c | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/encoder/encoder.c b/encoder/encoder.c
index 4b7dcdac..7c2fe80f 100644
--- a/encoder/encoder.c
+++ b/encoder/encoder.c
@@ -1526,6 +1526,12 @@ x264_t *x264_encoder_open( x264_param_t *param )
     x264_rdo_init();
 
     /* init CPU functions */
+#if (ARCH_X86 || ARCH_X86_64) && HIGH_BIT_DEPTH
+    /* FIXME: Only 8-bit has been optimized for AVX-512 so far. The few AVX-512 functions
+     * enabled in high bit-depth are insignificant and just cause potential issues with
+     * unnecessary thermal throttling and whatnot, so keep it disabled for now. */
+    h->param.cpu &= ~X264_CPU_AVX512;
+#endif
     x264_predict_16x16_init( h->param.cpu, h->predict_16x16 );
     x264_predict_8x8c_init( h->param.cpu, h->predict_8x8c );
     x264_predict_8x16c_init( h->param.cpu, h->predict_8x16c );
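
For context, a minimal sketch (not part of this patch) of how an application could mask
out AVX-512 on its own side before opening an encoder, mirroring what the hunk above does
internally for high bit-depth builds. It assumes the public x264.h API, where
x264_param_default_preset() fills param.cpu with the auto-detected CPU flags; the helper
name and the preset choice are illustrative only.

#include <stddef.h>
#include <x264.h>

/* Hypothetical helper: open an encoder with AVX-512 explicitly masked off. */
static x264_t *open_encoder_without_avx512( int width, int height )
{
    x264_param_t param;
    if( x264_param_default_preset( &param, "medium", NULL ) < 0 )
        return NULL;
    param.i_width  = width;
    param.i_height = height;

    /* Clear the AVX-512 bit from the detected CPU flags, analogous to
     * h->param.cpu &= ~X264_CPU_AVX512; in the patch above. */
    param.cpu &= ~X264_CPU_AVX512;

    return x264_encoder_open( &param );
}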


