[vlc-devel] [PATCH] opengl: interop: drain error in texture test

Alexandre Janniaux ajanni at videolabs.io
Thu Dec 24 09:17:34 UTC 2020

The code is used to detect whether we can allocate the texture, and is
especially designed to handle Apple hardware, which is supposed to
support 16-bit textures but actually fails during the TexImage2D
allocation call on most Apple OpenGL/ES implementations.

Because it was stacking OpenGL errors, the GL_ASSERT_NOERROR macro would
trigger when enabled, whereas the errors should instead have been
silenced and reported as an interop failure.

Drain the errors from the OpenGL error stack and return a failure. As a
sanity check, also assert that there were no errors before this
function, so that debug builds don't silently drain errors from
previous calls.
---
 modules/video_output/opengl/interop.c | 9 +++++++++
 1 file changed, 9 insertions(+)

diff --git a/modules/video_output/opengl/interop.c b/modules/video_output/opengl/interop.c
index 9abb2f802f..d9b74441ff 100644
--- a/modules/video_output/opengl/interop.c
+++ b/modules/video_output/opengl/interop.c
@@ -163,6 +163,7 @@ vlc_gl_interop_DeleteTextures(const struct vlc_gl_interop *interop,
 static int GetTexFormatSize(const opengl_vtable_t *vt, int target,
                             int tex_format, int tex_internal, int tex_type)
 {
     if (!vt->GetTexLevelParameteriv)
         return -1;
@@ -192,6 +193,14 @@ static int GetTexFormatSize(const opengl_vtable_t *vt, int target,
     vt->GetTexLevelParameteriv(target, 0, tex_param_size, &size);
     vt->DeleteTextures(1, &texture);
+    bool has_error = false;
+    while (vt->GetError() != GL_NO_ERROR)
+        has_error = true;
+    if (has_error)
+        return -1;
     return size > 0 ? size * mul : size;