[vlc-commits] opengl: fixed playback of 10bit content on Macs with OpenGL 1.4 drivers, notably using the GMA 950 chipset (close #5973)
Felix Paul Kühne
git at videolan.org
Sat Jun 9 11:51:38 CEST 2012
vlc | branch: master | Felix Paul Kühne <fkuehne at videolan.org> | Sat Jun 9 11:50:48 2012 +0200| [adcc3b14a16c712af51bfc9b7940921a9e6b3468] | committer: Felix Paul Kühne
opengl: fixed playback of 10bit content on Macs with OpenGL 1.4 drivers, notably using the GMA 950 chipset (close #5973)
We do so by forcibly disabling the 16-bit shaders, which the chipset claims to support but actually doesn't.
> http://git.videolan.org/gitweb.cgi/vlc.git/?a=commit;h=adcc3b14a16c712af51bfc9b7940921a9e6b3468
---
modules/video_output/opengl.c | 18 ++++++++++++++++++
1 file changed, 18 insertions(+)
diff --git a/modules/video_output/opengl.c b/modules/video_output/opengl.c
index 82356dd..f7732b4 100644
--- a/modules/video_output/opengl.c
+++ b/modules/video_output/opengl.c
@@ -130,6 +130,24 @@ static inline int GetAlignedSize(unsigned size)
 static bool IsLuminance16Supported(int target)
 {
+#if defined(MACOS_OPENGL)
+    /* OpenGL 1.x on OS X does _not_ support 16-bit shaders, but pretends to.
+     * That's why we unconditionally return false here, even though the actual
+     * code below would return true.
+     * This fixes playback of 10-bit content on the Intel GMA 950 chipset, which
+     * is the only "GPU" supported by 10.6 and 10.7 with just an OpenGL 1.4 driver.
+     *
+     * Presumably, this also improves playback on the GMA 3100, GeForce FX 5200,
+     * GeForce4 Ti, GeForce3, GeForce2 MX/4 MX and the Radeon 8500 when
+     * running OS X 10.5. */
+    const GLubyte *p_glversion;
+    float f_glversion;
+    p_glversion = glGetString(GL_VERSION);
+    sscanf((char *)p_glversion, "%f", &f_glversion);
+    if (f_glversion < 2)
+        return false;
+#endif
+
     GLuint texture;
     glGenTextures(1, &texture);
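
The hunk ends at the start of the existing probe, so the part being short-circuited is not shown. As a rough sketch of how such a GL_LUMINANCE16 capability check typically continues (an illustration, not necessarily the exact code in opengl.c): request a 16-bit luminance texture, then ask the driver how many luminance bits it actually stored. A driver that silently falls back to 8-bit storage will report 8 even though the allocation "succeeded", which is exactly the kind of lie the GL-version guard above works around.

    /* Illustrative sketch only; assumes 'target' and 'texture' from above. */
    glBindTexture(target, texture);

    /* Request a 16-bit luminance texture; the driver may silently downgrade. */
    glTexImage2D(target, 0, GL_LUMINANCE16, 64, 64, 0,
                 GL_LUMINANCE, GL_UNSIGNED_SHORT, NULL);

    /* Query the luminance bit depth the driver actually granted. */
    GLint size = 0;
    glGetTexLevelParameteriv(target, 0, GL_TEXTURE_LUMINANCE_SIZE, &size);

    glDeleteTextures(1, &texture);
    return size == 16;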