[vlc-devel] Why/how VLC is so fast? Especially with MJPEG streams.
eric.beuque at gmail.com
Mon Jun 4 16:33:39 CEST 2012
I'm developing a Qt application which displays an MJPEG stream from an IP
camera. I thought my application was fast until I compared it with VLC.
I'm reading the same stream in my application and in VLC (with the
"always fit to window" option unchecked), so I can compare the rendering at
the same resolution (1280x800).
On my computer (Linux 3.2, LMDE 64-bit, Intel Core i5): my application takes
25% of CPU time, VLC only 8-9%.
On Windows 7 32-bit (inside a VirtualBox VM): my application takes
90-100% of CPU, VLC only 35-40%.
I'm using the libjpeg-turbo library to decode each JPEG image to a pixmap.
This library uses SIMD processor instructions to speed up decoding and is
2-4x faster than the standard IJG JPEG library. Yet in the CPU time
libjpeg-turbo needs just to decode the image, VLC has already rendered the
complete image on the screen.
So here are my questions:
- How can VLC be more than twice as fast as a heavily optimized library
dedicated to this task?
- How does VLC perform the rendering? Qt has only one GUI thread for
rendering, but I assume VLC does not render the video on this GUI thread,
right? Does VLC draw directly to the X11 or GDI+ output? I'm only using a
QWidget where I paint every QImage I receive, but I suspect that's not the
best way to do it.
- I tried to read the VLC source code and found the MJPEG demux, but I
don't understand where the data goes to be decoded. Could you explain how
the VLC core works globally, and how its threading is organized?
Thank you very much if you can help me understand what is wrong with my
application.