Hi,<br><br>I'm developing a Qt application which displays an MJPEG stream from an IP camera. I thought my application was fast until I compared it with VLC.<br><br>I'm reading the same stream in my application and in VLC (with the "always fit to window" option unchecked), so both render at the same resolution (currently 1280x800).<br>
On my computer (Linux 3.2, LMDE 64-bit, Intel Core i5), my application takes 25% of CPU time, while VLC takes only 8-9%.<br>On Windows 7 32-bit (inside a VirtualBox VM), my application takes 90-100% of CPU, VLC only 35-40%.<br><br>
I'm using the libjpeg-turbo library to decode each JPEG image into a pixmap. This library uses SIMD processor instructions to speed up decoding and is 2-4x faster than the standard IJG JPEG library. Yet in the CPU time libjpeg-turbo needs to decode one image, VLC has already rendered the whole image to the screen.<br>
<br>So here are my questions:<br>- How can VLC be more than twice as fast as a highly optimized library dedicated to this task?<br>- How does VLC perform the rendering? Qt has only one GUI thread for painting, but I suspect VLC does not render the video through its GUI thread, right? Does VLC draw directly to the X11 or GDI+ output? I'm simply painting each QImage I receive onto a QWidget, but I suspect this is not the best way to do it.<br>
- I tried to read the VLC source code and found the MJPEG demuxer, but I don't understand where the data goes to be decoded. Could you explain broadly how the VLC core and its threading work?<br><br>Thank you very much for any help understanding what is wrong with my program.<br>
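For reference, here is a simplified sketch of my current rendering path (class and member names are just illustrative, not my actual code):

```cpp
#include <QWidget>
#include <QImage>
#include <QPainter>

// Each decoded frame is stored and the whole widget is repainted
// by Qt's raster engine on the single GUI thread.
class VideoWidget : public QWidget {
public:
    // Called with a freshly decoded frame (in practice delivered to the
    // GUI thread via a queued signal/slot connection).
    void setFrame(const QImage &frame) {
        m_frame = frame;
        update();  // schedules a full repaint on the GUI thread
    }

protected:
    void paintEvent(QPaintEvent *) override {
        QPainter p(this);
        p.drawImage(0, 0, m_frame);  // software blit of the full 1280x800 image
    }

private:
    QImage m_frame;
};
```

So every frame goes through Qt's software paint engine on the GUI thread, which is why I wonder whether VLC's direct X11/GDI output explains the CPU difference.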