[vlc-devel] [PATCH] swscale: Fix pixel format for VLC_CODEC_RGBA

Florian Albrechtskirchinger falbrechtskirchinger at gmail.com
Thu Dec 20 20:59:12 CET 2012


On 12/20/2012 07:34 PM, Rémi Denis-Courmont wrote:
>>> RGBA is: R at byte 0, G at byte 1, B at byte 2, A at byte 3.
>>
>> To reiterate: On my system the screen capture chroma is either
>> RGB32 or RGBA,
>
> X.Org uses BGRA on little endian systems, rather than RGBA. At least
> both the ATI and NVIDIA drivers do:
>
> visual:
>   visual id:    0x76
>   class:    TrueColor
>   depth:    32 planes
>   available colormap entries:    256 per subfield
>   red, green, blue masks:    0xff0000, 0xff00, 0xff
>   significant bits in color specification:    8 bits
I was referring to VLC_CODEC_RGB32 and VLC_CODEC_RGBA. Depending on what 
kind of window (transparent or not) I select, the XCB capture code picks 
VLC_CODEC_RGB32 or VLC_CODEC_RGBA.

>> depending on the window. In both cases Blue is at byte 0 and Red at
>> byte 2. By your definition the chromas should be BGR32 and BGRA
>> respectively.
>
> Yes and no. For historical reasons, VLC_CODEC_RGB32 (RV32) represents
> all 24-in-32-bit RGB pixel formats. To specify the colour position,
> i_(r|g|b)mask are used.
So regardless of the RGBA issue, shouldn't the XCB capture code set 
those masks then?
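
Something along these lines is what I have in mind (untested sketch; the 
helper name is made up, the field names are those of video_format_t from 
vlc_es.h, and the mask values come straight from the xdpyinfo output above):

#include <vlc_common.h>
#include <vlc_es.h>

/* Untested sketch: have the XCB capture fill in the component masks
 * reported by the X visual instead of leaving them at zero. */
static void SetupRgbMasks(video_format_t *fmt)
{
    fmt->i_chroma = VLC_CODEC_RGB32;
    fmt->i_rmask  = 0x00ff0000;   /* red   in byte 2 (little endian) */
    fmt->i_gmask  = 0x0000ff00;   /* green in byte 1 */
    fmt->i_bmask  = 0x000000ff;   /* blue  in byte 0 */
    video_format_FixRgb(fmt);     /* derive shifts/bit counts from the masks */
}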

> Retrospectively, I believe that was a bad idea (not mine!); I think
> the libav approach is better/clearer.
The reason I personally don't like masks here is that they can describe 
arbitrary layouts, even though arbitrary formats clearly aren't handled.
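
For comparison, the libav way looks roughly like this (illustration only: 
the enum names are libav's, from libavutil/pixfmt.h; the helper is made up, 
and VLC_CODEC_BGRA would be the hypothetical new codec you mention below):

#include <vlc_common.h>
#include <libavutil/pixfmt.h>

/* Illustration only: in libav every 32-bit RGB byte order is its own
 * pixel format, so no masks are needed to pin down the layout. */
static enum PixelFormat ChromaToPixFmt(vlc_fourcc_t chroma)
{
    switch (chroma)
    {
        case VLC_CODEC_RGBA:        /* R, G, B, A in byte order */
            return PIX_FMT_RGBA;
        /* A hypothetical VLC_CODEC_BGRA would map to PIX_FMT_BGRA,
         * which is what the X.Org visual above actually delivers. */
        default:
            return PIX_FMT_NONE;
    }
}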

> VLC_CODEC_RGBA represents *only* RGBA.
What about adapting VLC_CODEC_RGBA to use masks? Whatever the solution, 
shouldn't it at least be consistent?

> For X11 capture, a new codec needs to be defined. Alternatively, the
> alpha mask could be discarded completely.
Well, discarding alpha would certainly be a simple solution, and I don't 
see what use alpha is in this scenario anyway.

Options, options, ...

Florian



