[vlc-devel] [PATCH] swscale: Fix pixel format for VLC_CODEC_RGBA

Rémi Denis-Courmont remi at remlab.net
Thu Dec 20 21:14:11 CET 2012


On Thursday, 20 December 2012 21:59:12, Florian Albrechtskirchinger wrote:
> > On 12/20/2012 07:34 PM, Rémi Denis-Courmont wrote:
> >>> RGBA is: R at byte 0, G at byte 1, B at byte 2, A at byte 3.
> >>>
> >> To reiterate: On my system the screen capture chroma is either
> >> RGB32 or RGBA,
> >
> > X.Org uses BGRA on little endian systems, rather than RGBA. At least
> > both the ATI and NVIDIA drivers do:
> >
> >   visual:
> >     visual id: 0x76
> >     class: TrueColor
> >     depth: 32 planes
> >     available colormap entries: 256 per subfield
> >     red, green, blue masks: 0xff0000, 0xff00, 0xff
> >     significant bits in color specification: 8 bits
> 
> I was referring to VLC_CODEC_RGB32 and VLC_CODEC_RGBA. Depending on what
> kind of window (transparent or not) I select, the XCB capture code picks
> VLC_CODEC_RGB32 or VLC_CODEC_RGBA.

That's a bug in the XCB screen capture, not in swscale.
I don't think ARGB X11 capture was ever tested.
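
For illustration, here is a minimal standalone sketch (not VLC code, and not
part of the patch) of why the masks quoted above, red 0xff0000, green 0xff00,
blue 0xff, amount to B, G, R, X byte order on a little-endian host:

  /* Hypothetical example: a 32-bit pixel whose channels follow the visual's
   * masks, inspected byte by byte. */
  #include <stdint.h>
  #include <stdio.h>

  int main(void)
  {
      uint32_t pixel = 0x00112233;  /* R = 0x11, G = 0x22, B = 0x33 per those masks */
      const uint8_t *b = (const uint8_t *)&pixel;

      /* On little endian this prints "33 22 11 00": B, G, R, X - i.e. BGRA order. */
      printf("%02x %02x %02x %02x\n", b[0], b[1], b[2], b[3]);
      return 0;
  }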

> >> depending on the window. In both cases Blue is at byte 0 and Red at
> >> byte 2. By your definition the chromas should be BGR32 and BGRA
> >> respectively.
> >
> > Yes and no. For historical reasons, VLC_CODEC_RGB32 (RV32) represents
> > all 24-in-32-bit RGB pixel formats. To specify the colour position,
> > i_(r|g|b)mask are used.
> 
> So regardless of the RGBA issue, shouldn't the XCB capture code set
> those masks then?

No. Masks are not used for RGBA. That would cause more bugs and more useless 
code complexity. It's already bad enough with RGB.
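
To make the convention concrete, here is a rough sketch (assuming the current
video_format_t fields and the video_format_FixRgb() helper; this is not the
XCB capture code) of how RV32 carries its layout in the masks while RGBA has a
fixed byte order and uses none:

  /* Sketch only: mask convention for VLC_CODEC_RGB32 versus the fixed
   * byte order of VLC_CODEC_RGBA. */
  #include <vlc_common.h>
  #include <vlc_es.h>

  static void describe_rv32(video_format_t *fmt)
  {
      /* RV32 covers every 24-in-32-bit RGB layout; the channel positions
       * travel in the masks. */
      fmt->i_chroma = VLC_CODEC_RGB32;
      fmt->i_rmask = 0x00ff0000;
      fmt->i_gmask = 0x0000ff00;
      fmt->i_bmask = 0x000000ff;
      video_format_FixRgb(fmt);
  }

  static void describe_rgba(video_format_t *fmt)
  {
      /* RGBA means exactly R, G, B, A at bytes 0..3; the masks are ignored. */
      fmt->i_chroma = VLC_CODEC_RGBA;
  }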

> > Retrospectively, I believe that was a bad idea (not mine!); I think
> > the libav approach is better/clearer.
> 
> The reason I personally don't like masks here is that they can be
> arbitrary when arbitrary formats clearly aren't handled.

Sure.

> > VLC_CODEC_RGBA represents *only* RGBA.
> 
> What about adapting VLC_CODEC_RGBA to use masks?

No, please.

> Whatever the solution, shouldn't it at least be consistent?

I would love to see a patch separating RGB and BGR codecs, and then 
eliminating the masks from video formats. But I am not volunteering to write 
it.

> > For X11 capture, a new codec needs to be defined. Alternatively, the
> > alpha mask could be discarded completely.
> 
> Well, discarding alpha would certainly be a simple solution, and I don't
> see what use alpha is in this scenario anyway.
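
For what it's worth, a hypothetical sketch of the "discard alpha" route (names
made up, not the patch under discussion): map any 32-bit TrueColor visual to
RV32 and take the masks straight from the visual, so the alpha byte is simply
ignored:

  /* Hypothetical illustration only: pick RV32 regardless of whether the
   * visual carries an alpha channel. */
  #include <xcb/xcb.h>
  #include <vlc_common.h>
  #include <vlc_es.h>

  static void visual_to_fmt(const xcb_visualtype_t *vt, video_format_t *fmt)
  {
      fmt->i_chroma = VLC_CODEC_RGB32;  /* alpha discarded */
      fmt->i_rmask = vt->red_mask;      /* 0x00ff0000 in the visual quoted above */
      fmt->i_gmask = vt->green_mask;    /* 0x0000ff00 */
      fmt->i_bmask = vt->blue_mask;     /* 0x000000ff */
      video_format_FixRgb(fmt);
  }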

-- 
Rémi Denis-Courmont
http://www.remlab.net/


