[vlc-devel] Future of the update mechanism
remi at remlab.net
Thu Jul 30 11:36:05 CEST 2009
On Thu, 30 Jul 2009 10:18:01 +0200, jpd at videolan.org wrote:
> On Thu, Jul 30, 2009 at 09:19:52AM +0200, Rémi Denis-Courmont wrote:
>> On Thu, 30 Jul 2009 02:05:02 +0200, jpd at videolan.org wrote:
>> > I'm not aware that ssl is much more computationally expensive than
>> > gpg
>> With OpenPGP, the server does NOT perform ANY cryptographic operation. It
>> serves a file (over HTTP) that happens to contain a digital signature...
>> And you can replicate that file to any number of mirrors without
> Right. I wasn't thinking server side. You are right that trusting
> anything merely because it came in over a secured socket is fairly
> weak and costly for our servers, and so foregoing that in favour
> of one signature check makes sense, provided the keys used are not
> But you could do that with gpg and openssl both. And x509 certificates,
> even self-signed ones, can provide better protection there, but are more
> cumbersome to set up.
I don't disagree. We can use any certificate format: OpenPGP, X.509, XML or
proprietary. But we can only use an off-line protocol: S/MIME, OpenPGP or
whatever, but not TLS.
When we manage to keep the wiki and fora up during release periods, then we
can think about TLS...
>> And, depending on the SSL framework, we may have to pay for an
>> X.509 certificate eternally.
> That's not an argument against using openssl in general. As noted it's
> entirely possible to create our own CA cert and use that. Anyone with
> openssl can do it, once you figure out how.
OpenSSL, as a library, is not GPL-compatible unless it's provided by the
operating system. If we were to switch from OpenPGP to X.509, we still
couldn't use OpenSSL, at least not on Windows (I don't know if OS X has it
built in). Besides, since we depend on gcrypt anyway, it would be highly
suboptimal to add OpenSSL on top.
> It is, however, an argument against using frameworks that require
> paid-for certificates, especially since no issuers so far have managed
> to satisfactorily show they vet their issuees properly. In fact, it has
> been shown time and again that they *do not* do any such thing.
> Thus, scam. Like I said.
> But that leaves us with a minor problem: How do you detect the key
> is compromised? gpgverify will just take any supplied key. gpg will
> complain that a key isn't signed but most users won't have our signing
> key signed on their keyring so that means suppressing the warnings (back
> to gpgverify). That means that the most you'll get out of gpg outside
> of close-knit gpg-using communities is ``this signature successfully
> verifies with this key'', meaning that if the key can be trusted, the
> verified data can be trusted.
> But it gives exactly *no* guarantees about the trustworthiness of the
Hey, that holds as long as you don't lose the key. Otherwise you need to
implement fetching a CRL at regular intervals (ahem, yeah right...). The
point is, an OpenPGP key *can* be protected better than an X.509 HTTP server
certificate. That does not mean it *will* be better protected.
> Using x509 certificates (including on the client side only) at least
> allows us to force checking the key against a (our) CA. That's still no
> panacea, but if we manage to make use of that it is a better proposition
> for our use case. If we don't manage our key integrity, it suddenly
> matters much less whether we sign anything or not. I am not quite prepared to
> argue it would be better to not sign anything in that case, but it is
> conceivable that would turn out to be the case. Do not expect magic from the
> mere fact that we're signing; if you do, you believe in snake oil.
Well, we really don't care about the trust model (X.509 root CA or OpenPGP
web of trust) because we don't need a trust model. We just include the
public key into VLC, and check the signature against it.
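To make the model concrete: the sketch below mimics "embed the public key in the client, verify a detached signature offline" using textbook RSA in Python. This is purely illustrative; VLC actually uses OpenPGP signatures checked via gcrypt, and the key sizes, hash reduction, and lack of padding here are toy simplifications, not the real mechanism.

```python
# Illustrative sketch only: textbook RSA signature verification against an
# embedded public key. The primes and exponents below are demo values, far
# too small for real use, and real schemes pad the hash (e.g. PKCS#1).
import hashlib

P, Q = 10007, 10009                 # toy primes; real keys are 2048+ bits
N = P * Q                           # public modulus
E = 17                              # public exponent, shipped in the client
D = pow(E, -1, (P - 1) * (Q - 1))   # private exponent, held by the signer only

def digest(data: bytes) -> int:
    # Toy stand-in for proper hash padding: SHA-256 reduced modulo N.
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % N

def sign(data: bytes) -> int:
    # Done once, offline, by whoever holds the private key.
    return pow(digest(data), D, N)

def verify(data: bytes, signature: int) -> bool:
    # Done by every client with only the embedded public key (E, N):
    # no server-side crypto, so the signed file can sit on any HTTP mirror.
    return pow(signature, E, N) == digest(data)

update = b"vlc-1.0.1 update status file"
sig = sign(update)
assert verify(update, sig)
assert not verify(b"tampered payload", sig)
```

Note how this matches the argument above: the server only serves static bytes, mirrors need no trust, and no trust model (CA hierarchy or web of trust) is consulted because the one valid key is hard-coded in the client.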