Comment by woodruffw

5 hours ago

> It's good that PyPI signs whatever is uploaded to PyPI using PyPI's key now.

You might have misunderstood -- PyPI doesn't sign anything with PEP 740. It accepts attestations during upload, which are equivalent to bare PGP signatures. The big difference between the old PGP signature support and PEP 740 is that PyPI actually verifies the attestations on uploads, meaning that everything that gets stored and re-served by PyPI goes through a "can this actually be verified" sanity check first.

I'll try to answer the others piecewise:

1. Yes. You can use twine's `--attestations` flag to upload any attestations associated with distributions. To actually generate those attestations you'll need to use GitHub Actions or another OIDC provider currently supported as a Trusted Publisher on PyPI; the shortcut for doing that is to enable `attestations: true` while uploading with `gh-action-pypi-publish`. That's the happy path that we expect most users to take.

2. Not yet; the challenge there is primarily technical (`pip` can only vendor pure Python things, and most of PyCA cryptography has native dependencies). We're working on different workarounds for this; once they're ready `pip` will know which identity - not key - to trust based on each project's Trusted Publisher configuration.

3. Not yet, but this is needed to make downstream verification in a TOFU setting tractable. The current plan is to use the PEP 751 lockfile format for this, once it's finished.

4. That would be up to each of those tools to implement. They can follow PEP 740 to do it if they'd like.
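The happy path described in item 1 looks roughly like the following GitHub Actions job (project layout, environment name, and trigger are placeholders; the Trusted Publisher must already be configured on PyPI for this workflow):

```yaml
# .github/workflows/release.yml (illustrative sketch)
on:
  release:
    types: [published]

jobs:
  pypi-publish:
    runs-on: ubuntu-latest
    environment: pypi            # placeholder environment name
    permissions:
      id-token: write            # required for Trusted Publishing (OIDC)
    steps:
      - uses: actions/download-artifact@v4
        with:
          name: dist             # assumes an earlier build job uploaded sdists/wheels here
          path: dist/
      - uses: pypa/gh-action-pypi-publish@release/v1
        with:
          attestations: true     # generate and upload PEP 740 attestations
```

With this setup there is no key material to manage: the workflow's OIDC identity is the signing identity.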

I don't really know how to respond to the rest of the comment, sorry -- I find it a little hard to parse the connections you're drawing between PGP, DIDs, etc. The bottom line for PEP 740 is that we've intentionally constrained the initial valid signing identities to ones that can be substantiated via Trusted Publishing, since those can also be turned into verifiable Sigstore identities.

So PyPI acts as a keyserver -- basically a CSR signer for sub-CA wildcard package-signing certs -- and as the trusted authority for the package+key mapping; Sigstore acts as the signature server; and both are centralized?

And does something have to call cosign (?) to generate the value passed to `twine --attestations`?

FWIU, Blockcerts with DID identities is the W3C way to do software supply chain security like what SLSA.dev describes.

It's now also possible to upload packages and native containers to OCI container registries, which have supported artifact signatures with TUF since Docker Notary; but they don't yet support JSON-LD/YAML-LD that would simply join with OSV, OpenSSF, and SBOM Linked Data on registered namespace URIs (GitHub, PyPI, conda-forge, deb-src).

GitHub supports requiring GPG-signed commits.

Git commits are precedent to what gets merged and later released.
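Requiring signed commits is a branch-protection setting on GitHub's side; on the committer's side, signing every commit can be enabled with plain git config (the key ID below is a placeholder):

```shell
# Tell git which GPG key to sign with (placeholder key ID)
git config --global user.signingkey 0xDEADBEEF
# Sign all commits by default, so no -S flag is needed per commit
git config --global commit.gpgsign true
# Confirm the setting
git config --global commit.gpgsign   # prints: true
```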

A rough chronology of specs around these problems: {SSH, GPG, TLS w/ CA cert bundles, WebID, OpenID, HOTP/TOTP, Bitcoin, WebID-TLS, OIDC (OpenID Connect; TLS, HTTP, JSON, OAuth 2.0, JWT), TUF, Uptane, CT logs, WebAuthn, W3C DID, Blockcerts, SOLID-OIDC, Shamir backup, transferable passkeys}

For PyPI: PyPI.org, TLS, OIDC (OpenID Connect), twine, pip, cosign, and Sigstore.dev.

Sigstore's Rekor is a centralized Merkle-tree transparency log, like google/trillian, which centrally hosts Certificate Transparency logs of x509 certificate grants and revocations by Certificate Authorities.
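The core structure shared by Rekor and CT logs can be sketched in a few lines: entries become leaf hashes, pairs of hashes are folded upward, and the single root commits to every entry. This is an illustrative sketch using the RFC 6962 leaf/node prefixes, not Rekor's exact entry encoding or its handling of unbalanced trees:

```python
import hashlib

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves: list[bytes]) -> bytes:
    """Compute a Merkle root over the given log entries.

    Uses RFC 6962-style domain separation: leaf hash = H(0x00 || leaf),
    interior node = H(0x01 || left || right). Odd levels are padded by
    duplicating the last node (a simplification; RFC 6962 instead promotes
    the unpaired subtree).
    """
    level = [sha256(b"\x00" + leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        level = [
            sha256(b"\x01" + level[i] + level[i + 1])
            for i in range(0, len(level), 2)
        ]
    return level[0]

# Example log entries (placeholders for real certificate/attestation records)
entries = [b"cert-issued", b"cert-revoked", b"attestation"]
root = merkle_root(entries)
print(root.hex())
```

Publishing only the root lets anyone detect after the fact whether an entry was silently altered or removed, which is the append-only guarantee these logs provide.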

W3C DID Decentralized Identifiers [did-core] > 9.8 Verification Method Revocation: https://www.w3.org/TR/did-core/#verification-method-revocati... :

> If a verification method is no longer exclusively accessible to the controller or parties trusted to act on behalf of the controller, it is expected to be revoked immediately to reduce the risk of compromises such as masquerading, theft, and fraud.