Comment by woodruffw

1 year ago

Your understanding is a little off: we worked on integrating TUF into PyPI for a while, but ran into some scalability/distributability issues with the reference implementation. It's been a few years, but my recollection was that the reference implementation assumed a lot of local filesystem state, which wasn't compatible with Warehouse's deployment (no local state other than tempfiles, everything in object storage).

To the best of my knowledge, the current state of TUF for PyPI is that we performed a trusted setup ceremony for the TUF roots[1], but that no signatures were ever produced from those roots.

For the time being, we're looking at solutions that have less operational overhead: Sigstore[2] is the main one, and it uses TUF under the hood to provide the root of trust.

[1]: https://www.youtube.com/watch?v=jjAq7S49eow&t=1s

[2]: https://www.sigstore.dev/

python-tuf [1] did assume back then that everything was manipulated on the local filesystem, yes, but a lot has changed since: you can now read/write metadata entirely in memory, and integrate with different key-management backends such as Google Cloud KMS.
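To make the "entirely in memory" point concrete, here is a deliberately simplified sketch. The function names are made up for illustration and are not python-tuf's actual Metadata API; the point is only that metadata can be parsed, modified, and re-serialized as bytes, with no local filesystem state, which fits an object-storage-only deployment like Warehouse's:

```python
import json

# Hypothetical sketch (not python-tuf's real API): metadata flows
# bytes -> dict -> bytes, never touching the local filesystem.
def load_metadata(raw: bytes) -> dict:
    return json.loads(raw.decode("utf-8"))

def dump_metadata(meta: dict) -> bytes:
    # stable serialization so repeated dumps produce identical bytes
    return json.dumps(meta, sort_keys=True, separators=(",", ":")).encode("utf-8")

raw = b'{"signatures": [], "signed": {"_type": "root", "version": 1}}'
meta = load_metadata(raw)
meta["signed"]["version"] += 1   # bump the version purely in memory
blob = dump_metadata(meta)       # bytes ready to PUT to object storage
```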

More importantly, I should point out that while Sigstore's Fulcio will help with key management (think of it as a managed GPG, if you will), it will not help with securely mapping software projects to their respective OIDC identities. Without that mapping, how will verifiers know, in a secure yet scalable way, which Fulcio keys _should_ be used? We would be back to the GPG PKI problem and its web of trust.

This is where PEP 480 [2] can help: you can use TUF (especially after TAP 18 [3]) to do this secure mapping. Marina Moore has also written a proposal called Transparent TUF [4] for having Sigstore manage such a TUF repository for registries like PyPI. This is not to mention the other benefits that TUF can give you (e.g., protection from freeze, rollback, and mix-and-match attacks). We should definitely continue discussing this sometime.

[1] https://github.com/theupdateframework/python-tuf

[2] https://peps.python.org/pep-0480/

[3] https://github.com/theupdateframework/taps/blob/master/tap18...

[4] https://docs.google.com/document/d/1WPOXLMV1ASQryTRZJbdg3wWR...
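The project-to-identity mapping described above can be sketched as follows. In a PEP 480-style design this table would live in signed TUF targets metadata rather than in code; every project name and identity below is made up for illustration:

```python
# Hypothetical, hardcoded stand-in for a signed TUF delegation that
# maps each project to the OIDC identity allowed to sign its releases.
PROJECT_IDENTITIES = {
    "requests": {
        "identity": "https://github.com/psf/requests/.github/workflows/release.yml@refs/heads/main",
        "issuer": "https://token.actions.githubusercontent.com",
    },
}

def identity_allowed(project: str, cert_identity: str, cert_issuer: str) -> bool:
    """Return True only if the signing certificate's identity and issuer
    exactly match the identity registered for this project."""
    entry = PROJECT_IDENTITIES.get(project)
    if entry is None:
        return False  # unknown project: trust nothing
    return (cert_identity, cert_issuer) == (entry["identity"], entry["issuer"])
```

The design point is that verifiers pin an *identity* per project, not a raw key, so key rotation (which Sigstore does constantly) never invalidates the trust policy.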

Where does the user specify the cryptographic key to sign a package before uploading?

Server-side TUF keys are implemented, from what I can see, but client-side publisher signatures (like the ones Windows .exe files have shown under the file's "Properties" dialog for many years now) are not yet implemented.

Hopefully I'm just out of date.

  • > Where does the user specify the cryptographic key to sign a package before uploading?

    With Sigstore, they perform an OIDC flow against an identity provider: that results in a verifiable identity credential, which is then bound to a short-lived (~15m) signing key that produces the signatures. That signing key is simultaneously attested through a traditional X.509 PKI (it gets a certificate, that certificate is uploaded to an append-only transparency log, etc.).

    So: in the normal flow, the user never directly specifies the cryptographic key -- the scheme ensures that they have a strong, ephemeral one generated for them on the spot (and only on their client device, in RAM). That key gets bound to their long-lived identity, so verifiers don't need to know which key they're verifying; they only need to determine whether they trust the identity itself (which can be an email address, a GitHub repository, etc.).
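The flow above can be sketched conceptually. This is not Sigstore code: HMAC stands in for a real asymmetric signature, and a plain dict stands in for the Fulcio-issued X.509 certificate, purely to show the shape of "ephemeral key, bound to an identity, then discarded":

```python
import hashlib
import hmac
import secrets

def sign_release(identity: str, artifact: bytes):
    """Conceptual sketch: generate an ephemeral key in RAM, sign once,
    bind the key to an identity, and never persist the key."""
    key = secrets.token_bytes(32)                        # ephemeral, RAM-only
    sig = hmac.new(key, artifact, hashlib.sha256).digest()
    cert = {"identity": identity, "key": key}            # stand-in for Fulcio cert
    return sig, cert                                     # caller drops the key

def verify(artifact: bytes, sig: bytes, cert: dict, trusted_identity: str) -> bool:
    if cert["identity"] != trusted_identity:             # trust the identity...
        return False
    expected = hmac.new(cert["key"], artifact, hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)            # ...not a pinned key
```

Note how the verifier's input is an identity string, not a key: the key material only travels inside the certificate, exactly because it is short-lived and unmemorable by design.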

    • What command(s) do I pass to pip/twine/build_pyproject.toml to build, upload, and install a package with a key/cert that users should trust for e.g. psf/requests?