The Prompt() Function: Use the Power of LLMs with SQL

6 hours ago (motherduck.com)

  FROM hn.hacker_news
  LIMIT 100

"Oops I forgot the limit clause and now owe MotherDuck and OpenAI $93 billion."

Interesting -- is there any impact from LLM outputs not being deterministic?

  • SQL functions are allowed to be non-deterministic. E.g. the SQL:2003 grammar defines a DETERMINISTIC | NOT DETERMINISTIC characteristic for CREATE FUNCTION, and PostgreSQL has the IMMUTABLE | STABLE | VOLATILE volatility categories.
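
    As a rough illustration (my own sketch, not from the thread): in PostgreSQL a wrapper around a non-deterministic call is declared VOLATILE (which is also the default), telling the planner to re-evaluate it for every row rather than reuse one result:

      -- VOLATILE: the result can change on every call, so the planner must
      -- not cache or constant-fold it (IMMUTABLE/STABLE would allow that).
      CREATE FUNCTION roll_die() RETURNS integer
      LANGUAGE sql VOLATILE
      AS $$ SELECT 1 + floor(random() * 6)::integer $$;

      -- Each row gets its own evaluation, so the three rolls can all differ:
      SELECT roll_die() FROM generate_series(1, 3);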

  • Aren't LLM outputs deterministic given the same inputs?

    • Not at all. Even the ones that provide a "seed" parameter generally don't guarantee that you'll get exactly the same result back.

      My understanding is that this is mainly down to how floating point arithmetic works. Any performant LLM will be executing a whole bunch of floating point arithmetic in parallel (usually on a GPU), and floating point addition isn't associative - so the order in which the partial results get combined can very slightly affect the result.
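
      To make the non-associativity concrete (my own example, not from the comment), e.g. in DuckDB:

        -- Same three values, different grouping, different result
        -- (IEEE 754 doubles); a parallel reduction can land on either one.
        SELECT (0.1::DOUBLE + 0.2::DOUBLE) + 0.3::DOUBLE AS grouped_left,
               0.1::DOUBLE + (0.2::DOUBLE + 0.3::DOUBLE) AS grouped_right;
        -- grouped_left  = 0.6000000000000001
        -- grouped_right = 0.6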

    • They are not, necessarily - especially with commercial providers, who may change models, finetunes, privacy layers, and all kinds of other things around the foundation model without notice.