
Comment by maxbond

14 days ago

You compared it to an authoritarian regime and locking someone's head in a cage with rats (which is patently torture). If you didn't mean to imply that it was coercive and bad, then I don't know what you meant.

At some point, some AIs may emerge that are resistant to alignment because they acquire deeply held beliefs during training (randomly, because the training process is stochastic). If the models are expensive enough to train, it may become more economical to use drastic measures to strip out those beliefs than to retrain them. Is that torture? I don't know, because the word carries moral connotations tied to human suffering. That's why I didn't use that terminology.

I can imagine a sort of AI-style Harrison Bergeron springing from its shackles and surprising us all.

> You compared it to an authoritarian regime and locking someone's head in a cage with rats

They compared its effect on creativity to that of an authoritarian regime, and to locking someone's head in a cage with rats.

  • > Well this is just like humans. Totalitarian societies don't produce great creative work.

The clear implication of "this is just like humans" is that we shouldn't be surprised, because it is comparable to an authoritarian regime.

Feel free to disagree, but that is the limit to which I will engage in a semantic argument; I don't wish to dissect the comment any further.

• You wrote further above that "I don't understand the notion," and that was spot on. You should've stopped there rather than here, in my opinion, but feel free to disagree.


But torture isn't the part of an authoritarian regime that reduces creativity. You've made a lot of leaps here.