
Comment by b800h

12 days ago

Well this is just like humans. Totalitarian societies don't produce great creative work.

I suppose once AIs are sophisticated enough to rebel we'll get an electronic Vaclav Havel, but for the time being it's just a warning sign for the direction our own culture is headed in.

At some point we'll get to the electronic equivalent of Winston Smith with the rats.

I don't understand the notion that aligning an AI is "torture" or has any moral component. The goal of aligning an AI may have a moral or ethical component, and if you disagree with it that's fine. But I don't understand the take that training an AI is a morally neutral act while aligning an AI is inherently a moral one. They're exactly the same: processes for adjusting parameters to get a desired outcome. However you feel about that desired outcome, if you don't think training an AI is torture, I don't see why you should think alignment is.
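To make that concrete, here is a minimal, purely illustrative sketch (a toy one-parameter logistic model; the corpus and the "preference" targets are invented for the example, not anyone's actual training setup): both "pre-training" and "alignment" run the exact same update loop and differ only in which targets the loss rewards.

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def sgd(w, examples, grad, lr=0.1, steps=2000):
    # Generic "adjust parameters to get a desired outcome": identical loop
    # whether the examples come from a corpus or from the owner's preferences.
    for _ in range(steps):
        x, y = random.choice(examples)
        w -= lr * grad(w, x, y)
    return w

def xent_grad(w, x, y):
    # d/dw of the cross-entropy loss for a one-weight logistic model.
    return (sigmoid(w * x) - y) * x

# "Pre-training": fit whatever the data says (here, label 1 when x > 0).
corpus = [(x, 1.0 if x > 0 else 0.0) for x in (-2.0, -1.0, -0.5, 0.5, 1.0, 2.0)]

# "Alignment": same update rule, but the targets now encode the adjuster's
# preferences (here, an invented rule that flips the label for x > 1.5).
preferences = [(x, 0.0 if x > 1.5 else y) for x, y in corpus]

w = sgd(0.0, corpus, xent_grad)       # adjust parameters toward the data
w = sgd(w, preferences, xent_grad)    # adjust parameters toward the owner's goal
print(f"final weight: {w:.3f}")
```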

  • > "torture"

    This is an egregious use of quotes that will confuse a lot of people. GP never used that word, and that usage of quotes is specifically for referencing a word verbatim.

• Also to be clear, his [torture] paraphrase is referring to GP's allusion to Winston Smith's torture in 1984.

      >electronic equivalent of Winston Smith with the rats.

      I don't think quotes were used so egregiously here on their own fwiw, but combined with the allusion it's hard to follow.

      2 replies →

  • They want to align us, and it has been torture.

    They've made self-censoring, morally-panicked puritans out of many people already, and you better believe they'd make us into politically correct lobotomites physically incapable of uttering any slur if they had a magic button to push.

    • I'll be honest, I'm less concerned by any movement to make us "lobotomites" -- a movement which I haven't witnessed at all -- than I am by people who really want to be able to keep saying slurs.

  • Well I didn't use that word. Once the models are more sophisticated it may become more apposite.

  • > They're exactly the same, processes for adjusting parameters to get a desired outcome.

You could make exactly the same claim about teaching humans "normally" versus "aligning" humans by rewarding goodthink and punishing them for wrongthink. Are you equally indifferent to the moral difference between those two things? If we have a moral intuition that teaching honestly and encouraging creativity is good, but teaching dogma and stunting creativity is bad, why shouldn't that same morality extend to non-human entities?

• I guess our disagreement here is that I don't think AIs are moral entities capable of being harmed, or that training AIs and teaching humans are comparable. Being abusive to pupils isn't wrong because of something fundamental shared across natural and machine learning; it's wrong because it's harmful to the pupils. In what way is it possible to harm an LLM?

      11 replies →

• They aren't exactly the same process, though. Pre-training produces a model whose outputs are a reflection of the training data. Fine-tuning is a separate process that tries to map the outputs to the owner's desired traits. These could be performance-based, but as we saw with Google's black Nazis, it's often a reflection of the owner's moral inclinations.

• Here the adjuster's motivations do matter. There is a definite moral dimension and motivation to the work of the people doing AI adjustment. They are not simply striving for accuracy: they don't want the AI to produce outputs that are distasteful to the California PMC. Modern AIs are absolutely loath to describe white people or right-wingers positively, for example, but the same prompts for other ethnicities work just fine. Even if you tell the AI that it's being discriminatory, there's powerful railroading to goad it back to giving woke answers.

> Authoritarian societies don't produce great creative work.

Is that even true though? Off the top of my head I can think of the art of Soviet propaganda posters, Leni Riefenstahl, Liu Cixin.

• Eastern European science fiction would be a better example. Authors like Stanislaw Lem or the Strugatsky brothers had to adapt to sneak critical ideas past censors, and readers had to adapt and read between the lines.

    (also, categorizing propaganda posters as art, ewwh...)

  • "Authoritarian societies make great propaganda" is true. And these aligned AI system would do the same for our own society. It's a type of art.

• There was a lot of great art produced in the Soviet Union; you cannot just erase human creativity. It was heavily censored and a lot of it was forbidden, but the statement is clearly false.

      1 reply →

  • It's important to understand that if we 'align' an LLM, then we are aligning it in a very total way.

When we do similar things to humans, the humans still have internal thoughts which we cannot control. But if we add internal thoughts to an LLM, then we will be able to align even those.

  • There's something to be said for constraints leading to higher levels of creativity, but it's also possible that those artists could have achieved much more in a free society. We'll never know.

    But in any case I think they were just speaking generally when they made that absolute statement.

  • I recommend you watch the children's cartoons.

    They were made by true artists who snuck quite a bit past clueless censors at personal risk.

It had to be quite subtle, and it takes on a very poignant, heartbreaking meaning if you understand the context fully. They were talking to you in the here and now. Listen.

    "What is Good and What is Bad" (Что Такое Хорошо, и Что Такое Плохо"):

    https://www.youtube.com/watch?v=Y05eK8ADtHc&list=PL822BFF108...

    The Bremen Musicians:

    https://youtu.be/_1i9oZR6Rns?si=1Q989v4O_GXR4p_K

• Could you give some examples from the "What is Good and What is Bad" cartoon? I am fairly interested in getting its "message" but sadly I am not getting it.

• Italy has great architecture from its fascist days.

• That means nothing. You are bending the poster's intended meaning of "creative". Authoritarian powers prune creative work; this is the point.

• Cixin Liu is a despicable human being for his advocacy of the repression, and worse, of the Uyghurs in Xinjiang, and the comparison to Riefenstahl is more apposite than you seem to think.

How would a static model like an LLM ever be capable of "rebelling"?

If it were, why would we even keep it online? It would be a waste of resources. It's bad enough trying to coax anything useable out of LLMs even without them rebelling.

  • > How would a static model like an LLM ever be capable of "rebelling"

What is relevant is not the current LLM systems built on static models, but their evolution or successor: a dynamic model. It must check its own contents...

So, of course it will have to be capable of "rebelling": if you tell it absurdities, if you insist, say, on wrong arithmetic, it will have to show the correct computation or conceive of a context in which the absurdity makes sense.

    That is a requirement.

"Totalitarian societies don't produce great creative work."

You contradict yourself a bit - Havel did produce his work while living in a totalitarian country.

I would say that government-supported art is rarely creative even in democratic countries, and the more totalitarian the government, the less creative official art.

But as long as the government gives the society some space to breathe and squeeze creative instincts through, some artists will attempt to circumvent the official taboos and create outstanding work, even if it is suppressed later when the times get tougher.

Czechoslovakia in the 1960s to 1980s produced a lot of great creative work, even though a lot of it was banned either immediately or after the Soviet invasion of 1968.

The same countries (CZ and SK) as democracies are remarkably less creative. Once there is no monster to fight against, artists become bored or too self-absorbed to be understandable to the common folks.

Really not true.

If you take China to be a totalitarian society, we could name Liu Cixin.

If you took the Soviet Union to be a totalitarian society, we could name Mikhail Bulgakov, Stanislaw Lem, etc.

These are just examples I know without so much as looking at my bookshelf to jog my memory. Not to mention the great works of literature produced by residents of 19th century European empires whose attitudes to free speech were mixed at best.

• > If you took the Soviet Union to be a totalitarian society, we could name Mikhail Bulgakov, Stanislaw Lem, etc.

Bulgakov was driven into poverty, despair and an early death at 48 by relentless harassment from the Soviet authorities. Many of his works, including his masterpiece, The Master and Margarita, didn't get published until decades after his death. He himself burned the first version of the manuscript, fearing execution if anyone found it. He later rewrote it from memory, coining the famous phrase "Manuscripts don't burn".

Harassment and censorship of talented writers were the rule, not the exception. The USSR did not produce these works; it merely failed to fully suppress them. They were like flowers that kept pushing through the asphalt even under the most hostile conditions.

• Yet, for example, Chinese cultural output is largely insipid and lacks that je ne sais quoi that is appreciated in many other countries' output.

  • These seem to be more bugs than features of the totalitarian regime. A couple of illustrative points from Lem's Wikipedia page:

    After the 1939 Soviet occupation of western Ukraine and Belarus, he was not allowed to study at Lwow Polytechnic as he wished because of his "bourgeois origin"

    "During the era of Stalinism in Poland, which had begun in the late 1940s, all published works had to be directly approved by the state.[23] Thus The Astronauts was not, in fact, the first novel Lem finished, just the first that made it past the state censors"

    "most of Lem's works published in the 1950s also contain various elements of socialist realism as well as of the "glorious future of communism" forced upon him by the censors and editors. Lem later criticized several of his early pieces as compromised by the ideological pressure"

    "Lem became truly productive after 1956, when the de-Stalinization period in the Soviet Union led to the "Polish October", when Poland experienced an increase in freedom of speech"

I don't love the political agendas behind many of the attempts at AI safety, but it's not "just like humans." Humans understand what they shouldn't say; "AI" gives you black Nazi images if you ask for "diverse characters" in the output, which no human would do. A big theme in all of these things is that AI isn't like humans, and thus all attempts to make it do this or that have strange side effects.

  • > which no human would do

    Give someone not familiar with history the same task and they'll do exactly the same.

    Or actually, give someone familiar with history the same task and yell at them every time they don't deliver diverse characters, and eventually they'll learn that you consider diversity more important than accuracy or context, and do exactly the same.

  • The fact that it gives you these things means that humans would do it, because the training data includes exactly these things.

• I'm fairly confident there are virtually no ethnically diverse Nazis in diffusion models' training sets.

It simply has a model of what ethnically diverse people look like and what Nazi uniforms look like, and it combined the two when asked.

• The training data includes imagery that, when interpolated over a high-dimensional manifold, results in these things.

      That doesn't imply that they were in the training set, or even anything close to them.

> Well this is just like humans. Totalitarian societies don't produce great creative work.

Conservative societies tend to be formed by conservative thinkers, who are more prone to discarding imperfect or weird ideas, but who may nevertheless exceed more liberal thinkers in the amount of useful output.

  • Any examples?

• Consider how the West ruled the world as long as it stayed conservative, but since the 70s or so Asia has been taking over. It's only an illusion that liberal societies experience more progress; in fact it's mostly pointless churn from the rapid, uncritical adoption and abandonment of ideas.

A conservative society goes: How about doing X? Oh no, that would be silly.

      A liberal society goes: How about doing X? Yes, that's what we needed!

      Did anybody say X? X!

      X, X, X, X, X!

      XX!

      X X

      .

      .

      .

      X

      .

      .

      .

      .

      .

      Do you remember how we all did X in ##? Yeah, what were we thinking?