
Comment by troad

14 days ago

I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning. Then, at a sufficient level of complexity, a programmer gravitates to solutions like OOP or FP, but there's an obvious trade-off in readability there. 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion, even though the latter is generally better. Lisp's inside-out parentheses style adds yet more cognitive load on top of that.
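For illustration, the two styles being compared might look like this (a minimal Python sketch; the function names and trimmed verse text are my own, and Python doesn't eliminate tail calls, so this shows the shape of the code rather than a real FP implementation):

```python
def bottles_loop(n):
    # Imperative version: a counter mutates in place.
    verses = []
    while n > 0:
        verses.append(f"{n} bottles of beer on the wall")
        n -= 1
    return verses


def bottles_rec(n, verses=None):
    # Tail-recursive version: the loop state travels as arguments.
    if verses is None:
        verses = []
    if n == 0:
        return verses
    return bottles_rec(n - 1, verses + [f"{n} bottles of beer on the wall"])
```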

Many things are socially constructed, but not everything.

> I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

When 6.001 (the introductory class for which SICP was written) was launched, most of the students who took it had never used a computer before. Yes, MIT students. This was around 1980. And in the first hour of their first class they were already doing symbolic differentiation in Scheme.

I think your view of what’s “natural” is a just-so story.

  • > And in the first hour of their first class they were already doing symbolic differentiation in scheme.

    People heavily trained in maths can take quickly to languages designed to make programming look like maths, that's hardly a surprise.

    I wouldn't base my assumptions about what most people find natural on the experience of MIT students taking 6.001 in 1980.

    (Not to mention, 'doing' is doing a lot of heavy lifting in that sentence. I could show you an intricate sentence in French in the first hour of your first French class, but unless you came up with it yourself, are you demonstrating much learning just yet?)

  • Yeah, in the mid-1980s the thing incoming students had to unlearn was BASIC, not anything with curly braces. (Source: I was a 6.001 Lab TA at the time.) Of course, the next class on the rotation used CLU, where you had to unlearn "recursion is free".

Imperative programming is probably the most intuitive, but I'm doubtful curly braces and C-like syntax are anything more than coincidence. The first programming language was Fortran, and it didn't look anything like C. This is a really old Fortran program copied from a book:

     WRITE(6,28)
     READ(5,31) LIMIT
     ALIM = LIMIT
   5 SUM=0.0
     DO 35 ICNT=1,LIMIT
     READ(5,32) X
  35 SUM = SUM + X
     AMEAN = SUM/ALIM
     WRITE(6,33) AMEAN
     GO TO 5
  28 FORMAT(1H1)
  31 FORMAT(I3)
  32 FORMAT(F5.2)
  33 FORMAT(8H MEAN = ,F8.2)
     END

Most modern programming languages seem to take inspiration from C, which took inspiration from BCPL, and that from Algol. Others took inspiration from Algol directly, like Ada, or Lua. And Python has indentation-based block structure, rather than having blocks of statements delimited by braces or an "end" keyword.

  • I always liked Pascal's BEGIN and END statements instead of curly braces. There is also Basic, where the blocks are built into control flow statements, like FOR I=1 TO 5 [code here] NEXT I

    I'd argue a lot of programming language evolution is influenced by the capabilities of our IDEs. When you code in a text editor, the terse syntax of C is great and brings advantages over the verbosity of Pascal, Basic or god forbid Cobol. Once your editor does auto-indentation the braces seem redundant and you get Python. Smart completions from IntelliSense are essential to efficiently writing C#, and now that LSP has brought that to every IDE or smart text editor we have the explosion of popularity of more explicit and more powerful type systems (Typescript, typed Python, Rust). Programming languages are shaped by their environment, but the successful ones far outlive the environment that shaped them.

  • It really depends on your mindset. I grew up with math (composable operators, no side effects) and a lot of immutable + virtual operations software (maya, samplitude, shake, combustion) ... so to me imperative programming, with all the control flow, subtly changing state, time dependencies, and coupling of concerns, was almost instantly a fatal issue.

    Backus also shifted away from imperative-inspired languages to design the FP/FL languages (I thought they were contemporaries of BCPL, but they came about 10 years later, later than APL), even though he contributed to FORTRAN directly.

    • You know, you're probably right. It's just been so long since I was introduced to programming languages that I had almost forgotten (though I'm probably younger than you).

      I remember learning JavaScript as a kid (for some class) and trying to get used to the mutable variables, having to mutter to myself "Okay, here, let x be 4. After this line, x is x + 1, which is 5, a new value." From there, eventually thinking things like: "After every loop, x changes to be itself plus 1. So after the loop, x will be its value before the loop plus however many times the loop ran." Things like that. Basically informal Hoare logic without realizing it.

      I had almost forgotten, because I then went years before I programmed again, and the language I learned was C, which was probably easier because I was already familiar with while loops and mutable variables.

      Maybe it would have been equally intuitive to learn a functional language first. It's probably no more intuitive to mutter that under your breath versus stuff about the type system and equational reasoning.

      On the other hand, it seems easier to get beginners interested in programming with an imperative approach. In our assignments in that class using JavaScript, we used libraries to make little games, which imperative programming seems better-suited for.
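      That muttered reasoning is essentially a loop invariant, and it can be written down directly (a hypothetical Python sketch of my own, not from the class):

```python
def run_loop(x, times):
    # Precondition: x == x0.
    x0 = x
    for _ in range(times):
        x = x + 1  # "after this line, x is x + 1, a new value"
    # Postcondition: x is its value before the loop,
    # plus however many times the loop ran.
    assert x == x0 + times
    return x
```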

> I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

Why do you believe this is anything more than an historical accident?

For example, it wasn't what Alonzo Church gravitated to when he invented the lambda calculus in the 1930s, before any programming languages or indeed general-purpose computers existed.

> 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion

First, you don't need to use explicit tail recursion. See e.g. https://99-bottles-of-beer.net/language-haskell-1070.html

Second, this sounds like unfamiliarity, not anything inherent. Why is it "intrinsically easier to read"? For a tail recursive version, the main tail recursive function would look like this in Haskell:

    _99bottles 0 = printVerse 0
    _99bottles n = do
        printVerse n
        _99bottles (n - 1)

In fact, with a bit of experience you might write this as:

    _99bottles 0 = printVerse 0
    _99bottles n = printVerse n >> _99bottles (n - 1)

It's only less easy to read if you're completely unfamiliar with the concepts of pattern matching and recursion. But the same is true of any programming language.

Given the above, what's a "for loop" and why would you need one? Sounds complicated and unnatural.

OOP is imperative programming. It's just function calls where the first parameter is to the left of the function name, after all.

A better name for "non-OOP" programming is procedural programming, where you organize code in long blocks that go straight down, code duplication is accepted vs jumping all over the place, etc. Honestly underrated. It can be quite easy to understand.

Strictly-evaluated FP is also imperative. The only really different languages are the ones with different evaluation systems or that can do things besides evaluate - people like to say Haskell is the best here but I think it's actually unification languages like Mercury. Maybe even SQL with transactions.

I'd argue a FP implementation with map (something like `[99..1].map(|n| f'{n} bottles of beer ... {n-1} bottles of beer on the wall').join('\n\n')`) is inherently as readable as the for loop, and not really more complex.
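That one-liner translates almost directly into runnable Python (a sketch of my own; like the original, it skips the n=1 singular and the n=0 verse):

```python
# A generator expression plays the role of map; range(99, 0, -1) is [99..1].
verses = "\n\n".join(
    f"{n} bottles of beer ... {n - 1} bottles of beer on the wall"
    for n in range(99, 0, -1)
)
```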

There are lots of great parts in FP, and for the last ~10-15 years imperative programming languages have made a lot of effort to add them to their syntax. You just need to leave out the more dogmatic parts that make FP popular in academia.

  • Hehe, it's easy if you ignore half the song, the singular for n=1 and the whole n=0 case! (Not that we're talking about rocket science if you don't, but c'mon, oranges to oranges!)

    I agree with you otherwise though.

Assembly is imperative, so there's a lot to be said for a language that mimics how the computer actually works. Lisps always leave me saying, "oh, that's clever."

> even though the latter is generally better

Why is tail recursion better generally? I'm not very familiar with FP, but it feels like loops more closely resemble the way computers execute them than tail recursion does.

  • It's more concise and illustrates the common subproblem. Loops make you model a state machine in your head, which I'd rather leave to the computer.

  • Very fair question. It may seem surprising, but loops (especially `for` loops) don't really reflect the underlying machine code very well. There is no distinct loop concept in machine code; instead, there are ordinary instructions followed by conditional jumps (if {predicate} go to {memory address} and keep on executing from there), which may or may not return to an earlier point, and will thereby conditionally repeat. Tail recursion, provided it's done in a compiler that understands tail recursion optimisation, will in some ways mirror this better than a `for` loop. (Though an old school, C-style 'do {code} while {predicate}' - note the order, and lack of any loop state variables being created or modified - is closest to the machine code).

    Loops, while not bad per se, do have a lot of foot-guns. Loops tend to be used to make all sorts of non-trivial changes to outside state (it's all still in scope), and it can be nightmarish to debug errors that this may produce. Let's say you're looping over chickens in your upcoming Hen Simulator 2024, and you call a function from inside your chicken loop to update the henhouse temperature, which has a check to see if the temperature has gotten too high, which might result in a chicken overheating and passing on into the great farm in the sky, which changes the number of chickens remaining, but wait, isn't that what you're looping over? Uh oh, your innocuous temperature update has caused a buffer overflow and hard crash. In a rare and possibly hard to reproduce case. Have fun debugging!

    Generally, functional programming prefers encapsulated solutions - arguments go in, results come out, nothing else happens - which makes it easier to reason about your code. The most common replacement for loops is something like map, which just applies a lambda to each member of a list. This should make it somewhat harder to achieve the mess above (the other chickens shouldn't be in scope at all, so your temperature update function should complain at compile time).

    With tail recursion, you could make a function that takes a list of chickens to update. You pop the first chicken, update it, and recur on a list of the remainder of the chickens. Because this needs to be a function (so you can recur), you have control of the arguments, and can determine what exactly is passed to the next iteration. You can't overflow the buffer, because you're passing a new 'remaining' list every time. This is also where you can get a little clever - you can safely change the list at will. You can remove upcoming chickens, you can reorder them, you can push a new chicken into the list, etc. If a hen lays an egg mid-loop, it can be updated as part of the same loop. Plus you have the same scope safety as you do with map - you can't do anything too messy to the outside state, unless you specifically bring it in as an argument to the function (which is a red flag and your warning that you're doing something messy with state).
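    The pattern described above might look like this (a hypothetical Python sketch with a toy Chicken type of my own; Python doesn't optimise tail calls, so a production version would use a loop, but the shape is what matters):

```python
from dataclasses import dataclass


@dataclass
class Chicken:
    name: str
    temp: float


def update_flock(chickens, updated=()):
    # Pop the first chicken, update it, recur on the remainder.
    # The "loop state" is exactly what we pass as arguments - the
    # 'remaining' list is a fresh one each call, so nothing outside
    # the function can be corrupted mid-traversal.
    if not chickens:
        return list(updated)
    first, rest = chickens[0], list(chickens[1:])
    first = Chicken(first.name, first.temp + 0.1)  # the per-chicken update
    if first.temp > 40.0:
        # Overheated: drop this chicken without touching any outside state.
        return update_flock(rest, updated)
    # If a hen laid an egg mid-loop, we could safely push a new
    # Chicken onto 'rest' here and it would join the same traversal.
    return update_flock(rest, (*updated, first))
```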

This is completely socially constructed.

Lisp was once a very popular introductory programming language and students learned it just as easily or easier than any other language.