
Comment by sakras

12 days ago

Is it an innate property of humans that the curly-brace style is more natural? I wonder if in an alternate universe where Lisp took off as the browser language people would find it more natural instead. It seems like somewhat of a chicken-egg problem.

I think it's innate that having differentiated syntax for different types of grouping is natural. Look at mathematical papers, where people will introduce new brackets with new meanings. (Indeed, look at the entirety of quantum mechanics, whose bra-ket notation is a clear, simple case.)

  • Some Scheme and Lisp dialects have that. For example, Racket often uses square brackets instead of parentheses for things like clauses of a cond expression, and Clojure uses square brackets for vector literals and curlies for hash map literals.

  • > "Look at mathematical papers where people will introduce new brackets with new meanings"

    Common Lisp leaves brackets like {} and [] to the user (a.k.a. the developer). It supports "reader macros", with which the user can extend or supersede the syntax of s-expressions.

    So specialized tools/libraries/applications can introduce these brackets for their own use. Examples are embedded SQL expressions, notations for frames (special objects, a kind of mix of OOP and logic), grammar terms, etc.

    Thus it explicitly supports the idea of "people will introduce new brackets with new meanings".
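
    As a concrete sketch of that extensibility, here is the classic reader-macro example that makes [a b c] read as (list a b c); bracket-reader is just an illustrative name:

      (defun bracket-reader (stream char)
        (declare (ignore char))
        ;; Read everything up to the matching ] as a list form.
        `(list ,@(read-delimited-list #\] stream t)))

      (set-macro-character #\[ #'bracket-reader)
      (set-macro-character #\] (get-macro-character #\)) nil)

      ;; Now [1 2 3] reads as (list 1 2 3) and evaluates to (1 2 3).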

Because Lisp is trivial to parse, it's easy to make a VS Code extension that shows and edits Common Lisp or Scheme looking completely different; you can make it unrecognisable. If the parentheses are the only thing bothering people, then this is very simple to resolve. They can hardly be the only thing, though. It's also easy to build operators similar to the ones you have in Python, like filter and map, so you don't have to write recursion by hand. These are there already, but you could build them yourself in a few hours.

So it's probably just what people learn first, plus a lack of 'marketing', or outright negative PR ("there are no libraries or ecosystem!", the thing that least bothered me about CL, though people with npm/leftpad experience seem bothered by it).

It's interesting, as I have worked in almost everything in production: C/C++ (including the MS 90s flavour), Delphi, VB, Perl, PHP, Java, C#, Haskell, F#, Common Lisp, Erlang, TS/JS, Python, Ruby, asm (Z80, ARM, x86), and I simply have not had a better overall experience than CL. The others are better at some things, but as an overall experience, CL is just a pleasure.

The curly braces themselves are 100% irrelevant, as evidenced by the many, many successful and well-liked languages which don't use them, including Python, which is in the running for the most-used language these days. They're an implementation detail.

What's closer to innate is the Algorithmic Language, Algol for short, the common ancestor of the vast majority of languages in common use (but not, notably, Lisps).

Algol was designed based on observational data of how programmers, who had to somehow turn their ideas into the assembler to run on machines, would write out those ideas. Before it was code, it was pseudocode, and the origins predate electronic computers: pseudocode was used to express algorithms to computers, when that was a profession rather than an object.

That pseudocode could have been anything, because it was just a way of working out what you then had to persuade the machine to do. But it gravitated toward a common vocabulary: control structures, assignment expressions, arithmetic written infix in PEMDAS style, subroutine calls written like functions, indexing with square brackets on both sides of an assignment, and so on. I revert to pseudocode frequently when I'm stuck on something, and get a lot of benefit from the practice.

So I do think that what's common in imperative languages captures something somewhat innate in the way programmers think about programs. Lisp was also a notation! And it fits the way some people think very well. But not the majority. I have some thoughts about why, and you can deduce an accurate sketch of them from what I chose to highlight in the previous paragraph.

  • > Algol was designed based on observational data of how programmers, who had to somehow turn their ideas into the assembler to run on machines, would write out those ideas. Before it was code, it was pseudocode, and the origins predate electronic computers: pseudocode was used to express algorithms to computers, when that was a profession rather than an object.

    I believe you, but do you have a source for this? I can't find papers on how they chose to develop the syntax of Algol in the beginning.

Do decades of empirical evidence not prove that people are more comfortable with imperative, curly-brace programming over s-expressions? It's not a chicken-and-egg problem. The egg has hatched, and nested parentheses lost.

  • You may be right, idk, but I want to point out that you’re conflating two orthogonal concepts: S-expressions and imperative vs. functional programming.

    There are Lisp dialects that are very imperative, for example Elisp, but they still use S-expressions. Historically they might have been considered "functional" because they have first-class functions and higher-order functions like mapcar, but nowadays practically every modern programming language (except Go!) has these.

    The thing all lisp dialects have in common is not where they land on the imperative vs. functional spectrum, but rather the fact that the syntax is trivial and so it’s easy to write powerful macros.
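
    A two-line illustration of that point: a hypothetical while macro any user could build, precisely because the syntax is just nested lists:

      (defmacro while (test &body body)
        ;; Expands into a plain LOOP that exits when TEST turns false.
        `(loop (unless ,test (return))
               ,@body))

      ;; (let ((i 0)) (while (< i 3) (print i) (incf i))) prints 0, 1, 2.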

    • I think the simple uniform syntax is the main reason why Lisp never became popular.

      Code is communication, and communication needs redundancy for error correction. You can see it in natural languages, and it makes sense to have it in programming languages as well. Using different kinds of syntax for expressing different ideas is an easy way to increase redundancy without making the code more verbose.


  • Decades of empirical evidence prove that people are more comfortable with functional, reactive, begin/end-delimited programming, i.e. Excel.

  • No, it doesn’t.

    What has happened in reality is that C became really popular, and then all the people designing languages they wanted to be popular (rather than experimental, boundary-pushing, etc.) naturally chose a syntax that was familiar to most programmers, i.e. a syntax like C's.

    Further, one can disprove that the syntax is particularly important by simply pointing to Python, which became immensely popular despite its lack of curly braces (and, worse, its significant whitespace), because colleges and bootcamps decided it would be a good language for teaching programming to beginners.

    • Arguably, Python and C are much more similar to each other than either is to a Lisp.

      I would argue the important part is the blocks in those two, which sort of get lost in the homogeneity of Lisps. Whether a block is marked with curly braces or indentation doesn't matter much; what matters is that it is visually distinct from a regular expression. Of course, well-formatted Lisp code indents as well, but there is still a lot of visual noise, making it harder to visually inspect the code, I would guess.

      Of course, familiarity with a given way is significantly more important. We all managed to learn the non-intuitive notation of written math; to Chinese people, their writing system is the intuitive one; etc.

  • Hmm, I can't find the paper (mostly clutter from language-bootcamp results), but around a decade or so back there was an education research project that concluded that teaching SQL first, rather than any imperative language (regardless of punctuation), was better for getting students to develop reasonable mental models of computing. (Unfortunately, without the reference I can't say what the criteria for "better" were; but "what people get paid to do" isn't really proof of comfort at any level...)

  • I think it's quite telling that almost all of the innovations in Lisp (garbage collection, first-class functions, the REPL, etc.) have been absorbed into more popular languages, except for s-expression syntax, which remains a small niche despite many great implementations of s-expression-based languages.

    • Because as soon as you adopt the s-expressions, what you've got is no longer <language> but Lisp itself. Something like this:

        #include <unistd.h>  /* for read() */

        static char _getch() {
          char buf;

          if (read(0, &buf, 1)) return buf;

          return '\0';
        }
      

      would become:

      (define _getch ()
        (declare static)
        (return-type 'char)
        (let ((buf (char)))
          (if (read 0 (& buf) 1)
            buf
            #\nul)))


  • Does over a century of empirical evidence not prove that people are more comfortable with keyboards whose top row is laid out "QWERTYUIOP"?

  • Has there ever been research on this? Perhaps this situation has come about because the schools people must go to to get the programming jobs only teach the JavaScript way? It seems like circular logic to say that the current paradigm must be superior because it is the current paradigm. Is it possible that there are other reasons it reached that status?

    • does n=1 count? :)

      Some time ago I tried Racket, and it was just no. Recently I tried Scala ZIO HTTP, and yes.

      Maybe it's the types? Maybe it's the parens. Probably both. I can't really recall the details of my experience, just that manipulating code was ridiculously clunky. My assumption was that the IDE would manage the parens for me, and that when I moved something somewhere it would figure out whether I'd messed up the parens... and... no, nothing. I had to balance them by hand.


  • No, because people who start in programming do not go to a syntax comfort clinic, where they are tested, and then assigned to a programming language.

  • It only proves that those languages are the most learned because they are the most popular in industry.

    It says nothing about what makes a language easy to learn.

I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning. Then, at a sufficient level of complexity, a programmer gravitates to solutions like OOP or FP, but there's an obvious trade-off in readability there. 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion, even though the latter is generally better. Lisp's inside-out parentheses style adds yet more cognitive load on top of that.
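
For concreteness, here is roughly how the two shapes compare in Common Lisp (print-verse is a hypothetical helper that prints one verse); judge the readability gap for yourself:

    ;; Imperative loop version:
    (loop for n from 99 downto 1
          do (print-verse n))

    ;; Tail-recursive version:
    (defun sing (n)
      (print-verse n)
      (when (> n 1)
        (sing (1- n))))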

Many things are socially constructed, but not everything.

  • > I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

    When 6.001 (the introductory class for which SICP was written) was launched, most of the students who took it had never used a computer before. Yes, MIT students. This was around 1980. And in the first hour of their first class they were already doing symbolic differentiation in Scheme.

    I think your view of what's "natural" is a just-so story.

    • > And in the first hour of their first class they were already doing symbolic differentiation in scheme.

      People heavily trained in maths can take quickly to languages designed to make programming look like maths, that's hardly a surprise.

      I wouldn't base my assumptions about what most people find natural on the experience of MIT students taking 6.001 in 1980.

      (Not to mention, 'doing' is doing a lot of heavy lifting in that sentence. I could show you an intricate sentence in French in the first hour of your first French class, but unless you came up with it yourself, are you demonstrating much learning just yet?)


    • Yeah, in the mid-1980s the thing incoming students had to unlearn was BASIC, not anything with curly braces. (Source: I was a 6.001 lab TA at the time.) Of course, the next class in the rotation used CLU, where you had to unlearn "recursion is free".

  • Imperative programming is probably the most intuitive, but I'm doubtful curly braces and C-like syntax are anything more than coincidence. The first programming language was Fortran, and it didn't look anything like C. This is a really old Fortran program copied from a book:

         WRITE(6,28)
         READ(5,31) LIMIT
         ALIM = LIMIT
       5 SUM=0.0
         DO 35 ICNT=1,LIMIT
         READ(5,32) X
      35 SUM = SUM + X
         AMEAN = SUM/ALIM
         WRITE(6,33) AMEAN
         GO TO 5
      28 FORMAT(1H1)
      31 FORMAT(I3)
      32 FORMAT(F5.2)
      33 FORMAT(8H MEAN = ,F8.2)
         END
    

    Most modern programming languages seem to take inspiration from C, which took inspiration from BCPL, and that from Algol. Others took inspiration from Algol directly, like Ada or Lua. And Python has indentation-based block structure, rather than blocks of statements delimited by braces or an "end" keyword.

    • I always liked Pascal's BEGIN and END keywords instead of curly braces. There is also Basic, where the blocks are built into the control-flow statements, like FOR I=1 TO 5 [code here] NEXT I.

      I'd argue a lot of programming-language evolution is influenced by the capabilities of our IDEs. When you code in a plain text editor, the terse syntax of C is great and brings advantages over the verbosity of Pascal, Basic or, god forbid, Cobol. Once your editor does auto-indentation, the braces seem redundant and you get Python. Smart completions from IntelliSense are essential to efficiently writing C#, and now that LSP has brought that to every IDE and smart text editor, we have the explosion in popularity of more explicit and more powerful type systems (TypeScript, typed Python, Rust). Programming languages are shaped by their environment, but the successful ones far outlive the environment that shaped them.

    • It really depends on your mindset. I grew up with math (composable operators, no side effects) and a lot of immutable + virtual-operations software (Maya, Samplitude, Shake, Combustion)... so to me imperative programming, with all its control flow, subtly changing state, time dependencies, and coupling of concerns, was almost instantly a fatal issue.

      Backus also shifted away from imperative-style languages to design the FP/FL languages (I thought they were contemporaries of BCPL, but they came 10 years later, even later than APL), even though he had led the development of FORTRAN.


  • > I would argue that imperative programming is most natural - it's what everyone gravitates to in the beginning.

    Why do you believe this is anything more than an historical accident?

    For example, it wasn't what Alonzo Church gravitated to when he invented the lambda calculus in the 1930s, before any programming languages or indeed general-purpose computers existed.

    > 99 Bottles of Beer implemented with a loop is intrinsically going to be easier to read than an implementation with tail recursion

    First, you don't need to use explicit tail recursion. See e.g. https://99-bottles-of-beer.net/language-haskell-1070.html

    Second, this sounds like unfamiliarity, not anything inherent. Why is it "intrinsically easier to read"? A tail-recursive version's main function would look like this in Haskell:

        _99bottles 0 = printVerse 0
        _99bottles n = do
            printVerse n
            _99bottles (n - 1)
    

    In fact, with a bit of experience you might write this as:

        _99bottles 0 = printVerse 0
        _99bottles n = printVerse n >> _99bottles (n - 1)
    

    It's only less easy to read if you're completely unfamiliar with the concepts of pattern matching and recursion. But the same is true of any programming language.

    Given the above, what's a "for loop" and why would you need one? Sounds complicated and unnatural.

  • OOP is imperative programming. It's just function calls where the first parameter is to the left of the function name, after all.

    A better name for "non-OOP" programming is procedural programming, where you organize code in long blocks that go straight down, accept some code duplication rather than jumping all over the place, etc. Honestly underrated. It can be quite easy to understand.

    Strictly-evaluated FP is also imperative. The only really different languages are the ones with different evaluation systems, or that can do things besides evaluate; people like to say Haskell is the best example here, but I think it's actually unification languages like Mercury. Maybe even SQL with transactions.

  • I'd argue an FP implementation with map (something like `[99..1].map(|n| f'{n} bottles of beer ... {n-1} bottles of beer on the wall').join('\n\n')`) is inherently as readable as the for loop, and not really more complex.

    There are lots of great parts in FP, and for the last ~10-15 years imperative programming languages have made a lot of effort to add them to their syntax. You just need to leave out the more dogmatic parts that make FP popular in academia.
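
    In Common Lisp that sketch might look like this (eliding the same verse details as the pseudocode above):

      (format t "~{~a~^~%~%~}"
              (mapcar (lambda (n)
                        (format nil "~a bottles of beer ... ~a bottles of beer on the wall"
                                n (1- n)))
                      (loop for n from 99 downto 1 collect n)))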

    • Hehe, it's easy if you ignore half the song: the singular for n=1 and the whole n=0 case! (Not that we're talking about rocket science if you don't, but c'mon, oranges to oranges!)

      I agree with you otherwise though.

  • Assembly is imperative, so there's a lot to be said for a language that mimics how the computer actually works. Lisps always leave me saying, "oh, that's clever."

  • > even though the latter is generally better

    Why is tail recursion generally better? I'm not very familiar with FP, but it feels like loops more closely resemble the way computers execute them than tail recursion does.

    • It's more concise and illustrates the common subproblem. Loops make you model a state machine in your head, which I'd rather leave to the computer.

    • Very fair question. It may seem surprising, but loops (especially `for` loops) don't really reflect the underlying machine code very well. There is no distinct loop concept in machine code; instead, there are ordinary instructions followed by conditional jumps (if {predicate} go to {memory address} and keep on executing from there), which may or may not return to an earlier point, and will thereby conditionally repeat. Tail recursion, provided it's done in a compiler that understands tail recursion optimisation, will in some ways mirror this better than a `for` loop. (Though an old school, C-style 'do {code} while {predicate}' - note the order, and lack of any loop state variables being created or modified - is closest to the machine code).
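
      Incidentally, Common Lisp exposes exactly this jump model via tagbody/go; a sketch of a do-while written that way (do-something and keep-going-p are hypothetical):

        (tagbody
         top
           (do-something)         ; the loop body runs at least once
           (when (keep-going-p)
             (go top)))           ; conditional jump back, i.e. do-while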

      Loops, while not bad per se, do have a lot of foot-guns. Loops tend to be used to make all sorts of non-trivial changes to outside state (it's all still in scope), and it can be nightmarish to debug errors that this may produce. Let's say you're looping over chickens in your upcoming Hen Simulator 2024, and you call a function from inside your chicken loop to update the henhouse temperature, which has a check to see if the temperature has gotten too high, which might result in a chicken overheating and passing on into the great farm in the sky, which changes the amount of chickens remaining, but wait, isn't that what you're looping over? Uh oh, your innocuous temperature update has caused a buffer overflow and hard crash. In a rare and possibly hard to reproduce case. Have fun debugging!

      Generally, functional programming prefers encapsulated solutions - arguments go in, results come out, nothing else happens - which makes it easier to reason about your code. The most common replacement for loops is something like map, which just applies a lambda to each member of a list. This should make it somewhat harder to achieve the mess above (the other chickens shouldn't be in scope at all, so your temperature update function should complain at compile time).

      With tail recursion, you could make a function that takes a list of chickens to update. You pop the first chicken, update it, and recur on a list of the remainder of the chickens. Because this needs to be a function (so you can recur), you have control of the arguments, and can determine what exactly is passed to the next iteration. You can't overflow the buffer, because you're passing a new 'remaining' list every time. This is also where you can get a little clever - you can safely change the list at will. You can remove upcoming chickens, you can reorder them, you can push a new chicken into the list, etc. If a hen lays an egg mid-loop, it can be updated as part of the same loop. Plus you have the same scope safety as you do with map - you can't do anything too messy to the outside state, unless you specifically bring it in as an argument to the function (which is a red flag and your warning that you're doing something messy with state).
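
      A minimal Common Lisp sketch of that last pattern (update-chicken, lays-egg-p, and make-chick are hypothetical helpers):

        (defun update-chickens (chickens &optional (done '()))
          (if (null chickens)
              (nreverse done)
              (let* ((hen (update-chicken (first chickens)))
                     ;; A newly laid egg can safely join the work list mid-loop:
                     (remaining (if (lays-egg-p hen)
                                    (cons (make-chick) (rest chickens))
                                    (rest chickens))))
                (update-chickens remaining (cons hen done)))))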

  • This is completely socially constructed.

    Lisp was once a very popular introductory programming language, and students learned it just as easily as, or more easily than, any other language.

Then why does this web page use indentation to clarify who's replying to whom, instead of {}s?

I don't know if it's innate, but it's what we have. Lisp has been around about as long as programming; it's had plenty of time to catch on, and it hasn't.

Maybe it's innate; maybe it's an offshoot of teaching math in an infix style: 1 + 2 vs. + 1 2.

  • I don't think it's been tested at all. Of the people who took and finished a course in Lisp as their first programming language, how many "hate parens"?

    I have no trouble with Lisp's parens; I like them. What I never liked, though, is that the first item in the list is an operator, a verb let's say, and the rest are the nouns; whereas you can also have nested lists, say of numbers, where there are no operators. It never felt right (not that I can think of a better way; it's not worth adding more parens).

  • But good college math departments teach reverse Polish notation, i.e., Hewlett-Packard over Texas Instruments. It's demonstrably more advanced/efficient.

  • Lisp became very popular, then died off rapidly due to its association with the AI Winter.

C-like syntax is brutally hostile to programming beginners. There is not a shred of anything natural about it.

  • There's nothing natural about programming, because no one likes to be that formalized in their thinking process, especially about things that should be common sense (although ambiguity is still an issue). It's the recursive definitions that get people, especially when you point out that the things a computer can do form a very small set; it's just that it does them very fast.

    You can see this when observing novices programming (without Stack Overflow or similar help). They often assume that something will get done (magically) as soon as they call that function. And their code organization reflects ad hoc thinking instead of a planned endeavor.

    • The formality of programming is unnatural to many (though the impulse toward formality and robustness, as displayed in logic and math, is clearly millennia old, however niche), but a fair bit of it is as natural as language itself, or magic.

That alternative universe was the early 1980s, when Lisp was very popular to learn due to being considered the best language for AI.