
Comment by aag

3 months ago

I wouldn't have spoken up except for this comment. As a freshman, I took 6.001, the MIT course that Structure and Interpretation of Computer Programs was based on, and I loved it. As a graduate student, I taught 6.001 three times, twice as head TA under Prof. Sussman and once under Prof. Abelson. In addition to helping make up problem sets, quizzes, and exams, my responsibilities included teaching seven or eight one-hour, five-student tutorial sessions per week as well as teaching a forty-student recitation section once per semester. I graded assignments as well. My point is that I have a lot of experience with what students found challenging in the course.

Prof. Harvey's claim rings completely true to me. Students understood the syntax quickly, and spent little time on it. It was not a point of frequent confusion. There were plenty of difficult concepts in the course, but the details of the programming language were not, for most students, among them.

Students who already had programming experience when they started the course often had more trouble than inexperienced students, mostly because they had to unlearn imperative habits: the imperative features of the language, except for I/O, weren't used until late in the course.

SICP covers a huge breadth of material, from basic computational ideas to algorithms and data structures to interpreters and compilers to query languages to concurrency, and does it in an entertaining and challenging way. Even decades later, I find myself pulling ideas from it in my daily programming work.

I worked at Google for almost twelve years, and I can't count the times I found myself muttering, when reading a design document, "I wish this person had read SICP."

I'm certainly biased, but I would encourage anyone who would like to become a better software engineer to read SICP and study it carefully. Take your time with it, but do read it.

I've had an introductory Scheme course in a smaller university, and have experience designing data structures, creating parsers & interpreters, and with multi-threading and networking.

I was never one to really dig Lisp. I prefer the structure and the groundedness of a statically typed systems language (I mostly do systems work). But I took on reading SICP in the hope of finding something new and interesting, and of leveling up my skills. However, I got bored by it. I probably made it through more than half of the book.

It's a bummer, because I'm left with the feeling of missing out. Am I not worthy, or too obtuse to get what's so great about the book? Or maybe I'm simply not the target audience, with too much practical experience for the book to be worth my while.

  • If you're comfortable writing interpreters you've probably already picked up on most of the "big ideas" SICP is trying to teach. It's a very good introductory book, but it is still just an introduction.

Honestly, the GP is making two very valid points though.

Something Clojure does is differentiate between () for lists of calls, [] for vectors (the go-to sequential data structure), and {} for maps. This definitely helps the eye pick out structure at a cursory glance. It's a little more syntax than Scheme has, but the tradeoff seems worthwhile.
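A rough analogue of that point can be sketched in Python, which also uses distinct delimiters for distinct structures (the data here is invented for illustration):

```python
# Uniform, Scheme-like nesting: every structure is the same shape,
# so the reader must parse the head of each list to know what it is.
uniform = ["person", ["name", "Ada"], ["langs", ["scheme", "clojure"]]]

# Distinct delimiters (as in Clojure's [] and {}): the kind of each
# structure is visible at a glance before reading any contents.
distinct = {"name": "Ada", "langs": ["scheme", "clojure"]}

print(distinct["langs"][1])  # → clojure
```

Both values encode the same information; the second just carries more of its structure in the notation.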

Secondly, I think it's very healthy to be wary of indirection and abstraction. I'm not sure I agree with the tone and the generalization about modernism, but I think there's a burden of proof, so to speak, when it comes to adding abstractions, especially over the long term.

  • I think Scheme works well for the kind of conceptual overview the course is trying to provide. I think there is something to the argument that Scheme syntax is not ideal for readability of larger programs, but I would wager that the bigger reason some students find SICP confusing is the same reason it blows others’ minds - the whole approach is at a higher level of abstraction than most “intro to programming” classes.

  • Yes, I agree these are two good points. I also taught SICP, and I would say the GP's overall position is incorrect and leads to a less profound understanding of programming.

> Take your time with it, but do read it.

And do the harder exercises. Really do them; don't just read one, tell yourself you understand how to do it, and move on.

Thanks for speaking up. At this point no one is really presenting any evidence, so it's a necessary evil to offset the Lisp slander, even if it is, like the parent comment, not much more than an appeal to authority/popularity.

No syntax is inherently natural or unnatural to humans, but it's a fact that fewer symbols are easier to memorize than more. The problem is a failure to launch. Some people never truly understand that it's not just syntax, it's semantics. Data is code, code is data. That's why it all looks the same. The artificial distinction in "C-like languages" is more harmful to the enlightened programmer than it is helpful. Unfortunately, not everyone who reads SICP experiences enlightenment the first time (or ever, I guess?)
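The "data is code" idea can be sketched even outside Lisp. A minimal illustration in Python (the evaluator and its names are hypothetical, not from any library): an S-expression is just a nested list, and evaluating it is ordinary list traversal.

```python
# A Lisp expression such as (+ 1 (* 2 3)) can be held as plain data:
# a nested list whose first element names the operation.
import operator

OPS = {"+": operator.add, "-": operator.sub, "*": operator.mul}

def evaluate(expr):
    """Evaluate an S-expression represented as a nested Python list."""
    if isinstance(expr, (int, float)):      # atoms evaluate to themselves
        return expr
    op, *args = expr                        # the head names the operator
    return OPS[op](*[evaluate(a) for a in args])

program = ["+", 1, ["*", 2, 3]]             # code, stored as data
print(evaluate(program))                    # → 7
```

Because the program is an ordinary list, other code can build or rewrite it before evaluation, which is exactly the uniformity the parent is pointing at.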

  • Information hierarchies are empirically important and are an essential part of communications design. Uniform syntax makes information hierarchies harder to parse, because the boundaries around different types of information all look the same. It's the same reason we have different sized headings, bold text, etc. They are distinct markers.

    So yes, fewer symbols means easier memorization, but you could take that to the extreme and you'll find that binary is harder to read than assembly.

    I think Lisp is really elegant, and the power to treat a program as a data structure is very cool. But scanning Lisp programs visually always takes me a little more effort than most other languages.

    • My impression has been that people complaining about Lisp's parentheses are complaining about them because they are the most obvious difference between Lisp and other languages, but that they're not what is actually causing them problems. It's the functional approach, where everything is in some sense just algebra, that really throws people off. Of course I can't see inside people's minds, but whenever I discuss this with someone for long enough, that's the impression I get.

      Parentheses are just a scapegoat.


  • I really want to like the idea you're describing; however, I've found that in practice there absolutely is a difference between code, data, and types. I mean, they literally live in different sections of a process. If you design a program that runs on a real machine, and you spend a lot of time thinking about what the program should do and how it can put the system's limited resources to good use, you absolutely need to think about code and data separately. Mostly think about data, really.

    The one area where "code is data" remains a nice idea in my mind is metaprogramming. And whenever I've done more than small doses of metaprogramming, I've come to regret it later, no matter what the language was. (Small doses of metaprogramming can be done even in statically typed, AOT-compiled languages without RTTI.)

    The reason, I think, is that the basic data structures and simple procedures built into a language let you express almost everything you need in a very direct manner. The distinct concepts you come up with as a programmer can usually be defined directly in the base language. Metaprograms don't create new concepts as such; they're only code run in a different phase. There is definitely a case for generic/templated data structures, but it seems best to use them sparingly and judiciously. Be wary of them duplicating a lot of code and bloating or slowing down your system at compile time and/or runtime.
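As a concrete version of "basic data structures and simple procedures" covering what one might otherwise reach for metaprogramming to do, here is a hedged Python sketch (all names are invented): a plain dispatch table replaces generated per-type code.

```python
# Instead of generating a parser per field type (metaprogramming),
# a plain dict of small functions expresses the same idea directly.
def parse_int(raw):
    return int(raw)

def parse_bool(raw):
    return raw.lower() in ("1", "true", "yes")

PARSERS = {"int": parse_int, "bool": parse_bool}

def parse_field(kind, raw):
    # Data (the dict) selects the code; no code generation involved.
    return PARSERS[kind](raw)

print(parse_field("int", "42"))     # → 42
print(parse_field("bool", "yes"))   # → True
```

Adding a new field type is one more dict entry, defined in the base language, with no separate compile-time phase to reason about.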