MIT 6.001 Structure and Interpretation (1986) [video] (youtube.com)
256 points by _ghm2 on June 22, 2020 | 76 comments


If you want the modern, grad-level version of an MIT scheme class, I highly recommend Sussman's 6.945. I don't know if there's an open courseware version but the entire class is centered around psets, each of which starts with a base environment provided by Sussman and the teaching staff, and which you are required to work in / extend. The class itself was extremely entertaining: you watch Sussman walk back and forth between two transparency projectors, switching the slide on each one as he arrives. Yes, you got very good at reading scheme code very quickly while listening to him talk at the same time.

I understand why MIT decided to make the introductory CS classes more accessible -- they wanted to expand the CS department beyond just those who had already had programming experience by the time they got there, I think? It makes a lot of sense and frankly I found the "replacement" classes pretty great, even while I was taking this class at the same time. Instead of going on at length about how much of a shame it is that MIT "got rid of its scheme class", consider 6.945.

This was one of the handful of classes that I felt were worth the price of admission; I've written code differently ever since.

http://groups.csail.mit.edu/mac/users/gjs/6.945/


There's a book that GJS and Chris Hanson are writing, based on the course:

https://www.penguinrandomhouse.com/books/669475/software-des...

It should be coming out soon. I'd expect it to be available online as well.


Thank you for this link, I didn't know they were writing a book but I just pre-ordered it!


> I understand why MIT decided to make the introductory CS classes more accessible -- they wanted to expand the CS department beyond just those who had already had programming experience by the time they got there...

Not sure that motivation really applies, as back when 6.001 was introduced most MIT freshmen had not programmed a computer before arriving. Also, the course was taught a few times to non-course-6 faculty who wanted it — I remember helping a course 21 (history) prof with psets, and he had previously never even used a computer.

Also making things “accessible”* isn’t really an MIT thing, or wasn’t back when I was there anyway.

* in the sense used in your comment I mean


Like all big changes, I bet it happened for many reasons, and which one is considered the primary driver depends on who you ask. I can tell you that I based this on recollections of talking with the 6.01 instructor when I took it, as well as talking with Abelson and Sussman about it in passing, but also that my memory of that time in my life is extremely fuzzy and should be taken with a grain of salt.


Can't find the link, but as I recall, in one of the interviews Hal Abelson explained the reason behind switching to Python: SICP/Scheme was good when systems were small and a single person could understand/create whole systems. Nowadays, systems are huge, with lots of parts/components, and programmers mostly write glue code.


Maybe you're thinking of this?

http://wingolog.org/archives/2009/03/24/international-lisp-c...

Search for "Sussman."


Yeah, seems like it, thanks! The video should be out there somewhere.


Here are video recordings of similar material, but taught at Google in 2009 by Sussman and Hanson:

https://archive.org/details/adventures-in-advanced-symbolic-...


Whoah, thank you for this, from skimming the lecture videos this seems very close to the class as I remember it.


I re-watch this course every bunch of years, and every time I get something different from it. But mostly it just makes me happy - it's infectious and weird. Especially the "revelation" (spoiler!) https://youtu.be/aAlR3cezPJg?t=2088


RIP. I think I was one of the last years to take this wonderful class. I guess it would have been in 2004 or 2005, since I took it as an elective (being an Econ major) on the strength of its reputation. It was hard but so interesting!

Since then MIT has switched its intro CS class to be python based. I suppose that makes a little more practical sense but there's something magical about Scheme that I fear they're missing out on.


I remember being a freshman and seeing the lecture hall 32-123 being temporarily hack-renamed to 6.001, and I didn't know what that was all about. I think it was the last lecture of 6.001, but I didn't know what it was back then.


I tried to go through SICP and stopped short at Chapter 2. This time around, I used Brian Harvey's CS61A lectures and I'm almost done. He does a good job teaching you the book as well as the content in the book if that makes sense. It's worth a watch if you feel stuck in SICP: https://www.youtube.com/watch?v=4leZ1Ca4f0g&list=PLhMnuBfGeC...


SICP is my quarantine project, going through the whole thing taking my own notes in org-mode, doing the exercises, and checking with https://github.com/zv/SICP-guile as I go along. It's been fun!

I'm on chapter 2 now; if you want to form a study group, HMU at <username> at gmail!


Thank you. Having mentioned this, I wonder if there are similar-in-spirit lectures for Common Lisp? PG has written two books on CL, and I've heard of some Industrial Strength apps written in CL. So, if starting today, it seems like maybe CL is the Right Choice?


I'm not sure about CL lectures in general.

In regards to applying CL to SICP, I would argue that the course is more about the underlying concepts than the language itself. You could probably read a CL reference and write your exercise code in CL instead of Scheme, or just learn CL after the course and it wouldn't be a big leap.


As someone who actually ditched the Scheme version and went for and completed the Python one (and Python version is way easier), I'm actually disappointed to see that MIT dropped the Scheme version 15 years ago.

I mean it's MIT, and MIT is supposed to have the sharpest minds and the most difficult courses. And the Scheme version, albeit that I dropped it due to difficulty, DOES have a certain "flavor" that I did enjoy. Can't say what it is though.


I've heard this among other Course 6 students and alumni, and I think I understand your point of view, but I respectfully disagree.

Python is relevant to 90% of the jobs to which I've applied in my lifetime. In contrast, I've never seen any developer or manager mention Scheme seriously in a professional setting (just my honest experience), nor have I ever been asked in an interview about my experience with Scheme.

Python is extremely versatile, and works very well with many different design patterns. It offers a gentle-enough learning curve that you can teach it in the first part of a course, then turn around and use it as a platform to teach concepts in the second half of the course. 6.01 does this with robotics, for example.

MIT is often difficult for pointless reasons. Some of the professors are brilliant engineers but piss-poor educators. TA's don't all have the tact or capacity to tutor students about basic concepts. Python's simplicity, ubiquity, and large online community make it a perfect language for self-study, and time-constrained self-study is often a hallmark of an MIT education.

Most direct contact with educators in 6.01 doesn't involve the professor, but rather some other student who took the course a semester or two ago. For example, one idiot 6.01 TA of mine deleted two hours' worth of my code from the course's Python environment (no CTRL+Z at the time) when I asked him to help me understand a concept, just because he didn't like something about the order in which I had written some computations. Then he said "do it again" and walked away. (I think he was from Russia; not sure if that's how education works over there.)

If that happens to you in a well-documented, community-supported language like Python or Java, you can go to someone else for help. With Scheme, you're kind of left without resources.

I'm all for challenges, but they should be reasonable and realistic, not contrived nor quirky-for-quirk's-sake.

Unlike you, I never took the 6.001 version, so I admit that I have a limited vantage point. But I think my logic here is not too far off.


I agree. I remember back then hating that 6.001 wasn't taught in a "practical" programming language, but now I completely see the value (and perhaps also magical aura!) of Scheme.


Perhaps simplicity? I find that Scheme makes it very easy to see the bare patterns.


I think it's the way it teaches you to build things from ground-up and forces you to understand the principles. I'm reading CSAPP 3rd chapter and it gives me the same feeling.


That's exactly why 6.001 was dropped. There is no building from the ground up anymore; virtually all software development is modifying or extending something that already exists to fit some application.


I guess it's reasonable, but as a hobbyist I have the advantage to ignore such constraints (to seek a job), and such courses/books do seem to be attractive.


My impression is that dropping Scheme is something that some students have demanded for a long time, maybe even since the beginning. Back when I took 6.001 (in the 90s) there were certainly fellow Course 6ers who'd rather learn more "practical" languages like C/C++. But the fact is, Sussman could teach all the Scheme one needs for the semester in one lecture (back then that was R4RS, minus continuations and macros), whereas C would be too low level and C++, well, we'd spend half the term just jumping through syntactic hoops.


How did they do the metacircular evaluator parts of the book in Python? That is one of the parts where the homoiconicity of Scheme really shines. Python just doesn't have that.
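(For what it's worth, the list-processing half of the trick can be faked in Python by treating nested lists as expressions. This is purely my own toy sketch with invented names, not anything from the actual course, and it skips `define`, `quote`, and everything else a real metacircular evaluator needs:)

```python
import operator

def evaluate(expr, env):
    """Evaluate a nested-list 'Scheme-ish' expression in an environment dict."""
    if isinstance(expr, (int, float)):      # self-evaluating
        return expr
    if isinstance(expr, str):               # variable lookup
        return env[expr]
    op, *args = expr
    if op == "if":                          # special form: evaluate lazily
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                      # build a closure over env
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                  # application
    return fn(*(evaluate(a, env) for a in args))

global_env = {"+": operator.add, "*": operator.mul, "<": operator.lt}

# ((lambda (x) (* x x)) 5)  =>  25
print(evaluate([["lambda", ["x"], ["*", "x", "x"]], 5], global_env))
```

The point of homoiconicity is that in Scheme the list above *is* the program; in Python it's a second, parallel representation you have to maintain by hand.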


I'm studying CS but haven't heard any concrete examples of why I should learn a lisp (Scheme, Clojure). Every time I ask, I get pseudo-intellectual answers about how it will expand my mind, make it easier to express complex human problems/domains in code, or even make me re-evaluate things in life on a philosophical level. What? On the one hand, I'm so curious to study all of this and see for myself, but on the other hand, life's too short and I'm already spending so much time in front of a PC to get my degree, learn a couple more languages, make side projects on GitHub, personal blogs etc. to make myself more employable.

Anyway, I know this is HN I'm asking, but can you give me at least one compelling reason why I should study Lisp, Scheme, or Clojure?


http://minikanren.org

I was trying to come up with examples to demonstrate the value of lisp in my own work, but I think this one captures it better. Imagine being able to use your language to create a language, that still interacts with everything else, and express it so clearly that people can translate that into dozens of other languages.

If you were to write that initially in Java or C or something it would be hard to separate the essential complexity (the problem of solving relational problems) from the accidental complexity (managing the underlying language's semantics and data structures and such) while keeping the entire code base clean and clear.

Lisps do a better job of eliminating or reducing accidental complexity compared to many mainstream languages, especially the enterprise-y ones like Java and C#, which are commonly used in CS education unless you're at a unique school or one of the top schools in the US (without experience, I can't comment on what non-US schools teach).


I've noticed something. No one ever blows me away with something they've done in Python or PHP or C# or JavaScript. Whenever I see someone solve some problem in a way that is just obviously better than I could ever do they are almost always doing it in a Lisp, or APL, or Haskell, or some other language that is out of the mainstream for most programmers.

Those kinds of languages are harder to learn, and for most things you can probably still do just fine in the more mainstream languages without much trouble. And so it is easy to not bother to learn them.

If you are still doing this 40 years from now, though, I bet you'll look back and see that there were enough of those times where those other languages could have really made things easier for you that you will wish you had taken the time.

I think there is an implicit assumption we make about programming: we aren't going to be doing it long term. We'll be code monkeys for a few years, then move on to management, or ownership, or take our IPO money and go pursue some non-programming passion and leave the cranking out code to someone else. We don’t learn the harder but more powerful stuff because we don’t expect to be programming long enough for that to pay off.


Maybe your area of interest is dominated by older programmers solving problems in the languages they're most familiar with, which wouldn't be a recent(ish) one.

I get the vibe from this comment that you think people are doing entirely unimpressive work in Python, PHP, JS, and C#. PHP seems to more or less be WordPress + extremely vulnerable packages if you don't update with great regularity, so I'll reluctantly agree. What about Python - machine learning, neural networks, and a common interface for scientists and more SWE types. C# - Unity games is an obvious example, but dotnet core looks very nice too.

I guess a better question would be, what _do_ you find impressive?


I may have been unclear. I am often greatly impressed by the work people do in those kinds of languages. But what impresses me are the algorithms they used. If I'm told the algorithms and understand them, and then look at an implementation in those languages it is almost certainly going to be in the ballpark of what I expected to see, and similar to how I would have implemented it.

If I'm just told what the problem is, not the algorithms used, and shown the code that solves it, I might not understand what is going on until I figure out the algorithm from the code. But all the statements in the program will make sense, at least as far as what they do; I might not get why they do it until after I've figured out the algorithm, but I'll understand the what.


If you need someone online to justify for you learning new things in your field you have probably already lost. This field is unrelenting, it requires constant learning. I'm nearing 40 and it hasn't stopped. The one advantage is the more you learn the more skills transfer from one thing to the next.

So the most compelling reason I have is because learning new things should be something that you find exciting. What makes Lisp unique is that it is a lot different than other languages out there. If you know one imperative language, learning another may expand your horizons but not in the same way something from a different paradigm would.

If you already know a functional language or two, then it might not be worthwhile. That is very much up to you. But even though I have never used Lisp I enjoyed learning it immensely. Although I'd say I enjoyed Haskell more.

I will say back in the day I watched these lectures and found them incredibly satisfying and thought provoking.


> If you need someone online to justify for you learning new things in your field you have probably already lost.

You can't learn everything though. You absolutely should contemplate whether something is worth learning before embarking on that process -- otherwise you fill your head with useless new things.

Nobody is saying that I need to learn COBOL to stay relevant in the job market today, even if that would be a new language for me.


I have been programming in COBOL since 1988 and still am. I have seen several other languages come and go in these 3+ decades, but COBOL is only growing younger. At my current age, near retirement, I am no longer excited to learn a new language or tool invented every six months; I hope COBOL will continue to feed me for a few more years, if not decades!


I think the point was that many people talk breathlessly about Lisp but are often quite hand-wavey about specifics. When this concern has come up before I've seen them dismissed with comments along the lines of "if you have to ask you won't get it" or "just read SICP".

I've written and read a bunch of lisp and I'm slowly working my way through SICP at a leisurely pace, but I totally understand what the OP means. Some talk about lisp as something magical (it appears on xkcd as a sort of "god's programming language" - https://xkcd.com/224) yet often seem cagey about actually sharing exactly why they think so. A great example of what I mean is this pretty famous essay by pg: http://www.paulgraham.com/avg.html.

In fact he says "Lisp is so great not because of some magic quality visible only to devotees, but because it is simply the most powerful language available" - which seems to dispel the "magic" bit ... but then throws out the tantalising "most powerful language" without (in my opinion) giving a compelling argument. He mentions how lisp macros are more powerful than those in C and does say "Lisp code is made out of Lisp data objects" and a few other bits and pieces about how it was higher-level than other popular languages at the time (the mid-1990s). It certainly made me curious to explore the language, but definitely left me with unanswered questions.

My opinion is that this phenomenon is a result of a number of things, including:

1. some people went through a transformative experience through learning lisp, and are deliberately vague to entice others into exploring it without ruining some of the fun of self-discovery

2. some people had the same, but think that it's enough to declare these things and don't care if you believe them or not

3. some people parrot what they have read some smart folks say and simply can't formulate a good explanation even if they wanted to


SICP is hard but remarkably complete. If you are vaguely familiar with the maths that is chosen as the domain then you can mostly begin the course from scratch.

It starts with some really simple stuff like (+ 2 2)

Then introduces some concepts like recursion along the way. Next you get data structures and recursive data structures before they pull off the greatest recursive magic trick to show that your code is data itself.

Along the way you get a pretty accessible intro to functional programming without having to get bogged down in types, proofs and monads because Scheme is actually imperative.

With my teacher hat on, this is a masterwork for pulling together so many concepts into one (relatively) tiny standalone course.

With my student hat on, it is profound on so many levels. I learn something new each time, even if only to appreciate names like Church and Curry being namedropped as bits of history, because I'm not thinking a million miles an hour to process all the new information. The passion Sussman and Abelson bring to the course is authentic, endearing and unfakeable - you know they really felt it.

In terms of what you do day in, day out, SICP would be my go-to for covering most of the fundamentals of "computational thinking". You learn basic algorithms and space/time complexity, recursion and iteration, functions and objects, data structures, and finally programming language construction (with some teeny peeks at the machine itself). And I was able to learn all this without a computer - using just pen and paper, because Scheme is so simple. This freed me from the distraction of trying to make a computer do something and let me actually use my mind to reason. As someone who did not do a CS degree, there are not many other resources that rank that highly for grounding you in all those subjects to get you ready for the job or put you on the path for further reading.

Homoiconicity itself is such a mind bendingly pleasing thing to encounter the first time round, the only thing that came close was the stack machine - presumably because the list and the stack encapsulate something primitive to computation and language.

I remember the first time the penny dropped when I realized that the cons operator for a list was also the spread operator for arguments in a function signature and thinking whoaaaaa


I’m not having trouble with SICP and do not doubt its efficacy as a teaching tool. Take another look at my comment - I’m saying people are often very vague when describing why they find Lisp so mind-expanding, and this can be a real puzzler for the “uninitiated”.


I suppose it's a bit of 1 or 2 or 3. But there's also the issue that emergent properties which fall out of the overall design are quite hard to explain, and even advanced students of the subject don't fully grok yet.

For example, "code is data" is true in the most trivial sense for machine code, yet having code as structured data makes it much easier to manipulate. Having easy to manipulate code gives you the power to manipulate code at compile-time. Having code run at compile time inside the compiler, which is also an interpreter running the same language that you are compiling is kind of magical and mind-bending.
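Python gestures at the same idea with its `ast` module. Here is a hedged sketch of my own (not from the thread) showing code manipulated as structured data before it runs - clunkier than Lisp's lists, but the same principle:

```python
# Parse an expression into a tree, rewrite the tree, then compile and run it.
import ast

tree = ast.parse("x + 2 * y", mode="eval")

class DoubleConstants(ast.NodeTransformer):
    """Rewrite the code-as-data: double every numeric literal."""
    def visit_Constant(self, node):
        if isinstance(node.value, (int, float)):
            return ast.copy_location(ast.Constant(node.value * 2), node)
        return node

new_tree = ast.fix_missing_locations(DoubleConstants().visit(tree))
code = compile(new_tree, "<ast>", "eval")
print(eval(code, {"x": 1, "y": 10}))   # x + 4*y with x=1, y=10 -> 41
```

In a Lisp, the "tree" is the same list syntax you write by hand, so this kind of transformation is everyday macro code rather than a special library.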

In fact, more recent research into this has gone down the rabbit hole even deeper, and it turns out you can have an entire tower of meta-levels (see for example http://www.phyast.pitt.edu/~micheles/scheme/scheme22.html). This is all continuing on the same basic ideas which were present in early Lisps, just crystallized and refined further and further.

Also, as research into macros has gone on over the years, hygienic macros were discovered, which Rust has adopted as well (and there are other languages which have them as an add-on, like Sweet.js). Deeply understanding how this stuff works gives you a better grip on the issues with macros in older languages (notably C).

And that's just macros (and homoiconicity). There's also lexical versus dynamic scoping. This is a lot less "magical" now than it was 10 years ago, when almost no mainstream language even had closures.


And, oh yeah, grokking continuations makes it a lot clearer how generators, iterators, exceptions and coroutines work. One concept which subsumes all the others. And if you want to get into compiler construction, understanding CPS conversion makes it much easier to do code transformations.
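A toy illustration of CPS (my own example, not course material): instead of returning, each call hands its result to an explicit continuation, which is the "one concept" that generators, exceptions, and coroutines can all be explained in terms of.

```python
# Factorial in continuation-passing style: 'k' is "the rest of the computation".
def fact_cps(n, k):
    if n == 0:
        return k(1)
    # Capture the pending work (multiply by n, then do whatever k was) in a lambda.
    return fact_cps(n - 1, lambda result: k(n * result))

print(fact_cps(5, lambda x: x))   # 120
```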

There's just tons of research which went into Lisps, because it's such a nice vehicle for language experimentation. There's half a century of research going into Lisps. Other languages haven't been around for that long, so I suppose there's something to find in Lisp history for everyone, but it might not be the same thing for everyone!


Are Rust macros at all comparable to Lisp macros? Can you re-write the syntax of Rust with Rust macros? Or are they more like C/C++ macros?


> Are Rust macros at all comparable to Lisp macros?

There are two major kinds of macros in Rust. One of them, macro_rules, is vaguely similar to Lisp macros. It was partially designed and implemented by some big Racket fans, in my understanding.

> Can you re-write the syntax of Rust with Rust macros?

Within limits.

> Or are they more like C/C++ macros?

Neither kind of Rust macros are like C or C++'s macros.


> If you need someone online to justify for you learning new things in your field you have probably already lost.

This is really unhelpful gatekeeping. You're responding to someone who literally just said they already spend excessive time studying the field.


> This is really unhelpful gate keeping.

I don't think it is. If you don't have time to learn interesting stuff because you "have to write blog articles to make yourself more employable" then you will probably lose motivation or interest in CS.


I didn't read through the entirety of SICP, but I read through the first ~3 chapters, and it helped me to understand the way that functions are associated with the environment that they are defined in and how this fact can be used to construct closures. I haven't seen as good of a description of this up until that point.
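The environment model the book builds up is exactly what explains behavior like the following (a minimal sketch of my own, not from SICP itself):

```python
# The inner function keeps a reference to the frame where it was defined,
# so each closure carries its own private 'count'.
def make_counter():
    count = 0                 # lives in make_counter's frame
    def increment():
        nonlocal count        # refers to the enclosing frame, not a global
        count += 1
        return count
    return increment          # the closure carries its environment along

c1, c2 = make_counter(), make_counter()
print(c1(), c1(), c2())       # 1 2 1 -- each closure got a fresh frame
```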


I don't like lisp, I don't code in lisp; not even a line. And I have read TONS of lisp stuff.

Why?

The #1 reason: The lisp people have a lot of cool things to teach.

One of the most obvious examples:

https://www.infoq.com/presentations/Simple-Made-Easy/

So, you can learn enough of Lisp or APL or OCaml or Haskell to tag along with somebody smart in the field (who uses a certain language, maybe for very good reasons, maybe just because they like it) and understand stuff.

Most of the real gems are kind of easy to learn with the most basic understanding of a language.

---

A lot of times, it's THAT kind of person who has the best insight into why a certain language matters.

Continue with the example of Rich Hickey:

https://dl.acm.org/doi/abs/10.1145/3386321

---

You can translate a lot of ideas from one language to another; that's the most basic and simple benefit.

It's just that certain languages fit certain minds/goals/niches better, so that's where to look for the better answers...


> I'm studying CS but haven't heard any concrete examples of why I should learn a lisp (scheme, Clojure). Every time I ask i get pseudo-intellectual answers of how it will expand my mind, make it easier to express complex human problems/domains into code or even make me re-evaluate things in life on a philosophical level. What?

Lisp is very close to the Tao, and the Tao that can be spoken of is not the true Tao. That's why no one can tell you in concrete terms the greatest benefits of Lisp's eldritch power, and they all end up in spirals of circumlocution about "lack of syntax" or "homoiconicity" or somesuch. You have to experience it for yourself before you "get it".

The fact that McCarthy devised (or discovered) a powerful, high-level language with an extensible, morphable syntax that's definable in terms of itself in seven functions (and those functions were themselves readily convertible by hand to machine code) should be a massive clue, though.

> but on the other hand, life's too short and I'm already spending so much time in front of a PC to get my degree, learn a couple of more languages, make side projects on github, personal blogs etc to make myself more employable.

That's part of your problem. You are too busy chasing good-boy points from the job market to really sit down and study this stuff. My suggestion is to focus on your collegiate studies for now. If you have time, study an additional language/CS topic or two, but only stuff you can get really interested or engrossed in. There's a good chance that eventually, Lisp will call to you. It's one of those ideas in CS that are so profoundly good that if you are interested in CS, you will be interested in Lisp, at least the history of Lisp and what it brought to the table.

If you are just looking for practical reasons to learn a language that translate directly to résumé padding and more zeros on your starting salary, then JavaScript is an acceptable Lisp and you may as well stick with something like that.


Lisps have a uniform syntax in S-expressions. This makes it unfamiliar to new users, but it means syntax just melts away, and you're expressing the abstract syntax tree of your program directly.

So writing code to manipulate abstract syntax trees becomes easy, since you're used to writing & thinking about programs in that form, and it's just manipulating lists, which is easy. For example, it becomes easy to write a program to symbolically differentiate an input expression.
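To make that concrete, here is a rough transplant of the SICP-style symbolic differentiator into Python, with tuples standing in for S-expressions (my own hedged sketch; simplification of the result is omitted, just as in the book's first version):

```python
# Differentiate an expression tree: numbers and variable names are leaves,
# ("+" , a, b) and ("*", a, b) are interior nodes.
def deriv(expr, var):
    if isinstance(expr, (int, float)):          # d/dx c = 0
        return 0
    if isinstance(expr, str):                   # d/dx x = 1, d/dx y = 0
        return 1 if expr == var else 0
    op, a, b = expr
    if op == "+":                               # sum rule
        return ("+", deriv(a, var), deriv(b, var))
    if op == "*":                               # product rule
        return ("+", ("*", a, deriv(b, var)), ("*", deriv(a, var), b))
    raise ValueError(f"unknown operator: {op}")

# d/dx (x * x + 3)
print(deriv(("+", ("*", "x", "x"), 3), "x"))
```

In Scheme the input would be a quoted list like `(+ (* x x) 3)` rather than a tuple you build by hand, which is exactly the "syntax melts away" point above.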

You also think about functions as something easy and natural to pass around.

They also encourage pure functions, which allow lazy evaluation, easy memoization, and other fun things.
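For instance, memoizing a pure function is a one-liner precisely because purity guarantees the cache can never go stale (a quick sketch of my own):

```python
# Naive fib is exponential; caching a pure function makes it linear,
# and purity means the cached answers are always safe to reuse.
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    return n if n < 2 else fib(n - 1) + fib(n - 2)

print(fib(80))   # instant; the uncached version would never finish
```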


Don't overthink it. You've clearly come across Lisp and wondered about it enough to read up on people's opinions and to post a comment. That's reason enough.

I'd encourage you just go and try it out. Spend an evening or a Saturday reading a book / working through a tutorial to see for yourself. You cannot lose anything. At worst, you find out you don't like it, but at least you will have some first-hand opinion and an idea of what everyone is talking about.

It's a good approach for all those things you've always been hearing/wondering about but never dipped your toes into.


Lisps capture something fundamental about computation more cleanly than other languages. In Alan Kay's words,

"Yes, that was the big revelation to me when I was in graduate school—when I finally understood that the half page of code on the bottom of page 13 of the Lisp 1.5 manual was Lisp in itself. These were “Maxwell’s Equations of Software!” - lifted from [0]

[0] http://www.righto.com/2008/07/maxwells-equations-of-software...


I think you can spend your time learning much more useful things in Software Engineering than learning Lisp or Scheme.

Examples:

* Learn about type systems: advantages and disadvantages of various modern (or older) type systems: c++, java, typescript, rust, go, etc... Understanding pros and cons of various type systems would help you improve as a software engineer, and make you better at designing proper abstractions and apis in whatever language you are working in.

* Learn about concurrency primitives, asynchronous event loops, parallel programming, etc. In modern software engineering pretty much every type of work involves concurrency, understanding what type of primitives exist in different platforms and languages, and what are their pros and cons would make you much more effective as a software engineer overall.

* Learn about data structures in scope of data storage and data transfer in practice. Get understanding of network i/o, disk i/o, how things get stored in memory, how things get stored persistently. This stuff is way more complex and way more useful in practice than learning about slightly different ways of doing computation or calling functions in a programming language.

* and many more...

I would say in modern software engineering, part of "how to express basic computation" is really the least complex and least interesting of the bunch. You can skin that cat many ways but compared to many other challenges, it is in my opinion at the way bottom in terms of importance.


I think the primary thing which differentiates SICP's approach in particular is the wishful-thinking-driven-development, or "top-down development". You start at the top-level function, and write it completely, calling functions which you have named but haven't written yet. Then implement those, and so on. Yes, this is possible in other dyn langs as well, but SICP really emphasizes this approach.
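A sketch of that style in Python, with hypothetical names of my own - the top-level function is written first, against helpers that don't exist yet:

```python
# Step 1: write the top level by "wishful thinking", calling functions
# that haven't been written yet.
def total_price(cart):
    return sum(line_price(item) for item in cart) + shipping(cart)

# Step 2: only now implement the functions we wished for.
def line_price(item):
    return item["unit_price"] * item["quantity"]

def shipping(cart):
    return 0 if len(cart) >= 3 else 5   # made-up rule for the example

cart = [{"unit_price": 10, "quantity": 2}, {"unit_price": 3, "quantity": 1}]
print(total_price(cart))   # 20 + 3 + 5 = 28
```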

One of the most interesting bits which differentiates lisps in general are macros. This gives you a lot of power over the language. You aren't stuck waiting on the language designers to give you new language-level features -- you can write them yourself!

The first example which comes to mind is asynchronous programming. In Clojure, core.async is a library, not a language-level feature! Stop and really think about that.

Also illustrative is to download the Clojure source and 'grep -r defmacro src/' and look through the results. A lot of what would be keywords in other languages are simply macros in a lisp. That means you could have written them yourself!!! The "threading macro" ('->' and '->>') is the one I like to point to. Any user could have come up with that!

This is like being able to add list comprehensions to the Python language, but as a user!
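Without macros, the closest a Python user gets is chaining at runtime. Here is a hypothetical `thread_last` helper of my own to show the shape of `->>`; note that a real macro rewrites the source itself, whereas this can only compose callables:

```python
# Runtime approximation of Clojure's ->>: feed a value through a pipeline.
from functools import reduce

def thread_last(value, *fns):
    return reduce(lambda acc, fn: fn(acc), fns, value)

result = thread_last(
    range(10),
    lambda xs: (x * x for x in xs),          # roughly (map #(* % %) ...)
    lambda xs: (x for x in xs if x % 2 == 0),# roughly (filter even? ...)
    sum,                                     # roughly (reduce + ...)
)
print(result)   # 0 + 4 + 16 + 36 + 64 = 120
```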

(disclaimer: I haven't written Clojure in anger, yet!)


> could have written yourself

This is a divide between programmers. Some are thrilled by it. Others, like me, see the power and flexibility and don't disagree it can be good. But their day-to-day problems are not a lack of the most expressive, efficient language. It's communication, process, deadlines, change, domain knowledge, etc.

I didn't and don't want to invest my limited mental power in learning and keeping up with everything about compiler design, language design, and CS theory. I want to use a language that people much smarter and more experienced with compiler and language design have made for me. I want to build a program using the domain-specific knowledge I've invested in, not build programs to write other programs.


Since it sounds like your objective in CS is primarily career oriented (i.e. 'to make myself more employable'), Lisp probably won't offer you something you or the majority of your future employers are likely to value. You will occasionally come across an employer who will be looking for exposure to less mainstream languages/concepts where their absence will be a negative, but that's going to be a minority in the job market if that's your focus.


First of all, I want to mention that SICP is so much more than learning lisp. As a matter of fact, learning lisp only takes 10 minutes in the first lecture. The rest of the course is about learning programming techniques (and _magic_ as they call it) with the explicit intent of learning to manage complexity. Even if you're completely fluent in lisp, taking this class will be massively mind-opening.

With this out of the way, what's so special about lisp? Where did it get this reputation for mind-expansion? Part of it is historical. 20-30 years ago, the industry was focused on _stricter_ (for lack of a better term) languages like C++ or Java, and lisp offered some features that were super powerful, like a REPL, interactive debuggers, higher-order functions, lexical scoping, lambdas, etc. Lots of mainstream languages have since adopted these features (python, perl, ruby, javascript, ...), making lisp a bit less exotic than it used to be.

Still, today, there are some things unique to lisp that might be worth studying. The most important is macros. Macros are only possible because of the (lack of) syntax of lisp. It's a programming technique that consists of _programming your compiler_, for lack of a better term. It allows you to define a language specific to the domain you're working in. It's a pretty unique technique, and like all other techniques, sometimes smart, sometimes useless, sometimes abused. I still believe it's worth learning, because understanding how macros work will explain a lot about the other tools you're using as a programmer.

I hope this makes things a bit clearer.

TLDR: A lot of what made lisp special is now available in other languages. Learning lisp is still worth it to expose you to _macros_. Regardless of how well you know lisp, going through SICP will be massively beneficial.


You do realize that LISP is older than both C++ and Java (by at least a few decades) :)


You'll learn how to solve problems using recursion and by passing functions around as variables (because you'll have to). Which is useful, because sometimes those are by far the most elegant ways of doing things. That's the main practical pay-off I've found. Before going through parts of SICP I found Python code that used lambdas and recursion hard to understand; now I often end up writing such code myself.
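As a toy illustration (not from the book) of the style the comment above describes: recursion over nested structure, plus treating functions as ordinary values you can pass around and combine.

```python
def deep_sum(x):
    # Recursion on nested structure: a list of numbers and/or sublists.
    if isinstance(x, list):
        return sum(deep_sum(item) for item in x)
    return x

# Passing functions as values: build new behavior by combining functions.
def compose(f, g):
    return lambda x: f(g(x))

inc_then_double = compose(lambda x: 2 * x, lambda x: x + 1)

print(deep_sum([1, [2, 3], [[4], 5]]))  # 15
print(inc_then_double(10))              # 22
```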


> [it will] make it easier to express complex human problems/domains into code

What is "pseudo-intellectual" about that answer?

> i'm sooo curious to study all of this and see for myself

Good thing SICP is available online, and there are even recorded classes of it with two legendary professors available online somewhere.

> life's too short and I'm already spending so much time in front of a PC

Yet, here we are.


I don't think you really need to "learn a lisp" but I recommend that curious developers read SICP at some point.

John McCarthy famously wrote a lisp interpreter in a half page of lisp in the lisp 1.5 manual. The simplicity of the interpreter makes it possible for SICP to really focus in on the essence of what programming languages are and how features can be incrementally added. Imagine trying to learn programming by building a java compiler. Lisp is a much simpler language for learning how languages work.

The lisp interpreter is so simple because the code of lisp is written in the data structures of the language: a bunch of nested lists. So the parsed syntax tree is very similar to the code. This means that meta-programming on it is much easier to reason about. You can write macros in other languages to change the functioning of the language itself, but it's much more complicated for complex syntax. This makes lisps languages for creating your own languages. Some people probably love lisp because they're inspired by it, but people who consider its power indispensable (pg) probably customize it a lot with macros.
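To make the code-is-data point concrete, here is a deliberately tiny evaluator in Python, a sketch in the spirit of McCarthy's half page rather than a reproduction of it. Programs are plain nested lists, so there is no parsing step and evaluation is one short recursive function:

```python
import operator

# A toy Lisp evaluator: programs ARE nested Python lists, so there is no
# parser, and eval is one recursive function over those lists.
GLOBAL_ENV = {"+": operator.add, "-": operator.sub, "*": operator.mul,
              "<": operator.lt, "=": operator.eq}

def evaluate(expr, env=GLOBAL_ENV):
    if isinstance(expr, str):                   # symbol: variable lookup
        return env[expr]
    if not isinstance(expr, list):              # number: self-evaluating
        return expr
    op, *args = expr
    if op == "if":                              # (if test then else)
        test, then, alt = args
        return evaluate(then if evaluate(test, env) else alt, env)
    if op == "lambda":                          # (lambda (params) body)
        params, body = args
        return lambda *vals: evaluate(body, {**env, **dict(zip(params, vals))})
    fn = evaluate(op, env)                      # function application
    return fn(*[evaluate(a, env) for a in args])

# ((lambda (x) (* x x)) (+ 3 4))  evaluates to 49
program = [["lambda", ["x"], ["*", "x", "x"]], ["+", 3, 4]]
print(evaluate(program))  # 49
```

Because the program is just a list, a "macro" here is nothing more than ordinary list manipulation performed before calling `evaluate`, which is why meta-programming on this representation is easy to reason about.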

Someday you may find yourself implementing a lisp unintentionally (https://en.wikipedia.org/wiki/Greenspun%27s_tenth_rule). IMO this is because it's natural to start with some text-based data structure format (JSON?) for some config file or something and then add features incrementally. Suddenly you're making a lisp. Understanding lisp can help you realize you're building a bad one. This has happened to me before.

Lisp is also very interesting if you want to write code that writes itself. For instance, if you were toying with genetic algorithms the fact that every piece of lisp code within matched parens is itself a valid lisp program is a nice feature to have.

In the end though, I don't think actually using lisp is as important as understanding lisp. I can see the insane power of meta-programming for creating higher abstractions for myself or a small team of like minded individuals. But whether it's lisp macros, scala implicits, annotation based byte code rewriting in java or method_missing in ruby, I think that meta-programming becomes a form of obfuscation for really large teams. If you take away meta-programming from lisp you still have some really cool languages. But I think many of those cool things about lisp can be achieved in more mainstream languages if you understand what you're trying to achieve.

After all, Brendan Eich was inspired by scheme when he created Javascript in ten days. But he was told to make the syntax "like Java". That's a bit ironic given that Guy Steele (one of the creators of Scheme) helped write the original Java specification. Neither language is very scheme-like in the end. But both authors contend that the languages would have been a lot worse if they hadn't been around to put a bit of lispiness into them.

So the last reason to learn lisp is that if you ever get asked to make a programming language for a large company with lots of stupid restrictions in a very short time frame you might be able to inject enough lisp into it that people will still use it under duress 25-30 years later.


> That's a bit ironic given that Guy Steele (one of the creators of Scheme) helped write the original Java specification.

Note that "write the specification" in this context meant creating a text to describe a design that had been done by other people before he joined. Guy made it quite clear at OOPSLA 2003 (which had become essentially a Java conference) that the language would have been different if he had been able to have any input.


> Anyway, i know this is HN i'm asking, but can you give me at least one compelling reason why I should study lisp,scheme or clojure ?

In Common Lisp:

    * (eql 'i 'I)
    T
In Clojure:

    user=> (= '(lisp,scheme or clojure ?) '(lisp, scheme, or clojure ?))
    true
Downvotes accepted for the grammar snark, I only mean it as a light ribbing (and casual warning that programmers looking at resumes are often, probably unfairly, going to penalize such mistakes more harshly than they reward complex side projects).

More seriously: do you like programming? And furthermore, are you interested in studying programming itself, not (just) programming as a means to other things? Are you perhaps interested in studying CS itself, not just exam material of whatever collection of courses grants you a CS degree? If no, then you probably won't be compelled by any one reason to study anything beyond what might get you a well-paying programming job (perhaps as a gateway to a management job in tech), and lisp is not such a necessity. Nothing wrong with that, but inspect your motives carefully and plan accordingly. If yes to any of those, studying new languages will be rewarding, and lisps have a lot to offer. For instance, if you studied Common Lisp, you would have an environment suitable for practical programming of applications in any domain. You would not want for language features either, because Lisp gained them anywhere from recently to 50+ years ago (realize it took until Java 8 for Java to get anonymous functions, and even then you're better served by SICP to learn tasteful application of programming with them, just as you're better served by Clojure to learn tasteful application of immutable data structures), or you can add them yourself without awaiting a new version. You would have a language good for researching CS topics such as new languages (and compilers), data structures, garbage collectors, type systems, theorem provers, AI, etc. And you might save future-you some hours or anxiety when it comes to "unrelenting, constant learning", because you can frame this week's fad in concepts you already know from Lisp.

Maybe the most compelling reason to study though is that of there not being a better time. If you are so curious, just go study already! And if several hours of study makes you lose interest, or decide it's all fluff evangelized by lisp zealots not worth pursuing further, then go ahead and study something else that interests you. But the time is now. Post-college, you'll likely have less opportunity, less desire, and less time to satisfy your curiosities, so indulge while you can.

And since this is HN, a recent pg twitter thread to contemplate: https://twitter.com/paulg/status/1274632047303315456


Having had formal education at a Java college, where you had to have at least one Class definition to run your code, discovering this course was a revelation.

To learn from the people that invented their own programming language AND chip to go with it. For free, as long as you had internet access, is something I'm eternally thankful for.

This video series is a national treasure. It changed my life, and so many others, without a doubt, for the uncountable better.

(THANK YOU)


Here's some info (and links to pictures) I posted earlier about Lynn Conway's groundbreaking 1978 MIT VLSI System Design Course, in which Guy Steele designed his Lisp Microprocessor:

https://news.ycombinator.com/item?id=8860722

I believe this is about the Lisp Microprocessor that Guy Steele created in Lynn Conway's groundbreaking 1978 MIT VLSI System Design Course:

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/MIT78.html

My friend David Levitt is crouching down in this class photo so his big 1978 hair doesn't block Guy Steele's face:

The class photo is in two parts, left and right:

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Class2s.jp...

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Class3s.jp...

Here are hires images of the two halves of the chip the class made:

http://ai.eecs.umich.edu/people/conway/VLSI/InstGuide/MIT78c...

http://ai.eecs.umich.edu/people/conway/VLSI/InstGuide/MIT78c...

The Great Quux's Lisp Microprocessor is the big one on the left of the second image, and you can see his name "(C) 1978 GUY L STEELE JR" if you zoom in. David's project is in the lower right corner of the first image, and you can see his name "LEVITT" if you zoom way in.

Here is a photo of a chalkboard with status of the various projects:

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Status%20E...

The final sanity check before maskmaking: A wall-sized overall check plot made at Xerox PARC from Arpanet-transmitted design files, showing the student design projects merged into multiproject chip set.

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Checkplot%...

One of the wafers just off the HP fab line containing the MIT'78 VLSI design projects: Wafers were then diced into chips, and the chips packaged and wire bonded to specific projects, which were then tested back at M.I.T.

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/Wafer%20s....

Design of a LISP-based microprocessor

http://dl.acm.org/citation.cfm?id=359031

ftp://publications.ai.mit.edu/ai-publications/pdf/AIM-514.pdf

Page 22 has a map of the processor layout:

http://i.imgur.com/zwaJMQC.jpg

We present a design for a class of computers whose “instruction sets” are based on LISP. LISP, like traditional stored-program machine languages and unlike most high-level languages, conceptually stores programs and data in the same way and explicitly allows programs to be manipulated as data, and so is a suitable basis for a stored-program computer architecture. LISP differs from traditional machine languages in that the program/data storage is conceptually an unordered set of linked record structures of various sizes, rather than an ordered, indexable vector of integers or bit fields of fixed size. An instruction set can be designed for programs expressed as trees of record structures. A processor can interpret these program trees in a recursive fashion and provide automatic storage management for the record structures. We discuss a small-scale prototype VLSI microprocessor which has been designed and fabricated, containing a sufficiently complete instruction interpreter to execute small programs and a rudimentary storage allocator.

Here's a map of the projects on that chip, and a list of the people who made them and what they did:

http://ai.eecs.umich.edu/people/conway/VLSI/MIT78/MPC78map.g...

1. Sandra Azoury, N. Lynn Bowen Jorge Rubenstein: Charge flow transistors (moisture sensors) integrated into digital subsystem for testing.

2. Andy Boughton, J. Dean Brock, Randy Bryant, Clement Leung: Serial data manipulator subsystem for searching and sorting data base operations.

3. Jim Cherry: Graphics memory subsystem for mirroring/rotating image data.

4. Mike Coln: Switched capacitor, serial quantizing D/A converter.

5. Steve Frank: Writeable PLA project, based on the 3-transistor ram cell.

6. Jim Frankel: Data path portion of a bit-slice microprocessor.

7. Nelson Goldikener, Scott Westbrook: Electrical test patterns for chip set.

8. Tak Hiratsuka: Subsystem for data base operations.

9. Siu Ho Lam: Autocorrelator subsystem.

10. Dave Levitt: Synchronously timed FIFO.

11. Craig Olson: Bus interface for 7-segment display data.

12. Dave Otten: Bus interfaceable real time clock/calendar.

13. Ernesto Perea: 4-Bit slice microprogram sequencer.

14. Gerald Roylance: LRU virtual memory paging subsystem.

15. Dave Shaver: Multi-function smart memory.

16. Alan Snyder: Associative memory.

17. Guy Steele: LISP microprocessor (LISP expression evaluator and associated memory manager; operates directly on LISP expressions stored in memory).

18. Richard Stern: Finite impulse response digital filter.

19. Runchan Yang: Armstrong type bubble sorting memory.

The following projects were completed but not quite in time for inclusion in the project set:

20. Sandra Azoury, N. Lynn Bowen, Jorge Rubenstein: In addition to project 1 above, this team completed a CRT controller project.

21. Martin Fraeman: Programmable interval clock.

22. Bob Baldwin: LCS net nametable project.

23. Moshe Bain: Programmable word generator.

24. Rae McLellan: Chaos net address matcher.

25. Robert Reynolds: Digital Subsystem to be used with project 4.

Also, Jim Clark (SGI, Netscape) was one of Lynn Conway's students, and she taught him how to make his first prototype "Geometry Engine"!

http://ai.eecs.umich.edu/people/conway/VLSI/MPCAdv/MPCAdv.ht...

Just 29 days after the design deadline time at the end of the courses, packaged custom wire-bonded chips were shipped back to all the MPC79 designers. Many of these worked as planned, and the overall activity was a great success. I'll now project photos of several interesting MPC79 projects. First is one of the multiproject chips produced by students and faculty researchers at Stanford University (Fig. 5). Among these is the first prototype of the "Geometry Engine", a high performance computer graphics image-generation system, designed by Jim Clark. That project has since evolved into a very interesting architectural exploration and development project.[9]

Figure 5. Photo of MPC79 Die-Type BK (containing projects from Stanford University):

http://ai.eecs.umich.edu/people/conway/VLSI/MPCAdv/SU-BK1.jp...

[...]

The text itself passed through drafts, became a manuscript, went on to become a published text. Design environments evolved from primitive CIF editors and CIF plotting software on to include all sorts of advanced symbolic layout generators and analysis aids. Some new architectural paradigms have begun to similarly evolve. An example is the series of designs produced by the OM project here at Caltech. At MIT there has been the work on evolving the LISP microprocessors [3,10]. At Stanford, Jim Clark's prototype geometry engine, done as a project for MPC79, has gone on to become the basis of a very powerful graphics processing system architecture [9], involving a later iteration of his prototype plus new work by Marc Hannah on an image memory processor [20].

[...]

For example, the early circuit extractor work done by Clark Baker [16] at MIT became very widely known because Clark made access to the program available to a number of people in the network community. From Clark's viewpoint, this further tested the program and validated the concepts involved. But Clark's use of the network made many, many people aware of what the concept was about. The extractor proved so useful that knowledge about it propagated very rapidly through the community. (Another factor may have been the clever and often bizarre error-messages that Clark's program generated when it found an error in a user's design!)

9. J. Clark, "A VLSI Geometry Processor for Graphics", Computer, Vol. 13, No. 7, July, 1980.

[...]

The above is all from Lynn Conway's fascinating web site, which includes her great book "VLSI Reminiscence" available for free:

http://ai.eecs.umich.edu/people/conway/

These photos look very beautiful to me, and it's interesting to scroll around the hires image of the Quux's Lisp Microprocessor while looking at the map from page 22 that I linked to above. There really isn't that much to it; even though it's the biggest one, it isn't all that complicated, so I'd say that "SIMPLE" graffiti is not totally inappropriate. (It's microcoded, and you can actually see the rough but semi-regular "texture" of the code!)

This paper has lots more beautiful Vintage VLSI Porn, if you're into that kind of stuff like I am:

http://ai.eecs.umich.edu/people/conway/VLSI/MPC79/Photos/PDF...

A full color hires image of the chip including James Clark's Geometry Engine is on page 23, model "MPC79BK", upside down in the upper right corner, "Geometry Engine (C) 1979 James Clark", with a close-up "centerfold spread" on page 27.

Is the "document chip" on page 20, model "MPC79AH", a hardware implementation of Literate Programming?

If somebody catches you looking at page 27, you can quickly flip to page 20, and tell them that you only look at Vintage VLSI Porn Magazines for the articles!

There is quite literally a Playboy Bunny logo on page 21, model "MPC79B1", so who knows what else you might find in there by zooming in and scrolling around stuff like the "infamous buffalo chip"?

http://ai.eecs.umich.edu/people/conway/VLSI/VLSIarchive.html

http://ai.eecs.umich.edu/people/conway/VLSI/VLSI.archive.spr...


It's worth noting that the Lisp Microprocessor was something quite different from Lisp Machines. Lisp Machines like the CADR and the Symbolics 3600 had conventional CPUs with enhancements like support for parallel tagbit and bounds checking and hardware acceleration of GC to make Lisp programs run faster. For the Lisp Microprocessor, a representation of Lisp was the instruction set.


And Lisp Machines were wire wrapped! There was a robotic wire wrapper at the AI Lab on the 9th floor of 545 Tech Square.

Lynn Conway's VLSI design course was the first time students designed and fabricated their own integrated circuits!


This was a great watch. I took this class as a freshman, and watching this video now I realize how much of the material was wasted on 17-year-old me. :-)

It's a testament to the quality of the teaching that it was eye-opening and inspiring back then at 17, and even more so now when I am 40.


A priceless masterpiece (we used the book in our algorithms and data structures 1 course in 1993).


Had a professor show us the first few minutes of the first lecture here. Saying "Computer Science" wasn't a science and that it was really about computers was mind blowing for a freshman CS major.


> and that it was really about computers

I think Abelson said in the first SICP lecture that "it was NOT really about computers"


I've seen the videos and went through -most- of the book exercises. This is a beautiful way to teach and think about programming and it continues to fascinate me.


Anyone know of similar lectures for the SICM (Structure and Interpretation of Classical Mechanics) book that Sussman also wrote?

I'm also interested in creating a study group to go through the book in Clojure, if possible.


I've watched this before. It would have been so great to be in some of these classes and a Course 6 student there!


I was looking for some complete open source Lisp programs to review. I wanted to experience the “magic” of Lisp, and bathe in its suffusion of blue [1], while becoming enlightened in the language of the gods.

I found the Hemlock [2] source code, which is an Emacs like clone. I don’t know how good it is, maybe someone else can comment on it. But the source code appeared to be rather clean and well organized.

This guy’s comment [3] on Lisp got me interested in it again: “Lisp is executable XML with a friendlier syntax.“.

And it appears to be like a key-value pair, where the data and code can interplay, and rewrite itself, and execute itself. Sorta, but if this can be done, then this would make for an interesting self-generating AI system. Of course, there are still other ways to skin this cat, like using a database.

I still need to do some more work to reach that enlightened state of Lisp Zen.

Perhaps someone else can share their enlightened states that they’ve reached with Lisp?

[1] https://xkcd.com/224/

[2] https://github.com/bluelisp/hemlock

[3] http://www.defmacro.org/ramblings/lisp.html


  Then, I reached a moment of Lisp epiphany.

  Where my VR coding system,
  Would build computational blocks of pure logic.
  A graphical metaphor, for the engineer to define, 
  Functions, variables, and conditional branching to the nines.

  Their results are returned, and fed into something new,
  Each rigorous, discrete, and unit tested too.
  Like a jigsaw linked together by logical flows and light, 
  Further enclosed, they add to its computational might. 

  And Lisp would be the ideal, 
  To achieve such a metaphorical fusion of thought.
  From conception to code,
  Between the virtual and the real.

  The key-value pairings, 
  Between the operator and its operands,
  All enclosed by parentheses,
  Bringing closure, and harmony to its lands. 

  Less is required, the tedious keyboards of our day, 
  Instead, on the VR you would tinker and play, 
  Like a meticulous craftsman of another age,
  Building until perfect, each module of the sage.

  The world’s combined intellectual prowess,
  Would be available at your needs, 
  Collected in a public digital library,
  The modules and code, for you to integrate into your feed.

  And then it would run, the great engine of light,
  Bringing forth its magic, and manifesting in twilight. 
  Running on pure thought, and reliable to the nines,
  The Engineer would sit back, and marvel at his find.

  And then I awoke from my dream,
  Suffusion complete.


this course is so fun to watch and listen to.


N.b. should read "Structure and Interpretation".


Fixed now. Thanks!



