
> I want to be clear about what Bel is and isn't. Although it has a lot more features than McCarthy's 1960 Lisp, it's still only the product of the formal phase. This is not a language you can use to program computers, just as the Lisp in the 1960 paper wasn't. Mainly because, like McCarthy's Lisp, it is not at all concerned with efficiency. When I define append in Bel, I'm saying what append means, not trying to provide an efficient implementation of it.

That doesn't seem to be their goal.



OK, that's a fair point. But punning lists as functions is still a mistake for other reasons. For example, it makes it impossible to tell the difference between a list that is actually intended to be a function and one that merely happens to look like one. It's fine to represent functions as lists, but having functions actually be lists rather than a first-class primitive is a huge mistake. If you're going to do that, you might as well just start with McCarthy's original primitives, or heck, even just the untyped lambda calculus. What is the point of adding characters and streams?


I think one reason to add characters and streams is so you can implement read and print, the R and the P in a Lisp REPL. The original Lisp assumes s-expressions are given, but provides no mechanism for building them from characters typed at a keyboard, or for printing them to a screen. Bel takes the same axiomatic approach as the original Lisp, but applies it to a more realistic system.
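
To make "implement read out of characters" concrete, here is a rough sketch -- ordinary Scheme rather than Bel, with made-up helper names (tokenize, parse) and only symbols and parentheses handled:

    ;; Build an s-expression from characters: just enough to show why
    ;; characters have to exist before read can be defined.
    ;; (Assumes a full Scheme: char-whitespace?, string->symbol, etc.)
    (define (tokenize chars)
      (cond ((null? chars) '())
            ((char-whitespace? (car chars)) (tokenize (cdr chars)))
            ((memv (car chars) '(#\( #\))) (cons (car chars) (tokenize (cdr chars))))
            (else
             (let loop ((cs chars) (acc '()))
               (if (or (null? cs)
                       (char-whitespace? (car cs))
                       (memv (car cs) '(#\( #\))))
                   (cons (string->symbol (list->string (reverse acc))) (tokenize cs))
                   (loop (cdr cs) (cons (car cs) acc)))))))

    ;; Parse one expression; returns (expr . remaining-tokens).
    (define (parse tokens)
      (if (eqv? (car tokens) #\()
          (let loop ((ts (cdr tokens)) (acc '()))
            (if (eqv? (car ts) #\))
                (cons (reverse acc) (cdr ts))
                (let ((sub (parse ts)))
                  (loop (cdr sub) (cons (car sub) acc)))))
          (cons (car tokens) (cdr tokens))))

    (car (parse (tokenize (string->list "(cons (quote a) nil)"))))
    ;; => (cons (quote a) nil)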

Similarly, ccc exists so onerr can exist, which exists so saferead can exist, which exists so the REPL can show errors on screen.

That hangs together for me. But I still don't understand the distinction you're making between representing as lists vs being lists.


> so the REPL can show errors on screen

And how do you formalize the screen? (BTW, this is just speculation on your part. The word "screen" doesn't actually appear anywhere in the article.)

You can't have it both ways. Either you are going to formalize everything, or you're going to punt at some point and yield to practical considerations. And formalizing I/O is particularly hard.

> I still don't understand the distinction you're making between representing as lists vs being lists.

(lambda () t) is a list. In McCarthy's original lisp, as in Bel, this list IS a function. This would work:

    (setq l '(lambda () t))
    (l) --> T
But in modern Lisps you have to EVALUATE this list in order to get a function that you can call. In Scheme:

    ts> (define l '(lambda () #t))                  
    l
    ts> l
    (lambda () #t)
    ts> (l)
    Error: illegal function 
    ts> (define p (eval l))
    p
    ts> p
    #<CLOSURE>
    ts> (p)
    #t
    ts> (procedure? l)
    #f
    ts> (procedure? p)
    #t
    ts> (list? l)
    #t
    ts> (list? p)
    #f


> (lambda () t) is a list.

This is not right, and it's a very basic thing to get wrong. I don't think you understand what you're criticizing.

Even replacing the Bel name, this is not right:

> (fn () t) is a list

You _do_ have to eval it.

    bel> (eval '(fn () t))
    (lit clo nil () t)
I see the returned list as akin to a serialization format. It's an s-expression rather than some opaque representation that can't be printed. That seems nice, and more debuggable. I don't see what makes it harder to compile than an opaque representation: you can compile it just fine; you're just carrying the source code around in, say, a debug segment of the image.
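
Roughly what I have in mind, as a Scheme sketch rather than anything actually in Bel: compile-clo and the record type are invented, it leans on the host's eval, and it only covers the nil-environment shape shown above.

    ;; Hedged sketch: compile a (lit clo nil params body) list into a host
    ;; procedure while keeping the list itself around as the "debug segment".
    ;; Assumes an R7RS Scheme with (scheme eval) and (scheme repl) available.
    (define-record-type <compiled>
      (make-compiled proc source)
      compiled?
      (proc   compiled-proc)
      (source compiled-source))

    (define (compile-clo lst)        ; lst = (lit clo nil params body)
      (make-compiled
       (eval `(lambda ,(list-ref lst 3) ,(list-ref lst 4))
             (interaction-environment))
       lst))

    (define f (compile-clo '(lit clo nil (x) (+ x 1))))
    ((compiled-proc f) 41)     ; => 42, runs as a native closure
    (compiled-source f)        ; => (lit clo nil (x) (+ x 1)), still printable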

> (BTW, this is just speculation on your part. The word "screen" doesn't actually appear anywhere in the article.)

Look, you asked what the point of streams was. I pointed out what the metacircular source code does with streams.

If someone asks what the purpose of stdout is in Unix, it seems reasonable (and a hoary tradition) to justify it in terms of printing to screen. stdout is about much more than a screen, but "screen" is an easy answer to "why does my program need stdout?"


> I don't think you understand what you're criticizing.

Oh, I'm pretty sure I do.

    Functions are represented using lists. For example, here is a function that takes one argument, and returns that plus 1.
    
    (lit clo nil (x) (+ x 1))
    
    The first element, lit, says that this is a literal object, not to be evaluated.


What you want in this kind of situation is for those symbols to be internal objects that cannot be written. Or perhaps symbols in a private system package, e.g.:

  (sys:lit sys:clo nil (x) (+ x 1))
No program using symbols in the ordinary namespace will produce this object by accident. It has to use the sys: stuff and when it does that, the language specification can say that it's sticking the proverbial fork into the proverbial toaster.
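
Scheme has no packages, but you can sketch the same effect with an unforgeable tag object (all the names below are invented):

  ;; An eq?-unique tag instead of a package-private symbol: reading the
  ;; characters "lit" can never produce this object.
  (define sys-lit (list 'lit))

  (define (make-fn env params body)
    (list sys-lit 'clo env params body))

  (define (fn? x)
    (and (pair? x) (eq? (car x) sys-lit)))

  (fn? (make-fn '() '(x) '(+ x 1)))      ; => #t
  (fn? '(lit clo nil (x) (+ x 1)))       ; => #f -- not forgeable by the reader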


That doesn't help at all. These symbols don't have to be produced by the reader. All you need to do to get access to these symbols is to call FN once. That gives you a list whose first element is LIT and second element is CLO. So all you need to "stick a fork" in the Bel toaster is to be able to call FN and XAR.


Yes. In a nutshell, the problem is that (consp x) being true does not rule out (functionp x), which it should. (Unless consp is hacked in an unspeakable way.)

In a real implementation, you'd want a new kind of cons cell with a separate type tag so that it fails consp. It's okay if car and cdr like it.
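
Something like this, sketched in Scheme with invented names; a real implementation would presumably use a distinct heap tag rather than a record type:

    ;; A separate cell type that car/cdr-style accessors accept,
    ;; but that the pair predicate rejects.
    (define-record-type <fcell>            ; "function cell"
      (make-fcell head tail)
      fcell?
      (head fcell-head)
      (tail fcell-tail))

    (define (car* x) (if (fcell? x) (fcell-head x) (car x)))
    (define (cdr* x) (if (fcell? x) (fcell-tail x) (cdr x)))
    (define (consp* x) (pair? x))          ; deliberately #f for fcells

    (define f (make-fcell 'lit '(clo nil (x) (+ x 1))))
    (car* f)     ; => lit -- car "likes it"
    (consp* f)   ; => #f  -- so (consp x) no longer implies "plain list"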


> (consp x) being true does not rule out (functionp x), which it should

Exactly. That's what "punning lists" means.

> It's okay if car and cdr like it.

That's arguable. But the real problem is if XAR and XDR accept it. Functions, however they are implemented, should not be mutable. That way lies madness.


I don't know where the disconnect is, but I'll stop here. The statement you just quoted disagrees with what you said on two occasions before:

> It's fine to represent functions as lists, but having functions actually be lists rather than a first-class primitive is a huge mistake.

> (lambda () t) is a list

You said this would work in Bel:

    (setq l '(lambda () t))
    (l) --> T
This is false even after fixing the Bel names for things.


The disconnect is that PG misuses the word "represented". Functions in Bel are not just represented as lists, they are lists. Specifically, they are lists whose first element is the symbol LIT and whose second element is the symbol CLO. Functions can also be represented as lists that start with the symbol FN.

    You would not ordinarily express a function using its literal representation. Usually you'd say
    
    (fn (x) (+ x 1))
    
    which yields the function above.  [i.e. (lit clo nil (x) (+ x 1))]
So a list that starts with FN represents a function, but it is not a function. It has to be evaluated to produce a function, but the function that it produces is just another list, one which starts with LIT rather than FN.


In Bel you have to eval (fn (x) (+ x 1)) before you can call it. Do you agree? Following your terms in https://news.ycombinator.com/item?id=40409467, in this respect Bel feels like a modern lisp. In other words, this quoted passage of yours is not correct:

    (lambda () t) is a list. In McCarthy's original lisp, as in Bel, this list IS a function. This would work:

    (setq l '(lambda () t))
    (l) --> T
This would not work in Bel.


Yes, that's right, but you have missed the point. In McCarthy's Lisp, functions were lists whose first element was the symbol LAMBDA. In Bel, functions are lists whose first element is LIT and whose second element is CLO. But the details of what is in those lists are irrelevant. The point is, in both languages, functions are lists. You can call CAR on a function and get a result. In McCarthy's Lisp, that result will be LAMBDA and in Bel it will be LIT. If you call CAR on a function in a modern Lisp you will get an error.

    ? (car (lambda () t))
    > Error: The value #<Anonymous Function #x302000E7B61F> is not of the expected type LIST.
Contrast with:

    ? (car '(lambda () t))
    LAMBDA
In addition, there is a special form in Bel called FN which, when evaluated, produces a function, i.e. a list whose first element is LIT and whose second element is CLO. McCarthy's original Lisp had no such operator. The special handling of functions, i.e. lists starting with LAMBDA, was hard-coded into the definition of EVAL. But that is also an irrelevant detail.


Ok, I think we're past the misunderstanding now.

I don't understand why this makes it hard to compile Bel. Application code won't check the first two elements of the list to determine if a term is a function. Instead it will say ((isa 'function) ___). It seems fine that a program using the former approach will run slower.


> I don't understand why this makes it hard to compile Bel.

Because Bel is supposed to be a specification, not an implementation. And if you specify that functions are lists then you are no longer free to implement them in a way that does not preserve the illusion that they are lists. Worse, in Bel, lists are mutable. That makes reasoning about programs nearly impossible. Every time you call CONS (which in Bel is spelled JOIN for some reason) you may or may not be producing a function. Every time you mutate a pair you may or may not be creating a function, or destroying one, or changing a function's environment.

There is not and cannot be any barrier between the mutable and the immutable in Bel. It is not possible to add a data type that guarantees invariants. Mutable pairs are all you have to build with, and so whatever you build, no matter how carefully you design it, the rug can get pulled out from under you at any time by an errant call to XAR or XDR.
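
To make the rug-pull concrete, here is the shape of the problem simulated in ordinary Scheme, with set-car! standing in for XAR (this is not Bel, just the same data as mutable pairs):

    ;; A Bel-style closure is just a mutable list...
    (define f (list 'lit 'clo 'nil '(x) '(+ x 1)))

    ;; ...so any code holding a reference can rewrite its body in place.
    (set-car! (cddddr f) '(* x 0))

    f   ; => (lit clo nil (x) (* x 0)) -- the "function" now computes
        ;    something else, and no invariant could have prevented it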


Generalizing from the representation of functions to the absence of invariants makes your point crystal clear, thanks.


Why would you need read and print in a language that isn't yet intended to program computers?

You don't need I/O in order to describe input and output in an abstract language. Literals, including quoted lists, are enough to encode input into the program as part of its body. The result of a function can be described using a notation like (+ 2 2) -> 4. It's not actually running so nothing needs to print that.


> But punning lists as functions is a still mistake for other reasons

Formally speaking, you would struggle to do it any other way. If you introduce a type for functions, you need some way to get to it from lists in order for macros to work. Then that totally muddies the definition of eval. You'd basically have to replace it with some "compile" function that turns an expression into a set of lambdas and then executes them. One starts to question the need for lists at all at that point, but when you start representing data as lambdas you lose the well-foundedness of lists. Suddenly lambda isn't something you can write down but just an output from "compile". If you wanted to write lambdas down in terms of lists, those lists would need to be written in terms of lambdas, and you end up with an infinitely recursive problem. So lists are never really written down, just constructed by "read" and fed to "compile". What you have at that point is no longer lisp, but just a compiler to lambda calculus.

I think the lesson here is really that lisp is a convenient syntax but a lousy model for computation. In that regard, saying "I'll do a purely formal lisp" is a mistake. The formal component of lisp is a waste of time and only its practical aspects are useful.


> Formally speaking, you would struggle to do it any other way.

Um... why?

> If you introduce a type for functions, you need some way to get to it from lists in order for macros to work.

Sure. So? That has been a solved problem for over fifty years.

> Then that totally muddies the definition of eval.

No, it doesn't. There is absolutely no reason that a function has to be represented as a list. The only reason it was represented as a list in McCarthy's original design is because there was nothing else.

See here:

https://flownet.com/ron/l.py

for an example of a Lisp interpreter written in Python where functions are represented as Python objects. You will note that, other than being written in Python, the definition of eval is not at all muddled.
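
Not l.py itself, just the same idea sketched in Scheme with invented names: closures are opaque records, and eval stays simple anyway.

    ;; Closures are opaque records, not lists.
    (define-record-type <clo>
      (make-clo params body env)
      clo?
      (params clo-params)
      (body   clo-body)
      (env    clo-env))

    (define (lookup sym env) (cdr (assq sym env)))

    (define (ev expr env)
      (cond ((symbol? expr) (lookup expr env))
            ((not (pair? expr)) expr)                        ; self-evaluating
            ((eq? (car expr) 'quote)  (cadr expr))
            ((eq? (car expr) 'lambda) (make-clo (cadr expr) (caddr expr) env))
            (else (ap (ev (car expr) env)
                      (map (lambda (e) (ev e env)) (cdr expr))))))

    (define (ap f args)
      (cond ((procedure? f) (apply f args))                  ; host primitives
            ((clo? f) (ev (clo-body f)
                          (append (map cons (clo-params f) args) (clo-env f))))
            (else (error "not a function" f))))

    (ev '((lambda (x) (+ x 1)) 41) (list (cons '+ +)))       ; => 42
    (pair? (ev '(lambda (x) x) '()))                         ; => #f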

> I think the lesson here is really that lisp is a convenient syntax but a lousy model for computation.

I think you don't know what you're talking about. The syntax has absolutely nothing to do with punning lists. You can pun lists in any language, and it's an equally bad idea in any language.


When you mentioned lambda calculus I assumed you meant functions were to be added as a primitive executable type. Then you would logically have lists defined as:

  (define (cons a b) (lambda (x y) (x a b)))
  (define nil (lambda (x y) (y)))
Which is problematic because you can write down the list-form of a lambda but not the actual value of it.
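
For what it's worth, the encoding itself runs fine as a Scheme sketch (kons, kar and kdr named so as not to clobber the built-in cons); the problem is only with writing the resulting values down:

  (define (kons a b) (lambda (x y) (x a b)))
  (define nil (lambda (x y) (y)))
  (define (kar p) (p (lambda (a b) a) (lambda () (error "kar of nil"))))
  (define (kdr p) (p (lambda (a b) b) (lambda () (error "kdr of nil"))))

  (kar (kons 1 (kons 2 nil)))        ; => 1
  (kar (kdr (kons 1 (kons 2 nil))))  ; => 2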

When you suggest adding a record type to the language because it's good from a software engineering perspective, I think that goes against the idea of this document (that the language should be entirely formal and ignore software engineering considerations). There are all sorts of things you might want types for, but I think the point here is to demonstrate that you can use lists for all of them.

> I think you don't know what you're talking about.

Rude.


> Which is problematic because you can write down the list-form of a lambda but not the actual value of it.

That's not true. Nothing prevents you from defining a serialization format for lambdas. That is in fact what all modern Lisps do.

> When you suggest the addition of an additional record type to the language

When did I suggest that? Pointing out that Bel doesn't have a record type is not the same as suggesting one should be added. Bel is intended to be a specification, not an implementation.

> Rude.

Do you think so? I was responding to this:

> I think the lesson here is really that lisp is a convenient syntax but a lousy model for computation.

Calling Lisp a "lousy model for computation" is not only objectively false -- Lisp is the foundation of all modern programming languages -- it also denigrates the work of hundreds of people over decades. Maybe that's not what you intended. Maybe you are simply ignorant. But I actually thought that mine was a pretty measured response.


> That's not true. Nothing prevents you from defining a serialization format for lambdas. That is in fact what all modern Lisps do.

If you have a list defined "à la lambda calculus" as a lambda term, you can't really write it down fully, because to do that you'd need to write a lambda, and a lambda is written in terms of lists, and lists are written in terms of lambdas. As an example:

  (a b) ;; is a shorthand for
  (a . (b . nil)) ;; and with lists that's where it ends

  ;; but with lambdas
  (a . b) ;; is a shorthand for
  (lambda (f g) (f a b)) ;; is a shorthand for
  (cons 'lambda (cons (cons 'f (cons 'g nil)) (cons (cons 'f (cons 'a (cons 'b nil))) nil))) ;; which expands to
  (lambda (f g) (f 'lambda (lambda (f g) (f (lambda (f g) (f 'f (lambda (f g) (f 'g nil)))) (lambda (f g) (f (lambda (f g) (f 'f (lambda (f g) (f 'a (lambda (f g) (f 'b nil)))))) nil))))))
  ;; which is a shorthand for ...
You never reach the end of it.

> Lisp is the foundation of all modern programming languages

Not really. People like to say that lisp had a much greater influence than it actually did. If I called Smalltalk the "foundation of all modern programming languages", you'd instantly see that it isn't true. Lisp was one step taken by some people along one particular path, not the first or the last and hardly the foundation.

> it also denigrates the work of hundreds of people over decades

No it doesn't. If you think it does then name them. To my knowledge the people researching programming don't use list-based methods of computation because they are too complex and the people implementing programming languages don't use list-based methods of computation because that's not how hardware works.

> Maybe you are simply ignorant

Maybe you are simply rude.


> If you have a list defined "à la lambda calculus" as a lambda term, you can't really write it down fully

Here is an existence proof that you are wrong about that:

https://flownet.com/ron/lambda-calculus.html

> To my knowledge the people researching programming don't use list-based methods of computation

The ignorance you display here is truly breathtaking.

> Maybe you are simply rude.

Yep, that's possible too.


> The ignorance you display here is truly breathtaking.

Remember when I said "if you disagree, give me an example of such research"? Now is the time to do that.

> Here is an existence proof that you are wrong about that:

No it isn't. Because they have lambda defined as an axiom. If you define lambda in terms of lists, and lists in terms of lambda, it's pretty clear you have a circular reference on your hands.


> Now is the time to do that.

Sigh.

https://www.amazon.com/Essentials-Programming-Languages-MIT-...

> they have lambda defined as an axiom

"They" is me -- I wrote that article. And no, "they" do not have lambda defined as an axiom. There are no axioms in that article because there are no proofs. It's just a demonstration of how you can do math using lambda calculus, with pairs being built along the way (and ultimately abandoned in favor of Church numerals) along with an actual implementation of a serializer for lambdas called lc-expand. If you don't see how that refutes your claim that "If you have a list defined "al la lambda calculus" as a lambda term, you can't really write it down fully" I'm not going to try to explain it to you. You are apparently beyond my ability to help.


> https://www.amazon.com/Essentials-Programming-Languages-MIT-...

That is a textbook, not research.

> "They" is me -- I wrote that article.

Makes sense. The HTML elements are broken in some of the code snippets, by the way. A few stray &amp; entities scattered about the place.

> There are no axioms in that article because there are no proofs.

Google "Curry-Howard correspondence". Computation is defined through axioms; you cannot compute without them.

> You are apparently beyond my ability to help.

I think you are just incapable of reading and processing ideas from other people without insulting them, due to some sort of ego trip where you imagine you are the messiah sent from heaven to educate us poor ingrates about programming, when you clearly are not. Be less obnoxious in future and more open to new ideas.


> That is a textbook, not research.

Where do you think textbooks come from? Have you ever looked at a textbook? Have you noticed that textbooks typically include a section called "References" (as this one does)? What do you think that is for?

> The HTML elements are broken in some of the code snippets by the way.

I know. That page was just exported from MS Word. Fixing it is not high on my priority list.

> you cannot compute without them.

What does that have to do with the fact that the article I pointed you to has neither proofs nor axioms in it?

> I think you are just incapable of reading and processing ideas from other people

> Be less obnoxious in future

Do you hear yourself?


> Do you hear yourself?

Yes, and I'm not the one compulsively ending every comment with a jab at the other guy's intelligence.

> What does that have to do with the fact that the article I pointed you to has neither proofs nor axioms in it?

So because you didn't write them down, you think they don't exist?

> as this one does

Quite frankly, I'm not going to spend $77 and wait for shipping to find out what's in the references section of some random text book. It seems like it should be an awful lot easier to find one of these hundreds of people you know.


> because you didn't write them down, you think they don't exist?

No, I just think they are irrelevant to adjudicating your claim that "If you have a list defined "à la lambda calculus" as a lambda term, you can't really write it down fully...". I've refuted you with an actual implementation (which you have obviously not even bothered to look at). The fact that my implementation could be axiomatized has nothing to do with it.

> I'm not going to spend $77 and wait for shipping to find out what's in the references section of some random text book.

EOPL is not "some random text book."

But it doesn't really matter. Pick up any textbook you like on programming languages published in the last 40 years. I'd be surprised if you could find even one that did not reference McCarthy's original Lisp paper.


> which you have obviously not even bothered to look at

Again, you presume my ignorance because the only alternative is that I understand you and still think you are wrong. How do you think I knew about the contents of your article if I didn't read it? Half the reason I noticed the HTML elements is that they gave me a syntax error when I tried to evaluate your code.

> I'd be surprised if you could find even one that did not reference McCarthy's original Lisp paper.

Well that's quite interesting. So the fact that some textbooks reference one specific paper is proof of thousands of hours of research across hundreds of people, all of whom would be insulted by my claim that lists aren't a very good processing primitive? Honestly, I suspect most of them would agree with me that lists are better as syntax. You seem quite obsessed with lambda calculus yourself based on your articles and talks, and you utterly despise all the list-based stuff in Bel. Given that you always seem to write the lambdas with list syntax, you seem to agree with me on every point. So why are you looking for people to be offended by something that we both seem to believe?


> you presume my ignorance

No, I presume nothing. I conclude your ignorance because you continually supply me with evidence for it.

> the fact that some text books reference one specific paper is proof of thousands of hours of research across hundreds of people

No, it is not proof of anything. It is evidence that your claim that "People like to say that lisp had a much greater influence than it actually did" is wrong.


> No, I presume nothing.

So what is the word for when you say I haven't looked at your website based on no evidence and in the presence of evidence to the contrary? I would say "presume" fits the bill. And that is something. So it does seem that you have presumed something. And that is the exact thing I was saying you presumed. It's a pretty basic fact and it's written down. But you will still dispute it and claim I'm being stupid.

> It is evidence that your claim that "People like to say that lisp had a much greater influence than it actually did" is wrong.

I thought we were discussing the "hundreds of people" that my previous assumption apparently denigrates. But if you want to talk about inflating lisp's reputation, a good example would be claiming that a single paper on lisp being cited in _lisp_ textbooks is evidence that it is "the foundation of all modern programming languages".


EOPL is not a Lisp textbook. It is a textbook about programming languages in general. (Which you would know if you had bothered to look at it.)


Would you say there are properties of this textbook that make it more related to lisp and lisp interpreters than the average textbook, and hence more likely to reference the early material about lisp and lisp interpreters?


What would you consider the "average textbook"?


Can you really not answer a question that simple? Every textbook has a certain amount of lisp in it. Take the average of all of those amounts (the mean, let's say), and you will get the amount of lisp in the average textbook. That is a substantially smaller amount than in the one you just picked. Can you really not work that out on your own? Can you think of any definition of "the average textbook" under which it has as much lisp as the one you just picked out? No. I think you are just being obtuse.


To be honest, as I've read more of the Bel document, I have to say streams and characters feel rather "bolted on", especially since streams only work with binary. It feels like it's trying to straddle both the practical and theoretical, and ends up being a bit of a mess. I appreciate people exploring new concepts, but honestly this doesn't feel that novel.


Perhaps to make it a real programming language that you could use to do stuff?


What part of "This is not a language you can use to program computers" wasn't clear?


The "language" part. In context it seems to refer to the implementation, not the specification. I take it that pg does intend to specify a useful programming language, but not implement it.



