Lambda (was: Refactoring in Tunes)
Thu, 20 Jan 2000 11:48:32 -0800
> From: Laurent Martelli [mailto:email@example.com]
> Subject: Re: Lambda (was: Refactoring in Tunes)
> >>>>> "billy" == btanksley <firstname.lastname@example.org> writes:
> >> In other words, your system is correct, and mine is broken just
> >> because it's different from yours? :-)
> billy> Yes. Just as broken as if it didn't understand the
> billy> annotation which says "the following name is the title of a
> billy> new definition." Just as broken as a C system would be if it
> billy> didn't understand the keyword 'int' when used in the
> billy> declaration of a parameter.
> But our system is not designed yet, so we can decide what we want to
> put in it.
Right. And I'm telling you what I think our system should do.
It sounds as though you're saying that for any standard we decide on, you
would feel free to leave out parts of the standard, and then blame the
standard-compliant programs for not running on your system.
Your standard proposal isn't wrong, but your suggestion that my standard is
wrong because it allows annotations is certainly wrong.
> billy> ML functions have one and only one type. However, ML's infix
> billy> operators are overloaded, and thus textually ambiguous.
> billy> Thus,
> billy> define square x :- x*x;
> billy> is of ambiguous type, and the compiler will refuse to accept
> billy> it (is it float or int?). The solution is simple: the
> billy> programmer provides a type annotation at any point in the
> billy> program where it helps. For example, this:
> billy> define square:(int->int) x :- x*x;
> billy> (I think caml may have the same characteristics, by the way.
> billy> I don't know, having never used caml.)
> billy> Of course, the oddity here is the result of what I would
> billy> consider a design mistake -- overloading is a strange action
> billy> in a type-inferenced language, doubly so when you realise
> billy> that in spite of the overloading you still have to specify
> billy> which version of the function you're going to use.
> Overloading is a UI concept. Therefore I think it's bad to put that in
> a language. Because I think it's evil to put UI stuff in a
> language. (but you probably know that now :-)
No, I don't know that -- in fact, I disagree stridently with it. A computer
language is nothing BUT a UI. It's an interface between the programmer and
the computer.
> billy> Perhaps a language without overloading would not require
> billy> typechecking.
> I can even imagine that ML could accept "define square x :- x*x;" and
> define as many overloaded functions as possible.
I imagined that as well at the time. There are a few problems with doing
that -- for example, the number of functions grows exponentially. The type
inferencing is already NP-complete.
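The growth is easy to see concretely. Here is a toy sketch in Python (purely illustrative, not from the thread): if each occurrence of an overloaded operator in a definition could independently resolve to the int or the float version, a body with n occurrences admits up to 2^n candidate resolutions. This is a naive upper bound that ignores constraints linking occurrences, but it shows why accepting every overload does not scale.

```python
from itertools import product

# Toy illustration (invented for this example): each use of an
# overloaded '*' could resolve to either "int" or "float".
def candidate_typings(num_operator_uses):
    """Enumerate every way to resolve the overloaded occurrences."""
    return list(product(("int", "float"), repeat=num_operator_uses))

# "define cube x :- x*x*x;" has two uses of '*':
print(len(candidate_typings(2)))   # 4 candidate resolutions
print(len(candidate_typings(10)))  # 1024 -- exponential in the body size
```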
> >> Proving that some code has some properties should not be mixed
> >> with interpreting that code.
> billy> But nobody's proving anything about the code -- these type
> billy> annotations are used to help the programmer write to the
> billy> interface, and allow the compiler to do typechecking.
> Isn't typechecking proving that no type errors will occur if you run
> the program ?
From an academic viewpoint, I suppose so. Most programmers find it more
useful when it proves that there IS a type error, and this is the use for
which I designed this.
> billy> I don't feel good about splitting these things, but would an
> billy> acceptable compromise be a C-style header file? You lose a
> billy> LOT of power that way (since you can't place the annotations
> billy> anywhere in the source),
> What's so powerful about that ?
I'm lost again. Sorry. Why would you say otherwise? How can adding
restrictions like that FAIL to reduce the power? Can you even come up with
one nontrivial example where it doesn't?
> Anyway, if you have somewhere :
> (defun f (x) (* x x))
> and somewhere else
> (deftype f (int int))
> It's very easy for a GUI to gather the two things and display
> int f(int x) = x * x ;
> The fact that the annotation refers to f places it next to f. It's up
> to the UI to display this in a convenient manner.
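The gathering step Laurent describes is mechanical. A minimal sketch in Python (the dictionaries and the `render` helper are invented for illustration; the thread itself specifies no implementation): collect the definitions and the type annotations keyed by name, then join them for display.

```python
# Toy sketch: definitions and type annotations live in separate forms,
# and the UI joins them by name for display. All names are invented.

defuns = {"f": ("x", "x * x")}       # name -> (parameter, body)
deftypes = {"f": ("int", "int")}     # name -> (argument type, result type)

def render(name):
    """Merge a defun and its deftype into a C-style one-liner."""
    param, body = defuns[name]
    arg_ty, ret_ty = deftypes[name]
    return f"{ret_ty} {name}({arg_ty} {param}) = {body} ;"

print(render("f"))  # int f(int x) = x * x ;
```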
I don't get it, though. Why would we deliberately make things more complex?
Why shouldn't we design for simplicity?
> Laurent Martelli