Abstraction (or, Lambda Calculus for Dummies)
Tue, 16 Dec 1997 14:05:24 +0100 (MET)
> Why is a meta-operator different from a function? Doesn't higher-order
> mean functions can be meta-operators?
lambda operates on parsed syntax, not on objects.
It's a language construct.
In expression "(lambda (x) x)",
you don't apply a function "lambda" to objects "(x)" and "x"!
Functions operate on objects.
In expression "(f x)", function "f" operates on "x".
The idea is that there is a difference between source code (which
might itself be considered as an object in a reflective system),
and the objects that it denotes. Surely, there might be somewhere
a function that takes source code, and returns a function, but this
has to do with reflection (in this case, absorption), not with
higher-order functions (functional objects that take as parameter and/or
return other functional objects).
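To make the distinction concrete, here is a hedged sketch in Python (whose lambda plays the same role as a syntactic construct): a higher-order function operates on function *objects*, whereas going from a source *string* to a function object is the reflective step. All names below are made up for illustration.

```python
# "lambda" is a language construct the parser handles; it is not itself
# a function being applied to "x".

# Higher-order function: operates on function *objects*, not on syntax.
def twice(f):
    return lambda x: f(f(x))

inc = lambda x: x + 1
assert twice(inc)(3) == 5

# Absorption (reflection): going from *source code* (a string object)
# to the function object it denotes. This is not higher-orderness.
source = "lambda x: x * x"
square = eval(source)
assert square(4) == 16
```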
> Let me just make sure: when you say lambda you are talking about
> abstraction. When you say abstraction, to do it you would use lambda. I
> was using the words as if they were the same thing. Or at least that
> lambda is the symbol used to denote abstraction (I believe you said this
> somewhere, I'm just checking).
Well, lambda is the syntax with which we do abstraction,
as inherited from the research of Alonzo Church and his school in the 1930s.
> Is this true:
> The difference between starting function f, and the unnamed returned
> function, is described by:
> f was not a function until it was abstracted. The act of abstraction
> created the function by placing future values into an expression.
> In other words lambda's expression argument is taken as parts, X is
> inserted into the parts, then the whole is simplified as much as it
> can without knowing the value of X. Am I getting close?
This would be true for a "call-once" function.
However, if/when the function can be called several times,
it must avoid clashes between values of parameter "X" among different
call sites, hence generate a new value holder (variable/symbol/whatever)
for "X" every time it is called (a process known as alpha-conversion).
If it doesn't, then you have "dynamic binding" of function parameters,
which interacts quite badly with function nesting, recursion, and
higher-order functions (the notorious "funarg problem").
After alpha-conversion takes place, then yes, substitution and
simplification happen as you described.
> You didn't answer my question: are all functions initially created by
> lambda?
Depends on the syntactic primitives. It was proved early on
that instead of using lambdas, you could do everything in terms of
higher-order combinators. The most well-known set of combinators
that generate all functions expressible with lambdas is S, K, I:
I x -> x
K x y -> x
S x y z -> ((x z) (y z))
(proof by double induction on the structure of lambda terms,
eliminating lambdas from the innermost to the outermost).
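The three combinators translate directly into curried Python functions; the classic sanity check below is that S K K behaves like I (the currying encoding is the only liberty taken here).

```python
# The S, K, I combinators above, written as curried functions.
I = lambda x: x
K = lambda x: lambda y: x
S = lambda x: lambda y: lambda z: (x(z))(y(z))

# Classic check: S K K behaves like I, on any argument.
skk = S(K)(K)
assert skk(42) == I(42) == 42

# A lambda-free derived combinator: B = S (K S) K is composition,
# i.e. B f g x == f(g(x)).
B = S(K(S))(K)
inc = lambda n: n + 1
dbl = lambda n: n * 2
assert B(inc)(dbl)(10) == 21
```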
> The semantics for handling typed abstraction... is the type simply passed
> through: given f is the result of abstracting x and sin(2*x+1), then any
> type valid for x in sin(2*x+1), is also valid as an argument for f?
Basically yes: any type valid for x in the body is valid as an argument
to the abstracted function. Things get more complicated if/when you
define recursive functions, though.
>> [BTW, was the "standard reference" the Barendregt book?]
Yes. It's a great book from a mathematical point of view,
but it doesn't treat the computational point of view much,
and doesn't talk much about typed calculi.
> Finally, an example of where abstraction would be useful. I thought of
> this: You have function f that takes type int for argument. But you want
> to make a generic function that takes any numeric type for an argument.
> So you do lambda(f,generic_numeric_type) and it results in a new function.
Not exactly. You can't go from a specific *object* to a more generic one;
you can only go the other way around. Of course, assuming the original
*source* for f didn't require integer-specific assumptions about its
arguments, then you could take it and make a more generic function out of it.
So you could have a function
Inc: lambda(R:Number_Ring).with R.lambda(x:R).x+1
[Here, "with R" would tell the system that "+" and "1" are to be understood
as referring to an operation and a constant in R].
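The hypothetical Inc above might be sketched in Python roughly as follows, with the ring passed as an explicit record of its "+" and its "1". NumberRing and its field names are invented here, mirroring the notation above.

```python
from dataclasses import dataclass
from typing import Any, Callable

# A "ring" reduced to just what Inc needs: an addition and a unit.
@dataclass
class NumberRing:
    add: Callable[[Any, Any], Any]
    one: Any

# Inc takes the ring R as an explicit parameter and returns the
# increment function for that ring: lambda(x:R).x+1.
def Inc(R: NumberRing):
    return lambda x: R.add(x, R.one)

ints = NumberRing(add=lambda a, b: a + b, one=1)
floats = NumberRing(add=lambda a, b: a + b, one=1.0)
assert Inc(ints)(41) == 42
assert Inc(floats)(0.5) == 1.5
```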
> I wondered how you can be sure the new function will work? Will it work
> in this case because numeric type is more general than integer type? What
> if you wanted to go the other way?
The proper semantic frame for that is either parametric type polymorphism,
whereby a numeric type would be explicitly given as a parameter to a
function, or subtype polymorphism, whereby the integer type would
implicitly match the number type by subtyping, though the function would
also have to satisfy some elaborate conditions to be correct in the
presence of subtyping (there are lots of articles about subtyping in the
literature).
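As a hedged illustration of the two frames in Python (function names here are made up): with parametric polymorphism the numeric structure is an explicit argument, while with subtype polymorphism int matches the abstract Number type implicitly.

```python
from numbers import Number

# Parametric polymorphism: the type's unit is an explicit parameter.
def inc_parametric(one):
    return lambda x: x + one

# Subtype polymorphism: written against the general Number interface;
# int matches implicitly because int is a subtype of Number.
def inc_subtyped(x: Number) -> Number:
    return x + 1

assert inc_parametric(1)(41) == 42
assert isinstance(3, Number)      # int implicitly matches Number
assert inc_subtyped(3) == 4
```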
> The C++ers would call it specialization or inheritance to take a function
> a that handled any number
> and to derive a new function b that handled only integers. However
> intentional programming would call it one aspect (a different
> implementation of the function a).
And that would be right, if only inheritance were a clean way to handle
subtyping. Unhappily, it isn't in the general case (inheritance is just
a broken approximation to subtyping). However, C++ now allows you to
resort to parametric polymorphism, with templates, so you could give a
numeric type as a template parameter to a generic algorithm (yuck).
> So am I talking about abstraction in
> the first case (int to general number),
You could, in a proper syntactic setting.
> and is this the opposite of
> another operation (general number to int)
Except that abstraction and function application usually happen at
different semantic levels (well, there are also "flat" models used to
interpret/rewrite lambda-expressions; but in the models used to compile
programs, abstraction happens statically, whereas function application
happens at runtime).
> and how does this relate to C++
> and other languages that do not have abstraction or are not reflective.
C++ has a limited kind of abstraction through first-order functions,
and a less-limited kind of abstraction through higher-order templates.
> Is C++ not reflective because you can not access definitions of a function
> by other functions? Is abstraction the key function that accesses the
> internals of functions, or is it just one?
Reflection is quite different from abstraction.
Abstraction is the syntactic capability of defining functions
out of expressions. Reflection is the capability of going from syntax
to objects (absorption), and objects to syntax (reification).
Usual functional programming languages have abstraction and no reflection
(though LISP dialects often have some reflective features in the form
of macros, MOPs, or meta-circular interpreters).
FORTH has reflection (though low-level semantics interfere with it)
and no abstraction.
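In Python terms, the two directions might be sketched as follows (a hedged illustration: CPython keeps only a syntactic skeleton, not full source, for eval'd code).

```python
# Absorption: going from syntax (here, a source string) to the
# function object it denotes.
source = "lambda n: n * 2"
double = eval(source)
assert double(21) == 42

# Reification: going from the object back toward syntax-level data.
# The function object still exposes its syntactic skeleton
# (parameter names, literal constants) via its code object.
assert double.__code__.co_varnames == ('n',)
assert 2 in double.__code__.co_consts
```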
== Faré -=- (FR) François-René Rideau -=- (VN) Ðang-Vû Bân -=- firstname.lastname@example.org ==
Join a project for a free reflective computing system! | 6 rue Augustin Thierry
TUNES is a Useful, Not Expedient System. | 75019 PARIS FRANCE
http://www.eleves.ens.fr:8080/home/rideau/Tunes/ -=- Reflection&Cybernethics ==