Surprised by Joy
iepos@tunes.org
Tue, 14 Mar 2000 07:13:34 -0800 (PST)
> Ah, but the type information is useful and shouldn't be kept hidden. In
> addition, TUNES has this "thing" about wanting everything to be first class.
I guess yes, it would be a good idea to be able to find out at
runtime what the type of something on the stack is...
> >And, mentioning "-" reminded me of something else... One interesting
> >primitive one could add to the system is an "undo" primitive...
> >Such a primitive would take a quoted program off the stack, and
> >would "undo" the quoted program. This probably seems like a
> >wild idea; "undo" is usually thought of as a fancy feature often added
> >to text editors and things... to treat it as a fundamental primitive
> >of a system is getting weird... it would probably make the system
> >impossible to fully implement.
>
> Probably. However, at the same time, it _is_ possible; existence proof is
> in the language 'J', which implements an adverb 'power'. n-th power repeats
> the given verb n times, but if n is negative it repeats the inverse of the
> verb n times.
Wow... so the idea has already been implemented in a fairly general way...
> >Then, as an example, one could form a "dohalf" primitive, that
> >takes a quoted program and executes it a "half" a time:
> >
> > [[dotwice] cons] undo
>
> I don't see that one.
Oops... I should have explained this... First of all, a general
rule about "undo" that I expect we could accept is that (for all "foo")
foo [foo] undo == id
Now, using this rule, I can show that
[foo foo] dohalf == foo
where "dohalf" is the "[[dotwice] cons] undo" that I mentioned above;
this is what we'd expect a "dohalf" to do... Now, to show why this is...
Basically, the key is to realize that
[foo foo] <==> [[foo] dotwice] <==> [foo] [dotwice] cons
These are not exactly identical Joy programs, because of the
opacity of quotation... So, I've adopted your "<==>" usage;
they are Joy programs that push equivalent programs onto the stack.
Now using this equivalence, we see that
[foo foo] dohalf
== [foo] [dotwice] cons dohalf (equivalence above)
== [foo] [dotwice] cons [[dotwice] cons] undo (definition of my "dohalf")
== [foo] (undo cancellation)
Oops, so my "dohalf" is not quite right... We should have gotten
"foo", not "[foo]". So, a corrected "dohalf" is
[[dotwice] cons] undo i
> A simpler way to express -2 would be:
>
> [dotwice] undo.
I don't think this is right... "dotwice" is a program that takes a
quoted program off the stack and executes a variation of it;
so undoing "dotwice" would have the effect of undoing some
action and then pushing a quoted program onto the stack.
Yet, I'm pretty sure you don't want your "-2" to always push a quoted program
onto the stack; rather, you'd expect it to take a quoted program...
A correct "-2" I think is
dup [undo] dip undo
If you run this with "[foo]" on the stack, you get
[foo] undo [foo] undo
which is what you want, I think...
> >I think that is right... and, the "dotwice", "dothrice", and such
> >I used earlier took a quotation and resulted in a quotation;
> >as I use them now, they do not result in a quotation, but rather
> >just execute the quotation they are given so many times...
>
> Right. Fortunately, it's trivial to use quotation to allow them to result
> in a quotation.
Hmm... let's see... is it really trivial?
How about the quotation-yielding version of "dotwice"? ...
Well, it can be constructed as "dup concat"...
but we want to construct it from the ordinary "dotwice"...
given "[foo]" on the stack, this quotation-yielding version
should give us
[foo foo]
It will also be okay if it gives us
[[foo] dotwice]
All right, a quotation-yielding version of "dotwice" is
[dotwice] cons
I used this above in my example with "dohalf" and didn't notice it...
It looks like this pattern will generally work... If "n" is a
Church Numeral, then "[n] cons" is its quotation-yielding version...
So, I guess it is trivial, like you said...
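To convince myself, here is a minimal sketch of a Joy-like evaluator in Python (a hypothetical toy model, not the real Joy implementation; the token names "dotwice", "cons", "i", and the sample word "inc" are my own choices), showing that "[dotwice] cons" really is a quotation-yielding version of "dotwice":

```python
def run(program, stack):
    """Execute a program (a list of tokens; nested lists are quotations)."""
    for tok in program:
        if isinstance(tok, list):          # quotation: push it unevaluated
            stack.append(tok)
        elif tok == "dotwice":             # run the quoted program twice
            q = stack.pop()
            run(q, stack); run(q, stack)
        elif tok == "cons":                # prepend an item to a quotation
            q = stack.pop(); x = stack.pop()
            stack.append([x] + q)
        elif tok == "i":                   # unquote: run the quoted program
            run(stack.pop(), stack)
        elif tok == "inc":                 # a sample "foo" to repeat
            stack.append(stack.pop() + 1)
        else:
            stack.append(tok)              # anything else is a literal
    return stack

# ordinary dotwice: executes [inc] twice
print(run([0, ["inc"], "dotwice"], []))          # [2]
# quotation-yielding version: "[dotwice] cons" leaves [[inc] dotwice]
print(run([["inc"], ["dotwice"], "cons"], []))   # [[['inc'], 'dotwice']]
# ...and unquoting that result with "i" behaves just like dotwice
print(run([0, ["inc"], ["dotwice"], "cons", "i"], []))  # [2]
```

The second result is "[[foo] dotwice]" rather than "[foo foo]", but as noted above the two push equivalent programs.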
> > [add] cons undo
>
> >would be a subtraction program...
>
> It would be a program which undid adding something. I don't see why you
> can't do
>
> [add] undo.
>
> , though.
That would not be correct. "add" pops two numbers off and gives one
back. So undoing "add" I guess would pop one number off and give you
back an arbitrary two (which must total the original); this is
not the same as subtraction.
> >And, of course, some programs do not make sense to be run a negative
> >number of times, or a fractional or imaginary number of times
> >(although some do), or some might make sense in multiple ways...
> >Not sure how a system would deal with this...
>
> Nor am I. J deals with it by only implementing power partially: only
> specific primitives, and specific ways of combining those primitives, have
> inverses.
I think I'm just not going to worry about the "undo" idea for a while...
> >The basic idea is to make "[foo]" denote the program of pushing
> >the program "foo" onto the stack; then, for instance, "[add]" and
> >"[swap add]" are equivalent programs, since each pushes the "adding"
> >program onto the stack. And... if you told the mythical system to
> >do "[add] [swap add] =", it would push "True" onto the stack
> >(although it might not realize it, if it is acting in a lazy style).
>
> Okay. As far as this goes, that's great -- but it's not possible in general
> to test the equality of programs. I would recommend that '=' treat programs
> as incomparable unless they're actually the same program. Now, it's
> possible for ==> and => to test the equality of programs under a finite set
> of transforms, and in such a treatment, given the commutative axiom, your
> two programs would result in True.
This is a good idea, I think... I'm starting to dislike the idea
of having unimplementable primitives mixed in with all the others...
> I think a better solution is to provide two new types and two new operators.
>
> The first type is 'symbol'; an object of this type may or may not represent
> a function in this or some other context.
>
> The second type is 'function'; an example of the function type is
> demonstrated by running
>
> [+ +] uncons
>
> The result of this is a function followed by a list (which contains a
> function).
Hmm... Running "[+ +] uncons" would work by running "+ [+]", right?
It would add two numbers and then push "[+]" onto the stack...
I don't think I see your "symbol" and "function"... Are these
supposed to be new types on par with Ints and Lists?
> The first new operator is "=>"; it acts like Joy's notation, but returns
> true if and only if its first argument can be rewritten as its second
> argument using a single rule.
>
> The second new operator is "==>"; like the first, it checks whether its
> first argument can be rewritten to look like its second with any number of
> steps.
Yes, it would be nice to have these operators... But, wouldn't
it be better if they were constructed in the system (taking advantage
of introspection) instead of being taken as primitives? ...
or maybe that is already what you had in mind...
By the way, I've been thinking a bit about how "=>" might be
implemented... In the Joy manual (in "The Algebra of Joy"),
the author gave several general rules about program equivalences,
for instance:
swap swap == id
dup pop pop == pop
dip pop == [pop] dip i
Yet from these and all the other equivalences given, I see no
way to prove this simple fact:
swap swapd swap == swapd swap swapd
Each program reverses the top three things on the stack...
And I'm guessing that there are many other apparently non-provable
facts, including many that can be written in terms of just
"id", "swap", "dup", "pop", "dip", which form the core of Joy,
I think (but then again, there is "i", "concat", "cons", and others
too)... here is another one I think they missed
concat dip == [dip] dip dip
This expresses the distributivity of "dip" across concatenated programs,
and could be used to show that
[x y] dip == [x] dip [y] dip
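Whether or not these equivalences are provable from the manual's axioms, they can at least be checked semantically. Here is a toy evaluator sketch in Python (my own model, not Joy itself) confirming both the three-element reversal and the "dip" distributivity on sample stacks:

```python
def run(prog, stack):
    for tok in prog:
        if isinstance(tok, list):             # quotation: push unevaluated
            stack.append(tok)
        elif tok == "swap":
            stack[-1], stack[-2] = stack[-2], stack[-1]
        elif tok == "swapd":                  # swap the two items under the top
            stack[-2], stack[-3] = stack[-3], stack[-2]
        elif tok == "dip":                    # run quotation under the top item
            q = stack.pop(); x = stack.pop()
            run(q, stack); stack.append(x)
        elif tok == "concat":                 # join two quotations
            q2 = stack.pop(); q1 = stack.pop()
            stack.append(q1 + q2)
        else:
            stack.append(tok)                 # anything else is a literal
    return stack

# swap swapd swap == swapd swap swapd (both reverse the top three items)
print(run(["swap", "swapd", "swap"], [1, 2, 3]))   # [3, 2, 1]
print(run(["swapd", "swap", "swapd"], [1, 2, 3]))  # [3, 2, 1]

# concat dip == [dip] dip dip, checked on a sample stack and quotations
a = run([["x"], ["y"], "concat", "dip"], [1, 2])
b = run([["x"], ["y"], ["dip"], "dip", "dip"], [1, 2])
print(a, b)  # both: [1, 'x', 'y', 2]
```

Of course, agreement on sample inputs is evidence, not proof; the question of a finite axiom set is the real issue.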
Here is my question: is there a finite set of these kinds of axioms
that entails some interesting kind of completeness?
A perhaps harder question: is there a similar finite set of axioms
that uses "=>" instead of "=="?
It might be interesting to note that an analogous situation happens
in applicative combinator theory... in this situation, it turns out
that there are several finite sets of "==" axioms with interesting
kinds of completeness... unfortunately, it has been shown that
the same completeness is not possible with "=>" axioms...
(at least, this is true for the so-called "eta-strong" completeness;
as far as I know, it may be possible for the weaker "beta-strong"
completeness, which would be good enough for me)...
Joy also has an analogue of "eta-strong" and "beta-strong" theories...
In both theories it would be possible to prove
swap swapd swap == swapd swap swapd
concat dip == [dip] dip dip
But, in an eta-strong theory, and not a beta-strong one, it would
be provable that:
swap swap == id
In a beta-strong theory, this would not be acceptable, because
the left side uses two parameters, while the right side uses none;
however, a similar fact would be provable:
swap swap == [[] dip] dip
Essentially, "[[] dip] dip" is like "id" except that it requires
two parameters...
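A quick check with a toy evaluator (a hypothetical Python model of "dip", not Joy itself) shows that "[[] dip] dip" does act as the identity on a stack with two or more items, and, unlike "id", it fails if fewer than two are available:

```python
def run(prog, stack):
    for tok in prog:
        if isinstance(tok, list):          # quotation: push unevaluated
            stack.append(tok)
        elif tok == "dip":                 # run quotation under the top item
            q = stack.pop(); x = stack.pop()
            run(q, stack); stack.append(x)
    return stack

# [[] dip] dip leaves a two-item stack unchanged
print(run([[[], "dip"], "dip"], [1, 2]))   # [1, 2]
# ...but on a one-item stack the inner pop fails, reflecting that it
# genuinely "uses" two parameters
```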
I'll have to think about this some more...
> For example, you recall my [0 add] example, where my program expected to be
> able to uncons the input, but the optimizer had optimised the "0 add" away.
> Under my new system, the input would be better expressed as something like
> [0 #add], where #name returns a symbol.
Hmm... not sure if I get your "#" syntax...
Would "#add" then be a program that just pushes a single thing
onto the stack (the "symbol" for "add")?
If so, why not just use "[add]"... ?
> >My answer is that you do not need lists; just push all the elements
> >directly onto the stack (perhaps with an additional integer, indicating
> >how many there are). Then, make all the "list" operators
> >(such as "reverse", "concat", and "cons") operate on the bare
> >stack.
>
> You know what? I really dislike this idea. It seems so messy!
Yes, it is too messy in some cases... lists are still nice to have...
So, this was probably not really a very good idea...
But, I think it would have use in an implementation,
in that whole lists might be kept directly on the stack...
> I don't think this is a good idea. However, it does point out a good idea
> -- right now Joy has an "unquote" combinator. I think we also need an
> "unlist" function, which dumps everything in a list (or quotation) on the
> stack without executing any of it.
Yes, I had thought about having such an "unwrap" primitive, which would
take a list and dump out all its elements, followed by an integer
telling how many there are. There would also be its inverse, "wrap",
which takes a bunch of things off the stack, and an integer telling
how many to wrap, and yields a list with all those things in it...
Yet, I'm not sure how appropriate these primitives would be under
Joy's current way of doing lists...
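As a sketch of what I mean (the names "unwrap" and "wrap" are the hypothetical ones from above, with the stack modeled as a Python list):

```python
def unwrap(stack):
    """Pop a list; push its elements, then a count of how many there were."""
    xs = stack.pop()
    stack.extend(xs)
    stack.append(len(xs))

def wrap(stack):
    """Pop a count n, then n items; push them back as a single list."""
    n = stack.pop()
    xs = stack[len(stack) - n:]
    del stack[len(stack) - n:]
    stack.append(xs)

s = [10, [1, 2, 3]]
unwrap(s); print(s)   # [10, 1, 2, 3, 3]
wrap(s);   print(s)   # [10, [1, 2, 3]]
```

The two are inverses of each other, in the spirit of the "undo" discussion above.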
> Multithreading (without the capability of message-passing) is possible
> simply by saving the data stack pointer, return stack pointer, and
> instruction pointer (and not sharing stacks).
Yeah... by the way, there is an interesting way I was planning on
using to eliminate the need for a return stack... usually
one might implement "dip" using a return stack, by saving the
dipped stack item onto the return stack, to be loaded back after
the quotation was executed... but, a return stack is not necessary;
instead, save the dipped stack item onto the "instruction stack"
using a literal...
Maybe I should illustrate... suppose we want the system to
execute "1 2 3 [swap dup] dip"; then the original state
of the system is
data stack :
instruction stack: 1 2 3 [swap dup] dip
The first four instructions push things onto the data stack;
after they are executed we have
data stack : 1 2 3 [swap dup]
instruction stack: dip
Now, after executing "dip", this is what will happen, I suggest:
data stack : 1 2
instruction stack: swap dup 3
Essentially, "dip" takes the "3" off the data stack and pushes
a 3-pushing instruction onto the instruction stack; then it
pushes the unquoted "[swap dup]" onto the instruction stack...
After the remaining instructions are finished there will be
"2 1 1 3" on the data stack...
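The scheme can be sketched concretely (a toy Python model of the idea, not a real implementation; the word set is whatever the example needs):

```python
def execute(program):
    """Run a program with an explicit instruction stack and no return stack."""
    data = []
    insns = list(reversed(program))      # top of instruction stack = next token
    while insns:
        tok = insns.pop()
        if isinstance(tok, list):        # quotation: push unevaluated
            data.append(tok)
        elif tok == "swap":
            data[-1], data[-2] = data[-2], data[-1]
        elif tok == "dup":
            data.append(data[-1])
        elif tok == "dip":
            q = data.pop()
            x = data.pop()
            insns.append(x)              # saved item becomes a literal insn...
            insns.extend(reversed(q))    # ...run after the unquoted program
        else:
            data.append(tok)             # anything else is a literal
    return data

# The worked example from above: "1 2 3 [swap dup] dip"
print(execute([1, 2, 3, ["swap", "dup"], "dip"]))  # [2, 1, 1, 3]
```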
By the way... the unquoting program "i" of Joy can be constructed
from "dip" as
foo swap dip pop
for any literal "foo" (by "literal", I mean a pure procedure that
takes no parameters but pushes a single item onto the stack)...
"foo" is the program that will be saved by "dip" although
afterward popped off... Suppose we pick "0" for "foo"...
Then, you can see that "0 swap dip pop" works as an "i"... see
what happens when it is run with "[foo]" on the stack:
data stack : [foo]
instruction stack: 0 swap dip pop
after a couple steps we get
data stack : 0 [foo]
instruction stack: dip pop
then, after the action of "dip":
data stack :
instruction stack: foo 0 pop
Thus, "foo" will be run, and then "0" will be pushed and popped...
Of course, this is not really a very elegant construction
(it reminds me of the construction of "I" as "SKx" for any "x",
for instance "SKK")... one would probably rather keep "i" as a primitive.
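The construction can be checked with a toy evaluator (hypothetical Python model; "[42]" stands in for an arbitrary quoted program):

```python
def run(prog, stack):
    for tok in prog:
        if isinstance(tok, list):      # quotation: push unevaluated
            stack.append(tok)
        elif tok == "swap":
            stack[-1], stack[-2] = stack[-2], stack[-1]
        elif tok == "pop":
            stack.pop()
        elif tok == "i":               # unquote: execute the quoted program
            run(stack.pop(), stack)
        elif tok == "dip":             # run quotation under the top item
            q = stack.pop(); x = stack.pop()
            run(q, stack); stack.append(x)
        else:
            stack.append(tok)          # anything else is a literal

    return stack

# "0 swap dip pop" behaves exactly like "i" on a quoted program:
print(run([[42], 0, "swap", "dip", "pop"], []))  # [42]
print(run([[42], "i"], []))                      # [42]
```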
> You mentioned an overhead due to linear logic; however, may I point out that
> in this case there _is_ no overhead to linear logic, because each item is
> referred to by exactly one pointer, and when we transfer the item to another
> thread we're actually transferring only the pointer (and the right to use
> it, of course). If any copying actually happens, it would have happened
> anyhow.
Hmm... pointers... I think pointers might be necessary at some point
in order to achieve efficiency... But, I had been thinking before of
literally stuffing everything onto the stack... In this case,
it seems like a transfer between threads would really require
a memmove; just transferring a pointer wouldn't work (at least,
it would cause some complications), because programs that read
structures from the top of the stack expect the real structures,
not pointers to them...
But, again, pointers would probably be needed internally in the
system at some point, in order to achieve efficiency...
one example... suppose one has a couple large structures
(say, several kilobytes big) on the top of the stack which one
is growing roughly simultaneously... growing the bottom one (the one
buried by the one on the top of the stack) will involve bumping
the top one out of the way, which would involve a slow memmove
every time something is added...
A solution to this is to store the buried large structure
off somewhere in the heap where there is lots of room to grow,
and just keep a pointer to it on the stack...
In general, if one is doing a lot of inserting and deleting with
items deep in the stack, then the system is probably going to
want to split the stack into smaller pieces, so that it does not
take so long to move things out of the way...
> >Exactly. Ahh... we're talking again about my applicative system...
> >I'm still interested in it, although I'm now more interested
> >in implementing a refinement of the Joy system than in implementing
> >that purely applicative system... In a bit I'll mention some
> >refinements I have in mind...
>
> Wow. Cool. :-)
Yet, applicative systems are still interesting...
one particularly interesting thing is the way in which
quotation-manipulating functions like "i", "dip", "concat",
can be constructed from the ordinary combinators, whereas
this does not seem possible in concatenative systems...
> Yes, refinements are _certainly_ needed.
Oh, I forgot to list what refinements I was thinking of...
Well, really the basic system is not bad; the main refinement
I might make would involve fixing the quotation construct
to conform with application-of-T (I know... you don't think this
is an improvement)
Other possible refinements
- leaner base (and get rid of weird things like "infra" and "stack")
- stronger type system
- better implementation
- support for concurrency
- support for exception-handling
I'm not sure exactly how exception-handling might be taken care of...
An extension is probably not strictly necessary; one could already
do C-style exception-handling by having every failable procedure
yield an extra result, a result summarizing the successfulness of
the operation; but then one would have to propagate
exceptions manually, which might not be considered the best thing...
One option is to have "fail" and "catch" primitives...
"catch" might operate much like "i", executing a quoted program,
except that there is another quoted program beneath it to be
run if the first program fails...
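One way this could look, sketched in Python with the failure mapped onto an exception (all names and semantics here are hypothetical, including the design choice of snapshotting the stack so a failed attempt is rolled back before the handler runs):

```python
class Fail(Exception):
    """Raised by the hypothetical 'fail' primitive."""

def run(prog, stack):
    for tok in prog:
        if isinstance(tok, list):         # quotation: push unevaluated
            stack.append(tok)
        elif tok == "fail":
            raise Fail()
        elif tok == "catch":              # like "i", but with a fallback
            try_prog = stack.pop()
            handler = stack.pop()         # quoted program beneath it
            saved = list(stack)           # snapshot: roll back on failure
            try:
                run(try_prog, stack)
            except Fail:
                stack[:] = saved          # restore, then run the handler
                run(handler, stack)
        else:
            stack.append(tok)             # anything else is a literal
    return stack

print(run([["backup"], [1, 2, "fail"], "catch"], []))  # ['backup']
print(run([["backup"], [1, 2], "catch"], []))          # [1, 2]
```

Whether the stack should really be rolled back, or left as the failed attempt left it, is exactly the kind of question a linear storage model makes easier to answer, since a snapshot is just a copy.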
Then, the other refinements... Concurrency we've already discussed
a bit...
Better implementation... this could be achieved if the system used
a compiler... it might also want to employ a linear storage
management system (no copying by reference).
Stronger type system... it would be nice if a programmer, after
writing a program, could verify certain properties about it;
in particular, a programmer would probably like to verify that
a program can accept parameters of such-and-such types off the stack
and pushes results of such-and-such types back on. It might
become tricky for the system to infer this in some cases;
but then again, maybe it is always very easy... I'm open
for ideas about how this might work...
Leaner base... I'm interested in investigating a purer subset of Joy
which has only combinators like "swap", "dup", "pop", "dip",
"i", "concat", and such (things like characters, integers, sets,
strings, and lists being omitted)... This could be of use
in developing a rewrite system for simplifying programs...
- "iepos" (Brent Kerby)