Tunes compiler
Patrick Premont
premont@cs.toronto.edu
Fri, 1 Mar 1996 18:04:49 -0500
> >>> It doesn't serve to imagine constructs hotter than Scheme's
> >>> if we're not even able to make Scheme efficient.
> >> Scheme constructs are intrinsically inefficient
> >> because of trivial static typing
> >> (I refuse to use the word "dynamic typing").
> >
> > I'd say that there is no static typing. All types are dynamic concepts in
> > Scheme. This makes efficiency a real problem for the compiler.
> Beyond the terminology, we agree.
> Shall we start with an efficient Scheme compiler like Bigloo ?
> Maybe we could just write a front-end to the Bigloo code generator ?
Wouldn't that prevent us from using the low-level representations we want ?
> > Forcing the compiler's host language to be the interpreter's source
> > language prevents that. I'd rather have a stable host language for the
> > compiler. And once a new language feature has been proposed, I think
> > we should implement it in the compiler before rewriting the compiler
> > with this new feature. That is : update the compiler's source language
> > before we update its host language. This way it will become obvious
> > if a construct cannot be implemented efficiently, before we have
> > started using it extensively.
> You are right. Then, either we choose to stick
> to the original meta-language for long,
> in which case I'd like to use a most expressive extension
> of Scheme (rather than a least subset) as the meta-language,
Yes.
> or we choose to bootstrap the language by writing extension compilers
> in restricted previous versions,
I like that. For example, the structure for the assembler suggests
creating a rule-based function (which may sometimes be inverted). It
is likely that such a construct would be of use in the whole
compilation process, not just assembly (although we might not care
anymore about inverting the function). This would lead us to a
dilemma as to which compilation technique to use : a standard Scheme
function that does the transformation (as I have done in my
translation to date) or a rule-based function. Such a rule-based
function is basically a very general generic function where the rules
are the methods. We could write an extension compiler to add these
rule-based (possibly invertible) functions to the language and get an
extended Scheme which is better suited to the task of compilation.
Rule-based functions could be implemented as (translated to) standard
one-argument functions. But I suggest we give them the ability to
accept "messages" through a second optional argument. This would allow
us to implement higher-order and introspective operations on the
rule-based functions. For example, passing 'invert as the second
argument could return the inverted function. Passing '(add-rule
somerule) could add a rule. Then we'd hide these hacks in normal
functions "invert" and "add-rule" which operate on rule-based
functions.
As Fare suggests further down, it would be nice if the pattern-matching
engine was the basis for the future type system. What I just suggested
shows how this can happen. Generic functions choose a method according
to the type of the arguments. The rule-based function can be seen as
a generic function that uses a very powerful type system.
We may eventually create versions of rule-based functions which allow
many rules to be applicable at the same time. Many rule-selection
techniques are then possible (writing AI rule-based systems with this
would be very appropriate). One rule-selection technique could be to
choose the rule whose matching part is subsumed by all the others, the
least general rule. That would implement inheritance.
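A minimal sketch of such a subsumption test on patterns, assuming
(var x) marks a pattern variable and ignoring repeated variables :

;; #t if every instance of SPECIFIC is also an instance of GENERAL.
(define (subsumes? general specific)
  (cond ((and (pair? general) (eq? (car general) 'var)) #t)
        ((and (pair? specific) (eq? (car specific) 'var)) #f)
        ((and (pair? general) (pair? specific))
         (and (subsumes? (car general) (car specific))
              (subsumes? (cdr general) (cdr specific))))
        (else (equal? general specific))))

(subsumes? '(mov (var r) (var n)) '(mov (var r) 0))  ; => #t
(subsumes? '(mov (var r) 0) '(mov (var r) (var n)))  ; => #f

So if both rules matched (mov ax 0), we would pick the one with
pattern (mov (var r) 0), the least general, like an overriding method.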
> in which case we'd rather compile
> to the host operating system before we compile to Tunes.
What do you mean here ? The extension compiler is compiling from the
extended language to the unextended language, which is most likely to
be independent of the operating system.
> >>> MIT-Scheme has records and structures (particular case);
> >>> it should probably be enough
> >>> (perhaps with a simple generic function added)
> >>> so that the first iteration of our OS/compiler be OO.
> >>> We can enhance the object system and lots of other things
> >>> by changing the compiler thereof.
> >> Do you suggest we shall adopt MIT-Scheme,
> >> excluding any other Scheme implementation ? Why not.
> >
> > We'll have to choose a particular implementation anyway for macros.
> Where are the docs for MIT-Scheme macros ?
They are not in the reference manual. I've got a file macros.txt in my
doc directory. This is a very recent release (7.4.2 for OS/2). Do you
have that file ?
> >>> I like the general structure:
> >>> a database of rules defines by part a bijection
> >>> between assembly code and machine code.
> [Not exactly a bijection, but that's a detail]
When the function is not injective, this will mean that many rules
will match at the same time in the inverted function. This is
something we will have to deal with eventually as I said previously.
So yeah, that's not a big problem.
> > I'm sending you a Scheme source I've had hanging around. It implements
> > logical extensions to Scheme. That's an overkill but the unification
> > routines might be useful. Just in case. Also I think we should decide
> > if we allow bijections in the patterns or if we use variables
> > in their place and conjoin an additional constraint like y=f(x) :
> >
> > For example (using an improvised syntax to make it clear) :
> > rule : (mov (var reg) 10) -> (#x45 (reg-code (var reg)) #x0a)
> > that was using bijections (reg-code) in the patterns ; the other way is :
> > rule : (mov (var reg) 10) -> (#x45 (var regc) #x0a) and
> > (var regc) = (reg-code (var reg))
> >
> > See what I mean ? The second way may make it easier to implement
> > the reverse transformation. Or not.
> I'm not sure I get what you mean.
> Why would either way not be as easy/difficult to implement as the other ?
> It seems to me that they expand to just the same constraints for
> any unifier.
Yes. The only difference is the syntax. I only mean that the second
syntax is more symmetric and thus might slightly simplify the code
that deals with the inverted function. The second syntax also allows
writing the functions as relations (as in Prolog), which might be
needed eventually when the relations do not correspond to bijections
or even to functions (like relations whose arity is not 2). But we
don't need that now. And I guess both syntaxes will work eventually.
So let's use the first one since it is simpler.
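Whichever syntax we pick, a bijection like reg-code must be runnable
in both directions. A minimal sketch (the table entries are only
illustrative, not a committed encoding) :

(define reg-table '((ax . 0) (cx . 1) (dx . 2) (bx . 3)))

(define (reg-code reg)                   ; forward : assembling
  (cdr (assq reg reg-table)))

(define (reg-code-inverse code)          ; backward : disassembling
  (let loop ((t reg-table))
    (cond ((null? t) (error "no such register code" code))
          ((= code (cdar t)) (caar t))
          (else (loop (cdr t))))))

(reg-code 'cx)        ; => 1
(reg-code-inverse 1)  ; => cx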
> Actually, I'd like to have the pattern language be the basis for the
> future type system and formal specification system.
Absolutely. See my earlier comment.
> In particular, there would be patterns to match lambda's,
lambdas ? lambdas are source and are therefore standard s-exps which
can be matched by any list-matching pattern. Or maybe you meant closures ?
Right now these are closed, so we can't match a closure's components.
Wait. I think I understand what you want. You want a pattern that will match
any lambda's source, like :
(lambda (var args) (var body))
But that is not a property of the pattern-matching system. It is simply
a particular pattern. Do I understand you ?
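For instance, a minimal list-pattern matcher, sketched here assuming
(var x) marks a variable and no variable repeats, already handles it :

;; Returns an alist of bindings, or #f on failure.
(define (match pattern datum bindings)
  (cond ((not bindings) #f)                           ; propagate failure
        ((and (pair? pattern) (eq? (car pattern) 'var))
         (cons (cons (cadr pattern) datum) bindings)) ; bind the variable
        ((and (pair? pattern) (pair? datum))
         (match (cdr pattern) (cdr datum)
                (match (car pattern) (car datum) bindings)))
        ((equal? pattern datum) bindings)
        (else #f)))

(match '(lambda (var args) (var body)) '(lambda (x) (+ x 1)) '())
;; => ((body + x 1) (args x)), i.e. body = (+ x 1), args = (x)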
> patterns to match patterns, taking contextual effects into account, etc.
> [This is why, again, it seems to me that first-class contexts are of
> foremost importance: to make it easy to construct such patterns].
I don't see where first-class contexts come into this.
> > I don't think we need modules yet (to write the compiler). The main
> > problem they solve is that they allow many namespaces. Sure it would
> > be nice to have them but they are closely related to contexts and I
> > don't think we can settle the definition of contexts right away. We
> > can simply use the separate files method to allow easy updating of the
> > code when we finally introduce modules. As for macros, we could use
> > MIT-Scheme's macros.
> I'm not convinced. But let's write code and convince
> ourselves/each-other the only truly convincing way:
> by writing the actual compiler...
Absolutely. Except that I have slightly modified my position as I now
suggest adding the pattern-matching and rule-based functions to the
language (through an extension compiler) before we write the main
compiler. But in a way that is writing the compiler (from the top).
I suggest this extension because I have a very precise idea of how to
do it (outlined earlier) and it is clearly something we need.
Fare, do you understand and agree with my suggested extension ?
It might look like :
(define f (rule-based-function pattern-which-can-limit-the-domain
            (rule input-pattern1 output-pattern1)
            (rule input-pattern2 output-pattern2)
            (rule input-pattern3 output-pattern3)))
(add-rule f (rule input-pattern4 output-pattern4))
(f some-structure-in-the-domain-of-f)
((invert f) some-structure-in-the-range-of-f)
(list-rules f)
...
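For instance, instantiated with the assembler rule quoted earlier
(still improvised syntax ; the #x45 opcode and the results shown are
only illustrative) :

(define assemble
  (rule-based-function (var instruction)
    (rule (mov (var reg) 10)
          (#x45 (reg-code (var reg)) #x0a))))

(assemble '(mov ax 10))             ; => (#x45 0 #x0a), say
((invert assemble) '(#x45 0 #x0a))  ; => (mov ax 10)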
Patrick