Everyone else (was: Tunes compiler)
Eric W. Biederman
ebiederm@cse.unl.edu
Sun, 3 Mar 1996 21:47:24 -0600
Patrick Premont said:
> How about everyone else on the list ? Are you lost ? Do you have comments ?
> Are you even listening ;-> ?
Yes, I for one am. As for why I'm only listening: I don't have much
time right now and am mostly trying to profit from the `tunes
experience'. :)
First, several minutiae
Trying to get invertible functions at your assembler level is a little
bit ridiculous. Mathematically it's not guaranteed, and it's also
quite unlikely in practice.
A note about strong typing. This is definitely a good way to gain
efficiency. But a word of caution: you need to let the garbage
collector know about all of your types.
As for levels of typing, I have heard it characterized that languages
are either explicitly typed, or the type of each variable can be
inferred at compile time, which amounts to explicit typing as far as
the compiler is concerned.
Several less trivial things
What seems to me a good abstraction for this kind of problem is the
idea of a compiler _and_ an assembler, both machine dependent.
For a useful assembler I am imagining a program that takes a subset of
a high-level representation of _all_ assembly languages (the subset
the machine actually implements, of course) and converts it to that
machine's assembly language.
This representation could be called the LLL if we choose.
example:

for a RISC-type chip:
    (add R1 R2 R3)    # add R2 and R3 and put the result in R1
    (add R1 R2 24)    # add R2 and 24 and put the result in R1

for a PC:
    (add R1 R1 R2)    ->    add ax,cx    ; or something like that
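To make that concrete, here is a minimal sketch in C of how such an
assembler might lower the same abstract three-address add for two
targets; the function name and the target strings are just mine,
nothing here is settled:

    #include <stdio.h>
    #include <string.h>

    /* Sketch only: one abstract three-address op, two back ends.  The
       x86 back end has to split the op into its two-address form. */
    static void emit_add(const char *target,
                         const char *dst, const char *src1, const char *src2)
    {
        if (strcmp(target, "risc") == 0) {
            printf("add %s, %s, %s\n", dst, src1, src2);   /* add R1, R2, R3 */
        } else {                                           /* "x86"          */
            if (strcmp(dst, src1) != 0)
                printf("mov %s, %s\n", dst, src1);         /* dst = src1     */
            printf("add %s, %s\n", dst, src2);             /* dst += src2    */
        }
    }

    int main(void)
    {
        emit_add("risc", "r1", "r2", "r3");   /* -> add r1, r2, r3 */
        emit_add("x86",  "ax", "ax", "cx");   /* -> add ax, cx     */
        return 0;
    }

The point is only that the machine-dependent knowledge lives in one
small, replaceable back end.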
Some of the discussion indicates you are moving in this direction
anyway. The mostly machine-independent compiler would then need
special knowledge of how to properly construct assembly-language
routines, and of other machine-dependent details that only need to be
parameters for now.
I also see it as inevitable that an efficient very-high-level language
will need to compile expressions at run time: either for lambda
expressions, or to reduce run-time constants in inner loops and other
hot spots to actual constants instead of variables.
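As an illustration of the constant-folding case, here is a minimal
sketch of run-time compilation in C. It assumes an x86-64 Linux host
and the System V calling convention (obviously not the machines of the
day, and a hardened system may refuse the writable+executable
mapping); every name in it is mine:

    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    typedef int (*add_fn)(int);

    /* Generate "int f(int x) { return x + k; }" for a k that is only
       known at run time, so k becomes an immediate in the code instead
       of a variable fetched on every trip through an inner loop. */
    static add_fn make_adder(int k)
    {
        unsigned char code[] = {
            0x89, 0xf8,               /* mov  eax, edi       */
            0x05, 0, 0, 0, 0,         /* add  eax, imm32 (k) */
            0xc3                      /* ret                 */
        };
        memcpy(&code[3], &k, 4);      /* patch the constant in */

        void *buf = mmap(NULL, sizeof code,
                         PROT_READ | PROT_WRITE | PROT_EXEC,
                         MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED)
            return NULL;
        memcpy(buf, code, sizeof code);
        return (add_fn) buf;
    }

    int main(void)
    {
        add_fn add42 = make_adder(42);     /* 42 arrives at run time */
        if (add42)
            printf("%d\n", add42(8));      /* prints 50 */
        return 0;
    }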
Additionally, a disassembler based on the machine-independent assembly
language could be used to aid in manipulating already-compiled
expressions.
My thoughts on contexts, pattern matching, etc.
There is a lot of that in your discussion, and I just don't quite get
some of the higher-level programming concepts.
The one thing I am quite sure about is that for a language to expand
and grow without major growing pains, it needs a uniform syntax that
is the same for built-in language features and added features, and the
same again for data. Lisp almost provides both of these, and Forth
provides the first. A disassembler with uniform syntax, and some
super-environmental mumbo jumbo, might just complete the mess.
I think the explicit switching between compiling and running states in
Forth is a better model than macros in any language, so please include
this.
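A minimal sketch in C of the mechanism I mean (a toy outer
interpreter, all names mine): the same word stream is either executed
immediately or compiled into the current definition, and the words
`[' and `]' switch between the two states under program control.

    #include <stdio.h>
    #include <string.h>

    enum state { RUNNING, COMPILING };

    static enum state state = RUNNING;
    static const char *definition[64];   /* the word being compiled */
    static int def_len = 0;

    static void handle_word(const char *w)
    {
        /* "[" and "]" act immediately even while compiling -- that is
           the explicit state switch Forth hands the programmer. */
        if (strcmp(w, "[") == 0) { state = RUNNING;   return; }
        if (strcmp(w, "]") == 0) { state = COMPILING; return; }

        if (state == COMPILING)
            definition[def_len++] = w;         /* save it for later  */
        else
            printf("executing %s now\n", w);   /* run it right away  */
    }

    int main(void)
    {
        const char *input[] = { "init", "]", "dup", "*", "[", "report" };
        for (int i = 0; i < 6; i++)
            handle_word(input[i]);
        printf("compiled %d words into the current definition\n", def_len);
        return 0;
    }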
With a nice extensible base like the one I have imagined, especially
if it includes garbage collection, you should be able to implement
nearly anything you can imagine on top of it. Then you need only
identify the features that are needed for code compatibility. Writing
the services of an OS in such a language would be a good acid test to
see if you have made it compatible.
On to the garbage collector
I haven't heard or seen much about this except that you are going to
have one. I did some research a while back, and the best ones are
generational copying collectors. With a little work you can also make
these incremental, which is desirable in the long run, but incremental
collectors are much harder to implement, so that is best left until we
have something running.
To optimize the garbage collector, it really should know which objects
need finalization and where the pointers are, so it might be
worthwhile to do some long-range planning in this area as well.
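Here is a minimal sketch in C of what that knowledge might look like
(all names mine): each type's descriptor records the offsets of its
pointer fields and an optional finalizer, so the collector can trace
and finalize objects without guessing.

    #include <stddef.h>

    struct type_desc {
        size_t        size;                 /* object size in bytes          */
        int           n_ptrs;               /* how many pointer fields       */
        const size_t *ptr_offsets;          /* where they live in the object */
        void        (*finalize)(void *obj); /* NULL if none is needed        */
    };

    struct obj {
        const struct type_desc *type;       /* every object carries its type */
        /* object fields follow */
    };

    /* Trace one object: visit exactly the pointer fields the descriptor
       lists, nothing more. */
    static void trace(struct obj *o, void (*visit)(struct obj *child))
    {
        for (int i = 0; i < o->type->n_ptrs; i++) {
            struct obj **field =
                (struct obj **)((char *)o + o->type->ptr_offsets[i]);
            if (*field)
                visit(*field);
        }
    }

    /* When an object turns out to be dead, run its finalizer if any;
       its space can then be reclaimed or copied over. */
    static void reclaim(struct obj *o)
    {
        if (o->type->finalize)
            o->type->finalize(o);
    }

This also answers the earlier note about strong typing: the same
descriptors are how the collector gets told about all of the types.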
Just a few comments toward an OS
The most portable frameworks of functionality have been built on
microkernels with multiple servers: one for authentication and others
for filesystems (for us, possibly persistent address-space systems,
with a Unix file == an address space, etc.), all talking through
Mach-like message ports so the whole thing can be used as a
distributed OS.
Current research on Mach has optimized these calls to the point that,
on the same machine, they can be faster than normal kernel calls.
The other reason I propose this model is that it fits very well with
the current stated aims of Tunes, and it can be implemented on top of
nearly any multiprocess OS using IPC, so we could avoid some of the
low-level nitty-gritty for a while until we have workable code. Then
we could pick the best kernel to run it on. I like Mach (can you
tell?).
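To show the shape of the model (and only the shape; this is not the
real Mach interface, and every name below is mine): each service sits
behind a port, clients send it a request message and get a reply back,
and whether the server is local or remote hides behind the send.

    #include <stdio.h>

    struct message {
        int  op;            /* which operation is requested */
        char data[64];      /* request or reply payload     */
    };

    /* A server is anything that turns a request into a reply. */
    typedef void (*server_fn)(const struct message *req, struct message *reply);

    static void auth_server(const struct message *req, struct message *reply)
    {
        reply->op = req->op;
        snprintf(reply->data, sizeof reply->data,
                 "user '%s' authenticated", req->data);
    }

    static void fs_server(const struct message *req, struct message *reply)
    {
        reply->op = req->op;
        snprintf(reply->data, sizeof reply->data,
                 "opened address space for '%s'", req->data);
    }

    /* "Sending to a port" is a plain call here; in the real thing it is
       a kernel-mediated, possibly network-transparent message send. */
    static void send_to_port(server_fn server, const struct message *req,
                             struct message *reply)
    {
        server(req, reply);
    }

    int main(void)
    {
        struct message req = { 1, "ebiederm" }, reply;
        send_to_port(auth_server, &req, &reply);
        printf("%s\n", reply.data);

        struct message req2 = { 2, "/etc/motd" }, reply2;
        send_to_port(fs_server, &req2, &reply2);
        printf("%s\n", reply2.data);
        return 0;
    }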
----
I don't think I have helped to settle any debates, but I hope this
helps anyway.
Eric