New kid on the block / introduction

M. Edward (Ed) Borasky znmeb at cesmail.net
Sat Nov 4 08:48:45 PST 2006


Tom Novelli wrote:
> On 11/4/06, Armin Rigo <arigo at tunes.org> wrote:
> 
> Hey, Armin... just the guy we need to talk to!
> 
>> Newsgroups like comp.lang.python are full of statements like "just
>> implement Python in Lisp and everything will be great".  By now, you'd
>> think that someone would really have tried, but there is no such
>> implementation showing great results - even as a prototype.
>>
>> It doesn't work.  There are several reasons for that.  One reason (try
>> google for more detailed answers) is that Python, Ruby, etc. are 3%
>> syntax, and 97% semantics defined by a large library.  It's trivial to
>> translate the AST of these languages to S-exps, e.g. with a Ruby-to-Lisp
>> parser.  Then you only solve 3% of the problem.  The difficult and
>> tedious bit is to implement the semantics and library and object model,
>> and Lisp compilers are not going to magically make these 97%
>> super-efficient.  (I should add that I consider the 3% the boring bit;
>> parser generation stuff was solved decades ago, but sadly many compiler
>> construction lectures still focus mostly on that.)
> 
> I figured as much.  I just noticed that Python's lexical scope rules
> are incompatible with Lisp, and broken, IMHO.  In LISP, I could have
> defined a few nested functions, but in Python there's no way to modify
> the outer function's local variables from the inner functions.
> 
> Is anyone looking at the big picture?  LISP seems to have a mature,
> well-designed semantic model, while these script languages just have
> ad-hoc semantics cobbled together over the years, still changing every
> year, and borrowing more and more from LISP.  Has anyone tried to
> define a variant of Python (or Ruby, ECMAscript, etc.) that's
> compatible with the Lisp/Scheme model?  I ask because I'm interested
> in doing that, and I wouldn't want to waste my time if it's already
> been done, especially if it's not promising!

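For what it's worth, the scoping limitation Tom describes is easy to
demonstrate. A minimal sketch (plain Python; the names are made up for
illustration):

    def outer():
        counter = 0
        def inner():
            # This assignment creates a brand-new local variable inside
            # inner(); it does not rebind outer()'s counter.
            counter = 1
        inner()
        return counter

    print(outer())   # prints 0 -- the inner assignment never reached outer()

The usual workaround is to smuggle the value inside a mutable object such
as a one-element list, which is exactly the kind of wart being complained
about. In Scheme or Common Lisp the inner function could simply set! or
setq the outer binding.
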
Programming languages, their creators and their users co-evolve. It's
that simple. There is no "big picture" -- it's a complex adaptive
system, as defined by John Holland.

The languages that have been around a long time -- Fortran, Lisp, and
Algol (in its current dialects of C, C++, C# and Java) -- are still
recognizable to people who learned them long ago. I can still (probably)
program in Fortran IV on a modern compiler, though I think Fortran II's
"READ INPUT TAPE" and "WRITE OUTPUT TAPE" are gone from most current
dialects. And I can still (probably) write LISP 1.5 code in any of the
Common Lisp systems out there.

Now, given co-evolution and competition, you end up with niches. The
so-called scripting languages evolved as replacements for csh/ksh
driving sed and awk. So they pretty much all handle regular expressions
and pipes natively and, as Armin notes, wrap humongous collections of
libraries.
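
A throwaway illustration of the kind of job they inherited from that
toolchain, in Python (the ERROR pattern and the assumption that matching
lines carry a number in their second column are made up for the example):

    import re, sys

    # Sum the second column of lines matching ERROR -- the sort of
    # one-liner that used to be: grep ERROR | awk '{s += $2} END {print s}'
    total = 0
    for line in sys.stdin:
        if re.search(r"ERROR", line):
            fields = line.split()
            total += int(fields[1])
    print(total)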

And they all have their own unique, well-defined C language interfaces.
The SWIG project is an attempt to collect all of this interface
knowledge into one package -- a noble effort, and an eminently useful
one. That would be where I'd start -- look at the semantics SWIG has to
deal with in wrapping nearly all of C and C++ into something like a
dozen scripting languages, including Clisp and a couple of Scheme
implementations, Perl, Python and Ruby, PHP4, and Lua, Pike and OCaml.

When you come right down to it, there are really only half a dozen or so
truly unique programming languages. My list is *macro* assembler,
Fortran, Lisp, Forth, APL and Smalltalk. I've gotten into arguments
about whether Algol should be in there or not, but so far, nobody has
convinced me that it represents anything more than a merging of concepts
from Fortran and Lisp. And perhaps macro assembler and Forth don't need
to both be there.


