Scheme vs. CommonLisp vs. the World

Henry G. Baker
Tue, 13 May 1997 08:35:47 -0700 (PDT)

> I don't quite understand why you think CL's type specifiers are more
> complex than they need to be, given Common Lisp's big set of types.
> The syntax could be improved, e.g. declare function parameter and
> return types in the lambda list, have a type-specifying let, but all
> this has already been done by individuals for their own use and
> doesn't require a CL incompatibility.  The biggest issue in getting
> simpler is to boil the number of types down by moving things from the
> core language to libraries.
> Martin

The major problem with type specifiers is solving the C-versus-Ada
problem.  C uses things like 'short' and 'long' to specify sizes, but
never says how big they are.  Ada allows you to provide ranges like
0..65535, but never says how these are mapped onto the hardware.
Common Lisp provides the Ada capability of specifying arbitrary
ranges, which, together with the ability to take unions and
intersections of ranges, makes the type system totally ridiculous.
See my paper (also available as .ps) for how bad things can really
get.

A truly reflective system would give the user control over how these
mappings were done.  A 'poor man's' reflective system gives the user
access to the compiler, if necessary.

Probably the most conservative solution is adding specific types to
the compiler on an as-needed basis.  Thus, rather than providing for
arbitrary numeric ranges, one might provide fixnum16's, which are
_exactly_ 16-bit 2's complement integers (with associated functions to
gain access to overflow bits).  This solves the C problem, because it
pins down exactly what size the integers are, without eliminating the
possibility of adding new datatypes as the need should arise.  Of
course, good programming practice would suggest hiding these decisions
carefully within small, separate modules, so that the entire module
could be replaced with a different one if you wanted a
different-sized fixnum.

This is not really such a bad idea, because there is nowhere near the
wide variety of hardware datatypes that there used to be.  I haven't
seen 18-bit or 36-bit integers in a long time.

Henry Baker
www/ftp directory URL: