[gclist] Quality of implementation in compilers.
Sat, 20 Apr 1996 09:35:19 -0500
> | Suppose the above crashes during the first iteration. That does *not*
> | prove that the system has no gc -- maybe the system just doesn't have
> | enough memory. Suppose it doesn't crash after "many" iterations. That
> | doesn't prove it *does* have gc -- maybe it just has lots of memory, and
> | maybe if you had been willing to wait for one more iteration, it would
> | have crashed. In practice, you can know how much memory is on a
> | particular machine, and write the above test accordingly, but *language*
> | *definitions* don't normally say anything about how much memory is
> | available.
> However, it is not difficult to add a measure of memory usage to a
> language definition. This can be done in terms of abstract units so
> that proportionality constants can be accepted. Any implementation
> that uses less abstract storage than that specified by the language
> semantics is acceptable, but one that uses more would not.
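The quoted proposal could be sketched roughly as follows. This is only an
illustration of "abstract units with proportionality constants", not taken
from any standard; all names (AbstractMeter, conforming, the 2x constant)
are hypothetical.

```python
# Hypothetical sketch: the semantics charges abstract cells for live data,
# and an implementation conforms if its real cell count never exceeds some
# constant multiple of the abstract count.

class AbstractMeter:
    """Counts abstract storage cells the language semantics says are live."""
    def __init__(self):
        self.live = 0

    def alloc(self, n=1):
        self.live += n

    def free(self, n=1):
        self.live -= n

def conforming(actual_cells, meter, c=2):
    # Acceptable: uses at most c times the abstract storage the
    # semantics specifies.  More than that would be non-conforming.
    return actual_cells <= c * meter.live

meter = AbstractMeter()
meter.alloc(100)                    # semantics: 100 cells reachable
print(conforming(150, meter))      # within the proportionality bound
print(conforming(300, meter))      # uses more than the bound allows
```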
Purists call this a quality of implementation issue. For example,
how C++ treats #pragma is undefined, like a lot of other things.
So a legal C++ compiler could see #pragma and format your hard disk.
This would be legal, but a poor quality of implementation. Come to
think of it, it would be a legal implementation of C as well, but it
might land the programmer behind bars. Imagine trying to defend
against a charge of computer sabotage on the grounds that the
behaviour of #pragma is undefined in the standard.
A legal Lisp compiler could simply not have a garbage collector.
This would be a poor quality of implementation. You could easily
add something to the standard that would require a garbage collector.
An endless loop which created unrooted doubly linked lists, for
example, could be defined as running until you halted it by cancelling
the program. The same is true of tail recursion. However, poor
implementations are rarely saleable.
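The loop test above might look like the following sketch (bounded here so it
actually terminates; the thought experiment runs it endlessly). The Node class
and iteration count are illustrative. Each iteration drops the only root to a
doubly linked pair, so the links form an unreachable cycle: a collector keeps
space constant, while a collector-less implementation grows until it crashes.

```python
# Bounded sketch of the "endless loop creating unrooted doubly linked
# lists" test for the presence of a garbage collector.
import gc

class Node:
    def __init__(self):
        self.prev = None
        self.next = None

def make_unrooted_pair():
    a, b = Node(), Node()
    a.next, b.prev = b, a   # the two links form a reference cycle
    # nothing is returned, so the pair is immediately unrooted garbage

before = len(gc.get_objects())
for _ in range(10_000):      # the real test would loop forever
    make_unrooted_pair()
gc.collect()                 # a cycle collector reclaims the pairs...
after = len(gc.get_objects())
print(after - before < 1000) # ...so live objects don't grow with iterations
```

Note that in CPython the back-and-forth links defeat plain reference
counting; it is the cycle collector (gc) that reclaims these pairs, which is
exactly the distinction the test is probing.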