[gclist] Get Real! Require GC (was re: quality of impl., etc.)
Henry G. Baker
hbaker@netcom.com
Mon, 22 Apr 1996 10:44:45 -0700 (PDT)
> From nickb@harlequin.co.uk Mon Apr 22 03:55:21 1996
> >Date: Mon, 22 Apr 1996 09:54:45 +0100
> >From: Nick Barnes <nickb@harlequin.co.uk>
> >
> >Rumour has it that some early Lisp Machines had broken GCs. Unlike
> >[typical programs in] some other languages, [some typical programs in]
> >Lisp allocate slowly enough that the absence of a GC used to be
> >acceptable.
>
> Good point. As I understand it, the early Lisp Machines had simple
> GC that was so bad that people would often turn it off, let the machine
> run until swap space was exhausted, and reboot.
Let's keep things in context. The early Lisp Machines had very little
memory -- 2 Mbytes or so -- and very little control over the placement
of objects. Things like pop-up menus could therefore get paged out, and
bringing them back in caused a lot of paging. Thus, due to memory
starvation, the early Lisp Machines were barely usable at all -- GC or no GC.
One of the main reasons for developing the MIT Lisp Machine was DEC's
intransigence about making very large address spaces available for
single programs. (Yes, the VAX offered large address spaces, but due
to its page table structure, it handled large _discontiguous_
addresses very badly.) The natural language projects, symbolic
algebra projects, speech understanding projects, etc., had all run
into the address space problem. It is my understanding that the ARPA
speech understanding project used more than 10 separate processes on
the DEC-20 just to be able to run, and everyone agreed that huge
amounts of time were being wasted 'shoe-horning' large projects into
small address spaces.
Unfortunately, I think most of us underestimated the amount of _real
memory_ that would be required to run these very large programs. As
a result, I don't think that the Lisp Machines became really usable
until they got 16-32 Mbytes of random-access memory.
Since the GC very definitely competed with the running program for
real memory space, _any_ GC turned an already sluggish machine into
a dog. So you either turned the GC off completely, or had it run after
you left for the day, which was a completely reasonable thing to do.
> One of the consequences of the lame GC's on the early Lisp machines was
> that people did in fact end up programming in "Lisp minus GC" rather
> than Lisp. To avoid exhausting swap space quickly (and running slowly),
> they went to a lot of trouble to use side effects instead of applicative
> techniques. In effect, the poor performance of the GC reintroduced
> problems Lisp was supposed to have solved at the language level.
Unfortunately, much of the Lisp Machine SW was written _before_ there was
any GC, so a premium was put on reducing excess consing. Most people
feel that that was a mistake, and that SW development would have gone
more smoothly and been less bug-prone had it been possible to rely on a GC.
> There were also facilities for stack-like allocation that you could use
> manually. (A lot like GNU obstacks.) This was the source of endless bugs,
> as I understand it, because it's not nearly as safe as programming in real
> Lisp---you get dangling pointers, etc. While I don't think there's anything
> wrong with providing such language extensions, I think there's something
> broken about a language implementation that effectively requires their
> use, just to get remotely passable performance. These days, GC is
> well enough understood that there's simply no excuse.
>
> (Maybe Henry or somebody else experienced with the old Lisp Machines could
> comment here---I'm getting out of my depth.)
I have been told that the Lispm compiler made use of stack allocation,
so that it could recover its memory afterward. Since compilation was
done extensively, and there was initially no GC, this was considered
a 'temporary' measure (like Bldg 20, for those who know MIT!!). However,
as Paul points out, this was a source of many bugs & restrictions -- e.g.,
I think that consing done by user-defined macros created dangling
references, and the compiler was not re-entrant (how many compilers do
_you_ know of that are? :-).
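To make the hazard concrete, here is a minimal sketch in modern Common
Lisp rather than the Lispm dialect of that era (DYNAMIC-EXTENT is the
closest standard analogue to those temporary-area/obstack tricks, and
the function name below is purely illustrative):

    ;; Request stack allocation for a temporary list, then let it escape.
    (defun leaky-temp-list ()
      (let ((temp (list 1 2 3)))
        (declare (dynamic-extent temp)) ; consing may go on the stack
        temp))                          ; returning TEMP escapes its frame
    ;; Calling (LEAKY-TEMP-LIST) is undefined: the caller is left holding
    ;; a dangling pointer into a stack frame that no longer exists.

With a real GC you simply cons and return; the collector reclaims the
cells once nothing refers to them.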
Once again, I think that most people would agree that being able to
rely on a GC would have saved a lot of work, and made the code much
more readable.
--
Henry Baker
www/ftp directory:
ftp.netcom.com:/pub/hb/hbaker/home.html