[gclist] Fragmentation issues

Paul R. Wilson wilson@cs.utexas.edu
Thu, 29 Feb 1996 15:39:34 -0600


>I was wondering what people's experience has been with memory fragmentation
>in garbage collected systems.  In particular, conservative collectors
>might like to use certain allocation mechanisms to allow them to easily
>identify pointers to object interiors.  (For example, all objects on a
>page might have to have the same size.)
>
>Does anybody have any ideas/anecdotes/experimental results about the impact
>of such decisions on overall memory fragmentation?
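
(To make the quoted trick concrete: if every object on a page has the
same size, a possibly-interior pointer can be mapped back to its
object's base with a mask and a modulus.  A rough sketch in C follows;
the page-header layout and names here are invented for illustration,
not taken from any real collector, which would typically keep this
metadata in a side table instead.)

    #include <stdint.h>
    #include <stddef.h>

    #define PAGE_SIZE 4096      /* assumed to be a power of two */

    /* Assumed per-page header holding the page's one object size. */
    struct page_header {
        size_t obj_size;  /* every object on this page is this size */
        /* ... mark bits, free list, etc. ... */
    };

    /* Map a possibly-interior pointer to the base of the object that
       contains it.  Object starts within a page form an arithmetic
       sequence of stride obj_size, so rounding down is a modulus. */
    static void *object_base(void *p)
    {
        uintptr_t addr  = (uintptr_t)p;
        uintptr_t page  = addr & ~(uintptr_t)(PAGE_SIZE - 1);
        struct page_header *hdr = (struct page_header *)page;
        uintptr_t first = page + sizeof(struct page_header);

        if (addr < first)
            return NULL;  /* points into the header, not an object */
        return (void *)(addr - (addr - first) % hdr->obj_size);
    }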

My impression is that the best guide is anecdotal evidence from the
PARC conservative collectors.  I don't know of any good measurements of
fragmentation for GC'd systems.

Most of the measurements of fragmentation for non-GC'd systems are totally
bogus.  The ones that aren't usually don't separate out the costs very
well---you can't tell what's "real" fragmentation and what's overhead due to
headers, footers, and minimum block sizes.
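
To see how much that overhead can matter, here's a back-of-the-envelope
sketch in C.  The header, footer, minimum-block, and alignment numbers
are made up for illustration, not taken from any particular allocator;
the point is that a study that lumps this overhead in with fragmentation
will badly overstate the "real" fragmentation for small objects.

    #include <stdio.h>
    #include <stddef.h>

    /* Hypothetical allocator parameters; illustration only. */
    #define HEADER   8          /* per-block header */
    #define FOOTER   8          /* per-block footer */
    #define MIN_BLK 16          /* minimum block size */
    #define ALIGN    8          /* block alignment */

    /* Bytes actually consumed by a request of `request` bytes. */
    static size_t block_size(size_t request)
    {
        size_t need = request + HEADER + FOOTER;
        if (need < MIN_BLK)
            need = MIN_BLK;
        return (need + ALIGN - 1) & ~(size_t)(ALIGN - 1);
    }

    int main(void)
    {
        size_t requests[] = { 1, 12, 24, 100 };
        for (size_t i = 0; i < sizeof requests / sizeof requests[0]; i++) {
            size_t b = block_size(requests[i]);
            printf("request %3zu -> block %3zu, overhead %3zu bytes\n",
                   requests[i], b, b - requests[i]);
        }
        return 0;
    }

With these (made-up) parameters a one-byte request ends up in a 24-byte
block, so naive accounting calls 23 of those 24 bytes "fragmentation".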

Given that caveat, check out the results in our allocator survey on our
web site.  Simple segregated storage typically gives significantly higher
fragmentation than best fit or address-ordered first fit, but it's not
awful.  And some simple optimizations may get most of that back---e.g.,
keeping track of whether there are *any* live objects in a page, and
reclaiming the whole page at a whack if it's completely empty.  I think
that usually works OK, but don't know of any good studies.  (You can
make that work better by using some unit smaller than the page size, 
e.g., 512 bytes.)
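
Here's a minimal sketch of that page-reclamation idea in C.  The
struct layout, field names, and PAGE_SIZE are assumptions made up for
this sketch, not any particular collector's data structures.

    #include <stddef.h>

    #define PAGE_SIZE 4096      /* or a smaller unit, e.g. 512 bytes */

    /* One page of a simple segregated-storage heap: every object on
       the page has the same size, so one size field per page suffices. */
    struct page {
        struct page *next;      /* link in this size class's page list */
        size_t obj_size;        /* size of every object on this page */
        size_t live_count;      /* number of live objects on the page */
        /* ... free list, mark bits, object storage ... */
    };

    /* Hand a completely empty page back to the page pool; the
       unlinking and pool bookkeeping are elided in this sketch. */
    static void reclaim_whole_page(struct page *pg)
    {
        (void)pg;
    }

    /* Called when the collector finds that an object on pg has died.
       When the last live object goes, the whole page is reclaimed at
       a whack instead of threading each dead object onto a free list. */
    static void object_died(struct page *pg)
    {
        if (--pg->live_count == 0)
            reclaim_whole_page(pg);
    }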

My take on it is that fragmentation is mostly a myth: given a good allocator,
most programs don't suffer much fragmentation.  So non-copying collection is
much better than people generally think.

>Cheers,
>-Sanjay Ghemawat