[gclist] zillion objects

Ji-Yong D. Chung virtualcyber@erols.com
Fri, 24 Aug 2001 03:00:43 -0400


    In my preceding email,

> [I wrote]  This question is a general one. In brief:
>if you have a large number of objects, a small
>fraction of which die (very slowly), is there a
>method of automatic memory management
>(i.e., gc), which would not (1) try to copy most of
>those objects or (2) trace most of those pointers?

    Perhaps some background material on this would
be helpful.

    Basically, the question asks for a strategy for 
implementing an automatic memory manager 
(perhaps a GC) for a high-performance server that 
does lots of caching.

    With today's decline in memory prices, servers like 
to put *everything* in memory (a few gigabytes?).  Garbage
collectors or automatic memory managers for such 
servers need to be able to handle large
data without significant collection pauses.  The 
emphasis on caching will increase as 64-bit machines
become more prevalent.  

    I don't know if a generational collector would be
good for the large, cache-oriented servers described above.
A generational collector will push all stable data into
its oldest-generation arena.  And eventually, it will
still need to perform garbage collection on that
generation.

    When it does, it has to scan perhaps up to a few gigabytes
of data (perhaps tens or hundreds of gigabytes in
extreme cases).  I am guessing that this has to 
result in significant GC pauses.  
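To put rough numbers on the pause concern, here is a back-of-the-envelope sketch.  The scan rate is an assumed, illustrative figure (real tracing rates depend on pointer density, cache behavior, and the collector itself), but the proportionality is the point: a stop-the-world scan of a multi-gigabyte old generation takes on the order of seconds.

```python
# Rough estimate of a stop-the-world mark pause as a function of
# old-generation size.  The scan rate (bytes traced per second) is
# an assumed figure for illustration, not a measurement.

def estimated_pause_seconds(old_gen_bytes, scan_rate_bytes_per_sec=1e9):
    """Pause is proportional to how much live data must be traced."""
    return old_gen_bytes / scan_rate_bytes_per_sec

GB = 1 << 30
for size_gb in (2, 16, 128):
    pause = estimated_pause_seconds(size_gb * GB)
    print(f"{size_gb:>4} GB old generation -> ~{pause:.1f} s pause")
```

Even if the assumed 1 GB/s rate is off by an order of magnitude in either direction, the pause still scales linearly with the amount of stable data the collector must trace.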

    When one has so much data in RAM, cache
misses caused by client requests will be relatively
rare.  This means the rate of garbage generation
(garbage generated by replacing the pointer to an old 
object with the pointer to a new object read in from disk
due to a cache miss) would be slow as well.
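The relationship can be sketched with a toy model (the cache size and miss rate below are made-up parameters for illustration): the only garbage created is the old object displaced on each miss, so the death rate tracks the miss rate, and only a tiny fraction of the cache dies, even over a million requests.

```python
import random

def garbage_per_requests(num_requests, miss_rate, seed=0):
    """Count objects that die: one per cache miss, when a slot's
    pointer is overwritten with the newly loaded object."""
    rng = random.Random(seed)
    return sum(1 for _ in range(num_requests) if rng.random() < miss_rate)

cache_objects = 10_000_000   # assumed number of cached objects, for scale
requests = 1_000_000
dead = garbage_per_requests(requests, miss_rate=0.001)  # assumed 0.1% miss rate
print(f"{dead} objects died ({100 * dead / cache_objects:.4f}% of "
      f"{cache_objects:,} cached objects) over {requests:,} requests")
```

With numbers like these, a tracing collector that scans all ten million cached objects does so to reclaim roughly a thousand of them, which is exactly the mismatch the question is about.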