[gclist] Distributed Garbage Collection Revisited 1

Giuliano Carlini giuliano@ix.netcom.com
Wed, 03 Jul 1996 20:44:49 -0700


H.C.C.Rodrigues wrote:
> I think that the degree of conservatism you mentioned here is important. In
> your back search you can find some object marked locally reachable by the last
> local collector which is not locally reachable in the current state. This would
> make you give up, and some amount of work would be wasted!

It would. The key is the scheduling policy for when to start back searches
on zombies.
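
As a rough illustration (in C; the names, fields, and the three-collection
threshold below are invented for the sketch, not part of the algorithm), one
such policy might defer a back search until a zombie has sat quietly through a
few local collections, so that stale reachability marks from the last local
trace have had a chance to be corrected before any expensive work starts:

    #include <stddef.h>

    /* Illustrative bookkeeping only. */
    struct zombie {
        void          *object;      /* suspected-garbage object               */
        int            quiet_gcs;   /* local GCs since it was last touched    */
        struct zombie *next;
    };

    #define BACK_SEARCH_THRESHOLD 3 /* arbitrary tuning knob */

    void start_back_search(struct zombie *z);  /* hypothetical; defined elsewhere */

    /* Called at the end of each local collection: only zombies that have
     * been stable for several collections get a back search started. */
    void schedule_back_searches(struct zombie *list)
    {
        struct zombie *z;
        for (z = list; z != NULL; z = z->next) {
            if (++z->quiet_gcs >= BACK_SEARCH_THRESHOLD)
                start_back_search(z);
        }
    }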

> Furthermore, I think the algorithm is more complicated. I think there is a
> problem of termination introduced by the barriers. It may happen that a local
> collection is performed during the first or second trace. This local collection
> would delete the "dirty" information, and consequently live objects could be
> wrongly collected, or another confirming trace would be necessary, and
> another.....

I'm not sure what problem you're describing. I'll take a whack at answering
what I think you are saying. If I've guessed wrong about the nature of the problem
you see, perhaps you could explain a bit more.

First guess: You are concerned the local GC will affect a zombie's reachability
graph. But a local mutator can't affect a zombie's reachability graph, and thus
the local GC won't either. Only receiving an RPC which mentions the zombie can
cause a change in its graph. But that RPC also marks the zombie as dirty, which
will cause any back search that contains the zombie to abort.
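
To illustrate (a minimal sketch in C; the types and names here are my own
assumptions, not the actual data structures): the RPC receive path just sets a
dirty bit on the scion, and the back search re-checks those bits at the end and
gives up if any is set.

    #include <stddef.h>

    /* Sketch only: invented names, not the real bookkeeping. */
    struct scion {
        void         *target;   /* local object this scion protects          */
        int           dirty;    /* set when an incoming RPC mentions target  */
        struct scion *next;
    };

    /* The RPC layer calls this for every local reference carried by an
     * incoming message.  Dirtying the scion is what invalidates any back
     * search that has already passed through it. */
    void note_incoming_reference(struct scion *s)
    {
        s->dirty = 1;
    }

    /* After tracing backwards from the zombie, re-check every scion the
     * search visited; if any was dirtied concurrently, abort. */
    int back_search_confirm(struct scion *visited)
    {
        struct scion *s;
        for (s = visited; s != NULL; s = s->next)
            if (s->dirty)
                return 0;       /* abort: reachability may have changed */
        return 1;               /* zombie confirmed unreachable */
    }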

Second guess: You are concerned that a local GC will reclaim the bookkeeping for
the back searches or the dirty scion data. This is handled simply by making sure
these are rooted either in a global variable or at a scion.
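
For example (again just a sketch with made-up names): keeping the active-search
records on a list hanging off a global variable is enough, because the local
collector treats globals as roots and so will never reclaim the search state or
the dirty-scion records reachable from it.

    struct scion;   /* incomplete type; see the sketch above */

    /* Sketch only.  Because 'active_searches' is a global, everything
     * reachable from it survives local collections, so back-search state
     * and dirty-scion records can't be reclaimed out from under an
     * in-progress search. */
    struct back_search_state {
        struct scion             *zombie;    /* object being back-searched    */
        struct scion             *visited;   /* scions the search has touched */
        struct back_search_state *next;
    };

    static struct back_search_state *active_searches;   /* a GC root */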

> I would like to know what do you think about this.

Thanks for your comments. I'm curious: could you tell us more about your approach?

> Thank you.
> Helena Rodrigues.

g