[gclist] Distributed Garbage Collection Revisited 1
H.C.C.Rodrigues
hccdr@ukc.ac.uk
Wed, 03 Jul 1996 18:29:07 +0100
>From my reading of the algorithms, they traverse the graph twice. Once to see
>if any node in the graph is reachable and a confirming trace to determine if
>the graph has changed since the first traversal. During the first trace each
>object in the graph notes when it was visited. Between the time the
>node is first visited and the time the confirming trace of the graph is done a
>barrier is built around the object and any new references are noted. When the
>confirming trace is done the node indicates that it has leaked a reference and
>therefore this chain cannot be collected. Some algorithms pass the first-pass
>information around with the traversal information. I am not sure where
>they keep the last-updated time information.
I agree with this. I was thinking they were relying on the "dirty" information.
I have already checked that they don't do that. At least Gustavo's doesn't.
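
Just to check that I am reading the scheme the same way, here is a rough sketch
in Python (the names and the simple counter used as a clock are mine, not taken
from any of the papers): the first trace stamps every member of the suspected
cycle, the barrier marks any member whose reference is copied out afterwards,
and the confirming trace refuses to collect if anything was marked.

    # Rough sketch only; invented names, not the actual algorithms discussed.

    class Obj:
        def __init__(self, name):
            self.name = name
            self.children = []             # outgoing references
            self.first_trace_stamp = None  # set by the first trace
            self.leaked = False            # set by the barrier

    def first_trace(root, clock):
        """First pass: stamp every object reachable from the suspected root."""
        seen, stack = set(), [root]
        while stack:
            obj = stack.pop()
            if obj in seen:
                continue
            seen.add(obj)
            obj.first_trace_stamp = clock
            stack.extend(obj.children)
        return seen

    def barrier_on_read(obj):
        """Barrier: the mutator copied a reference to obj after the first
        trace, so the reference may have escaped the subgraph."""
        if obj.first_trace_stamp is not None:
            obj.leaked = True

    def confirming_trace(members):
        """Second pass: collect only if no member leaked a reference."""
        return all(not m.leaked for m in members)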
I think the only problem is the degree of conservatism of the algorithm: we
may give up on collecting a cycle after a certain amount of work, even when the
cycle is actually garbage.
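
By "give up" I mean something like a work budget (again only an illustration;
the budget value is made up):

    def bounded_trace(root, budget=10000):
        """Trace with a work limit: if the budget is exceeded we return None
        and conservatively keep the cycle, even if it is in fact garbage."""
        seen, stack, work = set(), [root], 0
        while stack:
            work += 1
            if work > budget:
                return None           # give up; try again later
            obj = stack.pop()
            if obj in seen:
                continue
            seen.add(obj)
            stack.extend(obj.children)
        return seen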
>Before the object can be collected one must somehow determine if the last read
>of the node happened after the first trace. Determining that the object is
>locally unreachable at the time the confirming trace reaches it is not
>enough since the reference could have been leaked and the path to the leakage
>could have been removed before the confirming trace arrives.
>
>- Rick Hudson
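
If I understand this correctly, the final test would look roughly like the
following (a sketch with invented names; the timestamps could be logical
clocks): local unreachability at confirming-trace time is necessary, but the
object must also not have been read since the first trace stamped it.

    def can_collect(locally_unreachable, last_read_time, first_trace_time):
        """The object may be reclaimed only if the confirming trace finds it
        locally unreachable AND no reference to it was read (copied out)
        after the first trace: such a copy could have escaped, and the local
        path to it been removed, before the confirming trace arrives."""
        return locally_unreachable and last_read_time <= first_trace_time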
Thank you.
Helena Rodrigues.