[gclist] Unreal
Nigel Bree
nbree@kcbbs.gen.nz
Sat, 17 Jul 1999 23:38:25 +1200
Tim Sweeney wrote:
> I'm a game developer who is interested in garbage collection. The most
> recent game engine I worked on (Unreal, see http://www.unreal.com/)
> contains a built-in scripting language quite similar to Java, that is
> based on garbage collection for memory management.
Nice to play, too!
> My script language maintains class "meta-data" listing all variables and
> their references, so it is a simple process to determine where object
> references reside in other objects.
To satisfy the curious (and to cut down on suggestions about things you
already know about), it would be interesting to tell the list from which
source you drew your meta-objects; for instance, CLOS (e.g. _Art of the
Meta-Object Protocol_) or Smalltalk or one of its variants, like the late
lamented Actor from Whitewater, which provided Borland (now Inprise) with
ObjectWindows.
> 1. Since >80% of my objects are constant at runtime, I can be clever and
> precompute information about their internal references and cycles (since
> they're known not to change after construction), and not have to ever check
> them in realtime for my garbage collector. Is this a generally known /
> researched technique, or did I invent something here? :-)
It's reasonably well-known, and some Smalltalk systems (and probably a few
Lisp systems - there have been so many!) have done this. It's often observed,
for example, that in systems where code and data are otherwise treated
uniformly, code is rarely mutated; this can be viewed as an extreme case of
the generational hypothesis.
If you have a generational collector then there's little effort involved in
placing objects from your persistent state directly into the "oldest"
generation, which is only collected on rare occasions, so you get this
behaviour essentially "for free".
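To make that concrete, here's a toy C++ sketch of my own (nothing to do with
Unreal's actual code): have the loader flag every load-time constant and,
assuming the constant set is closed - constants only refer to other
constants - the mark phase then never has to look inside them at all:

    #include <vector>
    #include <cstddef>

    struct Obj {
        bool constant;              // set once by the loader, never cleared
        bool marked;                // per-collection mark bit
        std::vector<Obj*> refs;     // outgoing references, from class meta-data
    };

    void Mark(Obj* o) {
        if (o == 0 || o->marked || o->constant)
            return;                 // constants are neither scanned nor reclaimed
        o->marked = true;
        for (std::size_t i = 0; i < o->refs.size(); ++i)
            Mark(o->refs[i]);
    }
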
On the other hand, this might tie in more with your approach to VM than to
anything else; the constant objects will be _persistent_ objects in the level
files, and so can be evicted from memory and reloaded on demand (on Win32,
which is your main platform, using the various page management APIs can let
you do tricky things).
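For instance - and this is only a rough sketch of mine, not a claim about how
Unreal actually loads levels - mapping the constant portion of a level file
read-only gives you pages which are always clean, so the OS can discard and
re-fault them on demand without your collector ever getting involved:

    #include <windows.h>

    const void* MapLevelData(const char* path, HANDLE* file, HANDLE* mapping)
    {
        *file = CreateFileA(path, GENERIC_READ, FILE_SHARE_READ, 0,
                            OPEN_EXISTING, FILE_ATTRIBUTE_NORMAL, 0);
        if (*file == INVALID_HANDLE_VALUE)
            return 0;
        *mapping = CreateFileMappingA(*file, 0, PAGE_READONLY, 0, 0, 0);
        if (*mapping == 0) {
            CloseHandle(*file);
            return 0;
        }
        // A read-only view: clean pages are cheap for the VM system to evict.
        return MapViewOfFile(*mapping, FILE_MAP_READ, 0, 0, 0);
    }
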
Unreal definitely requires large (by general consumer standards) amounts of
physical memory, and one of the questions you no doubt spend a lot of time
considering is how much effort to go to in dealing with the fact that the
Win32 platforms do such a poor job of managing VM.
> In this case, assume that the transformPointByMatrix function is defined in
> some other module, so I can't compile-time inline it. But with the
> "visitor" keyword, the compiler *can* realize that the objects "p" and "m"
> can't possibly gain any roots in the code above, so it can be explicitly
> destructed, without any GC overhead. Whereas, without the "visitor"
> keyword, both "p" and "m" become garbage and the collector eventually has
> to deal with them.
I haven't done UnrealScript programming, so I'm not entirely certain quite
what "compile time" means for you (as opposed to "link time" or "load time"),
but... your compiler can fairly easily do "escape analysis" on code to
determine what parameters passed to methods _cannot_ result in capturing a
reference, since you don't permit long-lasting closures or continuation
capture. In the long run, your compiler will be better at it than most humans
(and it will work with existing code).
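A toy rendering of the idea (the little IR here is invented purely for
illustration - it is certainly not your compiler's): a parameter "escapes"
only if some instruction stores it into a longer-lived location or returns
it; otherwise the callee is free to reclaim the argument when it exits:

    #include <vector>
    #include <cstddef>

    enum Op { LOAD_FIELD, STORE_FIELD, STORE_GLOBAL, RETURN_VALUE, CALL_NO_CAPTURE };

    struct Instr {
        Op  op;
        int operand;    // which local/parameter the instruction uses
    };

    bool ParamEscapes(const std::vector<Instr>& body, int param)
    {
        for (std::size_t i = 0; i < body.size(); ++i) {
            if (body[i].operand != param)
                continue;
            if (body[i].op == STORE_FIELD || body[i].op == STORE_GLOBAL ||
                body[i].op == RETURN_VALUE)
                return true;    // a reference to the argument may outlive the call
        }
        return false;           // safe to destruct the argument on return
    }
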
> So I'd like
> the language to make it *impossible* for a programmer to sneakily "save off
> a copy of the object".
But if the compiler can _check_ for this property (and without the ability to
check the truth of the programmer's assertion, they can get it wrong) then
the compiler can just as easily _infer_ it. The value of the keyword is thus
either in a) documentation, or b) to deal with limitations of your separate
compilation model.
> Is this a theoretically sound technique?
Well, there's kind of a precedent in the uniqueness (UNIQUE) types of the
Clean programming language, but there's a clear contrast as well. The
guarantee provided by UNIQUE is that there will only ever be a _single_
reference to an object, which is not only a vastly stronger guarantee, it
opens up a bigger optimization opportunity - Clean is a pure-functional
language, and unique objects provide a means both to use destructive
assignment and to deal with side-effecting operations such as I/O.
In your case, having adopted a C++/Java style of object, destructive
assignment is a given and so the benefits are less clear :-)
> Has anyone found a way to implement "explicitly managed object lifetimes"
> in OOP soundly?
Charles Fiterman at Geodesic has a lot to say about this!
[Hi Charles! Remember me from Mark Williams? Still working on "Braid"? ]
Are you familiar with Eiffel? While we often think about such things (handles
which need immediate reclamation) in storage management terms, that's not
really always appropriate. Eiffel's precondition/postcondition model puts the
issue of whether e.g. a window is open or closed into a much wider realm of
correctness guarantees.
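Rendered in C++ (my own toy example, with runtime asserts standing in for
Eiffel's require/ensure clauses, and the names invented for the occasion),
the "is this window still open?" question becomes a contract on the
operations rather than a question for the storage manager:

    #include <cassert>

    class Window {
        bool open_;
    public:
        Window() : open_(false) {}
        void Open()  { assert(!open_); /* acquire the OS handle */ open_ = true;  }
        void Draw()  { assert(open_);  /* precondition: must be open */           }
        void Close() { assert(open_);  /* release the OS handle */ open_ = false; }
        ~Window()    { assert(!open_); /* postcondition: closed before disposal */ }
    };
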
Now, Eiffel uses a lot of static analysis to make the condition checking
useful, and Eiffel doesn't have a C-like separate compilation model either.
But Meyer's contract-driven model of design is a powerful one. I doubt
whether your target audience would enjoy such things, but if the bulk of what
they are doing is re-using your basic UnrealScript objects (which is what I
would suspect) then it may be practical.
> 4. References between objects in my game (and in 3D environments in
> general) tend to be clustered: they form groups, where objects within a
> group contain tons of references to other objects in the group, but
> objects in a group contain a fairly small number of references to *other*
> groups. So I've been trying to devise a GC algorithm that takes this
> into account. Any suggestions?
Not really, although I'd observe that using particular kinds of copying
strategies can ensure that such "cliques" stay together in memory. If the
logical connectedness between the objects is matched by temporal locality of
reference (which is likely given how your game works) then that may be
beneficial for caching purposes.
[ The cache thrashing Unreal goes through when encountering new areas within
a level is quite striking. I'm looking forward to a K7 sometime soon myself
:-) ]
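One simple-minded way to exploit that - a sketch of mine, under the
assumption that a "group" is known when its objects are created - is to give
each group its own bump-pointer arena, so the heavily cross-referenced
objects end up adjacent in memory regardless of what the collector later
decides to do with them:

    #include <cstddef>
    #include <cstdlib>

    // One arena per object group; objects allocated together stay adjacent,
    // so chasing intra-group references stays cache-friendly.
    class GroupArena {
        char*       base_;
        std::size_t size_;
        std::size_t used_;
    public:
        explicit GroupArena(std::size_t bytes)
            : base_(static_cast<char*>(std::malloc(bytes))), size_(bytes), used_(0) {}
        ~GroupArena() { std::free(base_); }

        void* Allocate(std::size_t bytes) {
            bytes = (bytes + 7) & ~std::size_t(7);   // keep 8-byte alignment
            if (used_ + bytes > size_)
                return 0;                            // this group's arena is full
            void* p = base_ + used_;
            used_ += bytes;
            return p;
        }
    };
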
> 5. Can anyone point me to techniques I can use in my language / compiler to
> make garbage collection simpler or faster? I'm lucky to have freedom over
> designing the language, compiler, and runtime, so anything is possible. And
> I need to manage thousands of objects in realtime at 60 frames per second, so
> performance is important.
The one thing I don't care to guess (and which you may know) is the rough
distribution of object lifetimes in UnrealScript. I happen to be a big fan of
Cheney-style generational copying collectors (in my case, I've written a
partially-conservative one for work - our application, Ghost
http://www.ghost.com contains a lot of, ah, "legacy code") but if your
newly-allocated objects not only tend to live a reasonable time but tend to
be _large_ then you may wish to eschew copying. Since your application isn't
exactly platform-neutral, how your intended platforms deal with caching will
likely affect your decisions here.
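For reference, the evacuation step of a Cheney collector is tiny. This is a
deliberately simplified sketch of mine (a vector stands in for the to-space
bump allocator and scan pointer); note that the breadth-first scan order is
also the placement order, which bears on your question 4:

    #include <vector>
    #include <cstddef>

    struct Obj {
        Obj*              forward;   // 0 until the object has been evacuated
        std::vector<Obj*> refs;
    };

    static Obj* Copy(Obj* o, std::vector<Obj*>& tospace) {
        if (o == 0) return 0;
        if (o->forward) return o->forward;   // already evacuated
        Obj* c = new Obj(*o);                // stands in for a to-space bump allocation
        c->forward = 0;
        o->forward = c;
        tospace.push_back(c);
        return c;
    }

    void Collect(std::vector<Obj*>& roots, std::vector<Obj*>& tospace) {
        for (std::size_t i = 0; i < roots.size(); ++i)
            roots[i] = Copy(roots[i], tospace);
        // The Cheney scan: fix up references, copying anything not yet moved.
        for (std::size_t scan = 0; scan < tospace.size(); ++scan)
            for (std::size_t j = 0; j < tospace[scan]->refs.size(); ++j)
                tospace[scan]->refs[j] = Copy(tospace[scan]->refs[j], tospace);
        // From-space can now be released wholesale (not shown).
    }
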
> 6. Since this is a game, realtime performance is more important than
> anything else. I can afford to waste a certain percentage of available
> memory for garbage objects, and I can afford the time to maintain reference
> counts and incrementally scan to collect garbage, but I *can't* afford to
> have any long pauses -- and here in game land, spending more than 1 msec on
> garbage collection per 15 msec frame is the limit of acceptability. So,
> what is the best incremental "do a little bit of GC work at a time"
> approach? Note that I don't have to worry about "references to objects
> stored on the stack", since I can design my game to do garbage collection
> once per frame, in my update loop where I'm 100% sure there are no
> references to object on the stack.
Well, this is tough. In reality, having played Unreal, I can testify that the
slowdowns in there already (e.g. when entering new areas for the first time)
are tremendously disruptive to the immersion. So, I appreciate where you are
coming from :-)
To deal with this, I think the answers to your questions 4 and 5 need to be
considered first. The real-time algorithms I am familiar with tend to exact a
heavy penalty in terms of overall algorithmic efficiency, and in any case I
would conjecture that cache considerations are your number one order of
business.
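That said, the basic shape of "a little bit of work per frame" is simple
enough; the genuinely hard part is keeping the partially-built mark state
consistent while the game mutates objects between slices, which in general
means a write barrier. A sketch of mine (not any particular published
algorithm), bounded here by an object-count budget, though a clock check
works just as well:

    #include <vector>
    #include <cstddef>

    struct Obj {
        bool              marked;
        std::vector<Obj*> refs;
    };

    // A persistent mark stack: each frame does a bounded amount of marking.
    // Glossed over: mutations between slices must be caught by a write
    // barrier (or the slice re-run from the roots) for this to stay correct.
    struct IncrementalMarker {
        std::vector<Obj*> stack;

        void AddRoot(Obj* o) {
            if (o && !o->marked) { o->marked = true; stack.push_back(o); }
        }

        // Returns true once marking is complete and the sweep can be scheduled.
        bool Step(std::size_t budget) {
            while (budget > 0 && !stack.empty()) {
                --budget;
                Obj* o = stack.back();
                stack.pop_back();
                for (std::size_t i = 0; i < o->refs.size(); ++i) {
                    Obj* c = o->refs[i];
                    if (c && !c->marked) { c->marked = true; stack.push_back(c); }
                }
            }
            return stack.empty();
        }
    };
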
Also, the "grouping" of objects in your gameworld is to a large extent
already mostly-statically managed separate from UnrealScript by e.g.
areaportals. That seems like something to take advantage of if at all
possible. I don't know how much connection between the world geometry and the
UnrealScript collector there is at present, but it's something to consider.
- Nigel Bree
(not speaking for Symantec New Zealand, not in the Visual Cafe group either)