HLL/INT: What is an object, anyway?

Jecel Mattos de Assumpcao Jr. jecel@lsi.usp.br
Wed, 14 Jun 1995 15:33:33 -0300


rideau@clipper.ens.fr (Francois-Rene Rideau) wrote:
>    So my definition for an object would be:
> "anything whose semantics can be finitely coded into a Turing Machine".

That seems like a more sophisticated way of saying that objects are like
little computers ( see A. Kay below ) :-)

> > I'll paraphrase Mario Tokoro's ( philosophical ) definition of objects:
> > 
> >    1 - it has a unique ID
>    That's quite the idea (hehe). Now what is a unique ID ? In what space
> does it live ? This becomes a circular definition. Identifying for
> the sake of identifying is no use. Of course objects can be differentiated
> from each other, but this should be done according to useful semantical
> differences. Which is why I rejected this point. Just saying "we consider
> objects" is enough and encompasses it.
>    [Hey ! I'm adding this comment to the "object" Glossary entry...]

I don't agree - most object systems allow you to have separate objects
( each with its own ID ) with no semantic differences at all. This
might not be useful, but that is the way things are. On the other hand,
there are many systems where "things" don't have unique IDs. Saying
"we consider objects" is enough *if* we agree that they have an ID, so
I don't see the sense of leaving it out of a definition. What are these
IDs? That is a major implementation issue.
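A minimal sketch of the point, in Python ( the class and names are my own
illustration, not from any of the systems discussed ): two objects can be
semantically indistinguishable yet still have separate identities.

```python
class Point:
    """Two Points with the same coordinates compare equal,
    but each instance still has its own identity (ID)."""
    def __init__(self, x, y):
        self.x = x
        self.y = y

    def __eq__(self, other):
        return (self.x, self.y) == (other.x, other.y)

a = Point(1, 2)
b = Point(1, 2)

assert a == b           # no semantic difference at all
assert a is not b       # yet they are separate objects
assert id(a) != id(b)   # each with its own ID
```

How the ID is realized ( here, an address-derived integer ) is exactly the
implementation issue mentioned above.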

> > To get a concurrent object, he adds:
> > 
> >    3 - it has computing power
>    Yes, but *what* is "computing power" ? Won't an integer be an object ???
> Again, if this means anything, it is encompassed by the fact we are talking
> about software objects.

Not at all! Most object systems have passive objects with no computing
power of their own. A separate "task" or "thread" object is needed to
make it run. In this model, the sender object "passes" computing power
on to the receiver and waits for it to come back with the answer. A
single object may be active in several threads at the same time.

A concurrent object is able to do things on its own and does them one
at a time ( it has a built-in virtual CPU ).
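The active-object idea above can be sketched in Python ( the mailbox/thread
design and all names are my own assumptions, just to make the model
concrete ): the object has its own "virtual CPU" ( a thread ) and handles
one message at a time, while the sender waits for the answer.

```python
import queue
import threading

class ActiveCounter:
    """A concurrent object: a private thread drains a mailbox,
    processing one message at a time."""
    def __init__(self):
        self.value = 0
        self.mailbox = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self):
        # The object's built-in "virtual CPU": serialize all messages.
        while True:
            msg, reply = self.mailbox.get()
            if msg == "inc":
                self.value += 1
                reply.put(self.value)

    def inc(self):
        # The sender "passes" computing power to the receiver
        # and waits for the answer to come back.
        reply = queue.Queue()
        self.mailbox.put(("inc", reply))
        return reply.get()

c = ActiveCounter()
results = [c.inc() for _ in range(3)]
# results == [1, 2, 3]: one message at a time, in order
```

A passive object would be the same class without the thread and mailbox:
whatever thread calls inc() supplies the computing power.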

> > The second part of the definition implies that an object has a private
> > memory.
> >[...]
> > When we try to come up with an implementation oriented definition, we
> > see that 2 has something missing.
> >[...]
> > It is hard to define the inside of
> > the object in a complete way.
> >[...]
>    Yeah, and I think the fact that there is a memory of a state that changes
> would be a far better characterization. But this seems to flow from the fact
> we are using finite computers in a dynamical world. So why not just say
> that ? This is much more simple than this point 2 that only postpones
> all the problems.

I've seen some languages where the objects don't have a private memory
( in a conventional sense ) but remember their state by the list of
messages they have received ( see DinnerBell from Sony Labs ). In actor
languages the object also doesn't have a conventional memory but reacts
to a message by specifying a replacement object to handle future messages
to its previous ID. Conventional OO systems do use private memories, of
course.
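The actor-style idea can be sketched as follows ( a rough Python
approximation of my own, not DinnerBell or any real actor language ):
instead of mutating private memory, handling a message yields a
replacement behavior for future messages sent to the same ID.

```python
class ToggleActor:
    """Actor-style state: each message is answered together with
    a replacement behavior that handles all future messages."""
    def __init__(self, behavior):
        self.behavior = behavior

    def send(self, msg):
        reply, replacement = self.behavior(msg)
        self.behavior = replacement   # "become" the replacement object
        return reply

def on(msg):
    return ("was on", off)

def off(msg):
    return ("was off", on)

actor = ToggleActor(off)
assert actor.send("flip") == "was off"
assert actor.send("flip") == "was on"   # state "remembered" without a memory cell
```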

The important point is that the object looks like one thing from the
outside and like another from the inside. You also have this in
modular programming, but not in logic or functional programming.

> >   I have decided to make this explicit in Merlin
> > by say that "an object divides the system into an outside, an inside
> > and a deeper inside". There are now *two* interfaces. The new one,
> > between the object inside and the deep inside, I call the "reflective
> > interface".
>    This is I think a distinct problem. I interpret it by the fact
> that objects are defined inside some space; their semantics are expressed
> in terms of other objects, etc. That is, every object's semantics are
> definable as a meta-object. Circularity is avoided by having the computer
> (or "real-life") as a founding axiom.

Yes, it is a distinct problem. Your definition is quite right. But it
isn't "obvious" - most OO systems don't have this. They use some
kind of virtual machine to make objects run. I think we agree on nearly
everything, except that you think these things are all implied by "objects"
and don't need to be mentioned. I think they must be made explicit or
most people won't understand what we are talking about.

>    I don't consider inheritance as anything else than a low-level hack.
> Delegation (importing and exporting objects), or for the functional people,
> functors (aka higher-order functions), are a much more generic way of doing
> things.

I agree inheritance is not a necessary part of the OO model. I have read
several comparisons of it with delegation and still don't know the
difference between the two. Timothy Budd, of Little Smalltalk fame, has created a
language called Leda that combines OO and functional programming ( and
logic programming too, I think ). I haven't seen it, but Kyle Hayes said
it is very neat.
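One way I understand the contrast ( a hedged Python sketch of my own, with
made-up names ): inheritance fixes the relationship between classes at
definition time, while delegation forwards requests to a component object
that can even be swapped at run time.

```python
class Engine:
    def start(self):
        return "engine started"

class Car:
    """Delegation: Car holds an Engine and forwards to it,
    rather than inheriting its behavior."""
    def __init__(self, engine):
        self.engine = engine   # replaceable at run time

    def start(self):
        return self.engine.start()   # forward, don't inherit

car = Car(Engine())
assert car.start() == "engine started"
```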

> > It is a very hard job! You are right that knowing "why" we are doing
> > things is the first step before worrying "how" we will be doing them.
>    Pardon my self-contentment, but I think I already have a clear idea of
> the "why" and many ideas about the "how". Though these ideas are sure not
> perfect, I have spent a long time developping and refining them (there
> was a time when I gazed with wonder at C++). I'm so sorry I can't
> explain them better. Please tell me what is wrong or missing about what
> I already put in the WWW pages. Sure if we could physically meet,
> communication would be easier...

The most precise explanation is working code :-) Too bad it never turns
out as neat as when it was just in our heads...

>    Anyway, I deeply recommend to anyone to study functional programming
> at least a bit. Though the state of current FP languages is far from
> perfect, it opened my eyes to a world of which I had little to no idea.

I wrote an interpreter for FP in Lisp back in 1981, but haven't kept
current with functional programming since then, except for the time
I studied Sisal. I wish I had time to take a better look at SML, for
example.

> If you could study a bit of abstract type theory (particularly the
> Curry-Howard isomorphism and higher-order logic), it'd be even greater.

I am very weak in those theories. Are there references to some papers
in the Tunes pages? If I unlearned a little more about this, I might
even be a good candidate as a programmer for Microsoft ;-)

> > by Alan Kay:
> > 
> >   All complex systems are built from a collection of smaller components.
> > Why divide a computer system into weaker things like procedures and
> > data structures? Why not divide the computer into hundreds of small,
> > specialized computers - make each part as powerful as the whole!
>    This sounds unclear to me. Again pardon my self-satisfaction, but I'd say
> that Alan Kay himself has not understood everything. He sure is a fine
> artist and technician, but he doesn't look like a fine theorist.

Probably right, but I wonder how much of Kay you have read to form such
an opinion. I don't think there is much theory behind OO - it is such a
practical thing. That is why it caught on in the industry before the
academics got interested in it, unlike all other computer trends.

>    I interpret that as an unconscious call for reflectivity: all objects
> are to be considered together with their semantics, a meta-object that
> potentially covers all the system together.

He was saying that a data structure is weaker than a computer. An
OrderedCollection object is not - it is just more specialized. Smalltalk
wasn't initially as conventional as it is today...

>    To conclude, I'd like to say that really, I see nothing more in the OO
> fad than Unity in the way we consider computing. I myself would add
> Reflectivity, but if we are to call C++ or other such crap "OO", this seems
> not to be a requirement.
>    So I'll point again to the TUNES Review and HLL pages...

If others want a "watered down" view of objects, it doesn't mean we can't
have a better view. Objects are good as long as they are useful. They
aren't the end of computing history...

>    Well, I was meant to study and not do any tunesing until my exams are
> finished. Aaaarrrggghhh !

You can comment on this when your exams are over... I am not in a hurry :-)

Regards,
-- Jecel
P.S.: have you seen the CVS software for maintaining versions and generating
patch files? David Mentre suggested it to me a while back but only now
did I get a larger disk and space enough to try it. It is about as good
as it gets in Linux, though I hope we can do better with objects later on...