Fare's response on threads

Lynn H. Maxson lmaxson@pacbell.net
Sun, 24 Sep 2000 09:08:26 -0700 (PDT)

Kyle Lahnakoski wrote:

"...I just want to acknowledge that it is the machine that did the 
computation, and not me.  This simple allowance allows me to use 
language as if the machine was alive, and to make the conversation 
..."

"Even though I use the language of software, and imply the use of
instruction sets, I do not mean to limit the discussion to those
phenomena.  The homeostat uses mechanical means to define the 
simple rules, and feedback is used to generate the "iterations".  
The emergent behavior is identified only when the homeostat is 
run; it cannot be foreseen without the feedback present."

You may not know entirely what will occur in the emergent 
behavior.  However, in every instance you can trace the path from 
the rules to the "surprise".  You may not predict everything, but 
everything is predictable.
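This traceability can be made concrete with a tiny sketch.  The 
following is my own illustration, not Ashby's actual four-unit 
homeostat: a single "essential variable" driven by a fixed feedback 
rule, plus a reset rule that picks new random weights whenever the 
variable leaves its safe bounds.  Because the random source is 
seeded, every "adaptation" the system exhibits is fully reproducible 
and traceable back to those two rules.

```python
import random

def run_homeostat(steps=1000, seed=0):
    """A homeostat-like loop: any 'surprise' is traceable to the rules."""
    rng = random.Random(seed)      # fixed seed: every run is reproducible
    w = rng.uniform(-1, 1)         # feedback weight
    x = 1.0                        # the "essential variable"
    resets = 0
    for _ in range(steps):
        x = x + w * x              # rule 1: simple linear feedback
        if abs(x) > 10:            # rule 2: out of bounds -> new weights
            w = rng.uniform(-1, 1)
            x = 1.0
            resets += 1
    return x, resets

x, resets = run_homeostat()
print(f"settled at x = {x:.4f} after {resets} resets")
```

Run twice with the same seed and the "emergent" settling behavior is 
identical; nothing happened that the rules and the seed did not 
determine.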

That's the point of the mimicry attribute of all software: it does 
nothing which is not traceable back to the rules.  In short, it 
cannot create rules on its own.  It cannot do what we as 
programmers do.  Thus it cannot assume the role of programmer 
without our defining it.  Mimicry remains mimicry regardless of 
whether it is good or bad, unsophisticated or sophisticated, simple 
or elaborate.  

Mimicry has no means within itself, nor have we any means of giving 
it that little extra boost that would change it from mimicry to 
non-mimicry (the real).  It makes no difference how close it comes 
to "resembling" the "real thing" or how difficult it may be to tell 
them apart.  It remains mimicry.  Do not confuse it with the 
mathematical concept of infinity (which is a human construct only 
and does not exist in reality), in which we approach something 
numerically in the limit along a path strewn with non-mimic 
members.  Infinity is a convenient fiction.  We should not confuse 
it with the actual reality.  The map is not the territory.

Ashby's homeostat did not follow von Neumann architecture nor 
Turing computational rules, and still it "exhibited" adaptive or 
"goal-seeking" behavior.  It did not do it the way the brain does 
it, but it did illustrate that "programming" per se (as we know 
it) was unnecessary.

I do not worry that anyone who hears me talk about what I do with 
my various tools and what "they" did will ever confuse them with 
being alive or sentient.  However, with computers and software it 
is different, because science fiction, movies, and unfortunately 
other public publications frequently depict sentient computers and 
androids.  

Even in these responses we get suggestions of that possibility.  
Those people who have done so have stepped over the line from fact 
to fiction.  They are then caught in a self-deception: doing in 
one instance of tool use what they would never do in another, 
regardless of how they talked about it.

While you may not limit your range of discussion to von Neumann 
architectures and Turing rules (and neither do I), unless we 
broaden the scope of the Tunes project, all those discussions 
should be so limited.  By you, by me, by everyone else.  It is 
wrong (because it is factually impossible) for responsible people 
to suggest in the slightest that continuation of a process of 
sophistication and elaboration will produce "magic in the box".

While it has a place in computer science fiction, it has no place 
in computer science fact.