Fare's response on threads

Lynn H. Maxson lmaxson@pacbell.net
Sat, 23 Sep 2000 07:55:01 -0700 (PDT)

Kyle Lahnakoski wrote:

" If a human programmer defines the simple rules, I doubt the 
programmer should be credited for the emergent behavior."

Whatever the emergent behavior, it corresponds exactly to the 
rules supplied.  More importantly, it never "strays" beyond what 
those rules allow.  Nothing in the emergent behavior arises from 
any other source.  Therefore it cannot take on an independent 
existence, i.e. make decisions "on its own".
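To illustrate this point (my own sketch, not from the original 
post): a one-dimensional cellular automaton such as Wolfram's 
Rule 110 produces intricate, "emergent" patterns, yet every cell 
state follows deterministically from a fixed eight-entry rule 
table.  Nothing appears in the output that the rule did not 
dictate, and identical starting conditions always yield identical 
runs.

```python
# Rule 110: a one-dimensional cellular automaton whose complex,
# "emergent" patterns follow entirely from an 8-entry rule table.
RULE = 110

def step(cells):
    """Apply the fixed rule to every cell (wrapping at the edges).
    Nothing outside the rule table influences the outcome."""
    n = len(cells)
    return [
        (RULE >> (cells[(i - 1) % n] * 4
                  + cells[i] * 2
                  + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell; repeated application yields rich patterns,
# but the behavior never "strays" from the rule that produced it.
row = [0] * 31
row[15] = 1
for _ in range(10):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

However surprising the resulting triangles and gliders look, they 
are fully determined by the rule table, which is exactly the 
sense in which emergent behavior "corresponds exactly to the 
rules supplied."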

I have no concerns about who does or does not get credit.  Humans 
create tools as a means of extending their own abilities.  A tool 
may extend strength, increase speed, or increase range of vision.  
In none of those instances does the tool itself suddenly acquire 
a capability not inherent in the human design, regardless of any 
"unexpected" emergent behavior.

More to the point, a computer system based on the von Neumann 
architecture, combined with software obeying the computational 
rules of a Turing machine, can never evolve, on its own or with 
the most elaborate, sophisticated instruction of its human 
author(s), into anything like the human brain.  First, the 
hardware, the physics, is entirely different.  Second, in the 
brain the software is neither necessary nor present.  Ashby 
dashed such thoughts with his homeostat, a machine system that 
"exhibited" adaptive behavior despite having neither an 
instruction set nor any instructions (guidelines).

The gap present at the start of our software journey remains the 
same regardless of our software tricks, our elaborate or 
sophisticated processes, or multiple levels of reflective 
behavior through meta-programming.  We are no closer to achieving 
a computing system as a brain at the end of such efforts than we 
were at the beginning.  However well we "mimic" the brain's 
behavior, at no point in the process will we cross the threshold 
from mimicry to the thing itself.

This does not mean it is pointless to improve the range and 
capabilities of our tools.  It does mean that our tools will 
always remain within the controls we have set, that they will not 
suddenly acquire an independence, an ability to take off in a 
direction all their own.

In short, none of this changes the value that comes from 
incorporating reflection into software.  We already have a very 
simple means of replicating humans and human ability, a 
manufacturing process that we have yet to master on our own.  
What success we have had has come from participating in that 
process rather than attempting to replicate it with one of our 
own invention.
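For concreteness (a hypothetical example of my own, not from the 
post): reflection of the kind valued here means a program 
inspecting and modifying its own structure at run time.  Even 
this self-modification happens only within the rules the language 
provides; the class and method names below are invented for 
illustration.

```python
import inspect

class Celsius:
    """A trivial class the program will inspect and extend at run time."""
    def __init__(self, degrees):
        self.degrees = degrees

# Reflection: enumerate the functions the class was written with.
original = [name for name, _ in inspect.getmembers(Celsius, inspect.isfunction)]

# Meta-programming: attach a new method at run time.  The program
# modifies itself, yet only in ways the language rules permit.
def to_fahrenheit(self):
    return self.degrees * 9 / 5 + 32

setattr(Celsius, "to_fahrenheit", to_fahrenheit)

print(original)                      # functions defined in the source
print(Celsius(100).to_fahrenheit())  # 212.0 -- behavior added reflectively
```

The program "reflects" on itself and grows a new capability, but 
that capability was supplied by its author; nothing independent 
of the rules has appeared.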

The point is to remain computer scientists, not alchemists.  
There is no magic in the machine which we have not provided.  No 
matter what we do with a von Neumann architecture following 
Turing computational rules, we will never produce a "brain".

People like Steven Pinker, like so many other science fiction 
authors, we retain for entertainment, for some enlightenment 
about alternate (fictional) universes, and for flights of 
fantasy.