Fare's response on threads
Kyle Lahnakoski
kyle@arcavia.com
Sat, 23 Sep 2000 14:51:20 -0400
"Lynn H. Maxson" wrote:
>
> Kyle Lahnakoski wrote:
>
> " If a human programmer defines the simple rules, I doubt the
> programmer should be credited for the emergent behavior."
>
> Whatever the emergent behavior, it corresponds exactly to the rules
> supplied. More importantly, it never "strays" outside the rules
> allowed. Nothing in the emergent behavior arises from any other
> source. Therefore it cannot take on an independent existence,
> i.e. make decisions "on its own".
I did not say the emergent behavior broke any of the rules supplied.
The emergent behavior is a consequence of those rules. The simple rule
used to define the Mandelbrot set does not contain the wave-like
silhouettes identified in its visualization. They come from the many
iterations that produced the image. The iterations are necessary to
identify the patterns. The simple action of iteration adds an extra,
possibly unpredictable, component to a set of rules.
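As a rough sketch of what I mean (Python, my own toy illustration; the
grid size and iteration cap are arbitrary), the whole "rule" is one
update, z = z*z + c, yet the silhouette only appears once that rule is
iterated over a grid:

def escapes(c, max_iter=50):
    # Iterate the single rule z = z*z + c; report the step at which the
    # orbit leaves the radius-2 disk, or max_iter if it never does.
    z = 0j
    for n in range(max_iter):
        z = z * z + c
        if abs(z) > 2.0:
            return n
    return max_iter

# Render a coarse text picture of the set.  Nothing in the rule above
# mentions the resulting shapes; they emerge from the iterations.
for row in range(24):
    y = 1.2 - row * 0.1
    print("".join("#" if escapes(complex(-2.0 + col * 0.05, y)) == 50
                  else "." for col in range(64)))

None of the structure in that output is written anywhere in those few
lines; it only shows up after the iterations have run.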
> I have no concerns about who does or does not get credit. Humans
> create tools as a means of extending their own ability. It may
> be to extend strength, to increase speed, to increase range of
> vision, whatever. In none of those instances does the tool itself
> suddenly acquire capability not inherent in the human design,
> regardless of "unexpected" emergent behavior.
And all your examples are simple tools, designed for a purpose, and not
designed for their ability to show emergent behavior.
> More to the point, a computer system based on von Neumann
> architecture combined with software obeying the computational
> rules of a Turing machine can never evolve, on its own or with the
> most elaborate, sophisticated instruction of its human author(s),
> into anything like the human brain. First, the hardware, the
> physics, is entirely different. Second, the software is neither
> necessary nor present. Ashby dashed such thoughts with his
> homeostat, a machine system which "exhibited" adaptive behavior
> yet had neither an instruction set nor any instructions
> (guidelines).
Even though I use the language of software, and imply the use of
instruction sets, I do not mean to limit the discussion to those
phenomena. The homeostat uses mechanical means to define its simple
rules, and feedback generates the "iterations". The emergent
behavior is identified only when the homeostat is run; it cannot be
foreseen without the feedback present.
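To be concrete (again a toy sketch in Python, my own simplification
and not Ashby's actual electro-mechanical device): couple a few
variables by feedback, and whenever one drifts out of range, re-draw
its couplings at random. Which configuration the system finally
settles into can only be discovered by letting the feedback run:

import random

random.seed(1)
N, LIMIT = 4, 10.0
state = [random.uniform(-1, 1) for _ in range(N)]
weights = [[random.uniform(-1, 1) for _ in range(N)] for _ in range(N)]

for step in range(10000):
    # Linear feedback: each variable relaxes toward a weighted sum of
    # all the variables, itself included.
    state = [0.5 * state[i] + 0.5 * sum(weights[i][j] * state[j]
                                        for j in range(N))
             for i in range(N)]
    # The "uniselector": any variable out of bounds gets new random
    # couplings -- the machine's only rule for seeking stability.
    for i in range(N):
        if abs(state[i]) > LIMIT:
            weights[i] = [random.uniform(-1, 1) for _ in range(N)]
            state[i] = random.uniform(-1, 1)

print("settled state:", [round(x, 3) for x in state])

The simple rules are all there on the page, but the weight matrix the
system ends up with is not.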
> This does not mean it is pointless to improve the range and
> capabilities possible with our tools. It does mean that our tools
> will always remain within the controls we have set, that they will
> not suddenly acquire an independence, an ability to suddenly take
> off in a direction all their very own.
When you say "suddenly take off in a direction all their very own",
what do you mean? Do you mean action that was unexpected? By
unexpected I do not mean unforeseeable. Surely all that the machine
does could be predicted by simulating it beforehand. But that is the
ONLY way of predicting its actions; you can only predict what the
machine will do by allowing it to run. Unexpected and unforeseeable
have different meanings in light of the existence of emergent
behavior.
Your language and understanding of the machine do not allow you to
ever give it credit for the work it has done. I do not mean this in a
mystical way. The work it does is simple computing. I just want to
acknowledge that it is the machine that did the computation, and not
me. This simple allowance lets me use language as if the machine
were alive, and makes the conversation efficient.
----------------------------------------------------------------------
Kyle Lahnakoski Arcavia Software Ltd.
(416) 892-7784 http://www.arcavia.com