Humpty Dumpty (was: reflection)
Lynn H. Maxson
lmaxson@pacbell.net
Wed, 20 Sep 2000 08:22:53 -0700 (PDT)
I don't want to burst Pinker's bubble, but that was done nearly 50
years ago by cybernetics, a group of quite talented men who in a
sense were advocates of the computational theory as well. In fact
it is Fare's references (if not reverences) to this group in his
Tunes documentation that have prodded me to ensure that we stay on
one side of the line without drifting over.
Now among and prominent in the group was W. Ross Ashby, who
authored two books, "An Introduction to Cybernetics" and "Design
for a Brain". The first is very readable for the casual reader,
and the second is mind-boggling in that it has had a lasting
impact on my thinking. In "Design for a Brain" Ashby describes
his "homeostat", an electro-mechanical system built to emulate
adaptive behavior. Of all the cyberneticians (if we may call them
that), his "homeostat" took an infinitesimal step toward their
goal. However infinitesimal it was, it was further than any of
the others ever progressed.
The homeostat, basically eight identical "simple EM devices"
totally interconnected, exhibited "adaptive behavior". What
disturbed everyone was that it did so without "help", i.e.
intrinsically. To ensure there is no confusion here, that means
without the need for external intervention, i.e. human
programming. You could, if you so chose, say that it instructed
itself without the need for an (initial) instruction set.
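For the curious, here is a rough sketch in Python of the principle
involved, "ultrastability", as I understand it. This is my own
abstraction, not Ashby's actual circuit: fully interconnected
units, each driven by a weighted sum of all the others, and
whenever a unit's "essential variable" wanders out of its viable
range, that unit's couplings are re-drawn at random until the
whole field settles down. No one tells it which values to pick.

  import random

  N = 8          # eight units, fully interconnected, as described above
  LIMIT = 1.0    # viable range for each essential variable: [-LIMIT, +LIMIT]

  def random_weights():
      # The "uniselector": a blind re-draw of one unit's couplings.
      return [random.uniform(-1.0, 1.0) for _ in range(N)]

  weights = [random_weights() for _ in range(N)]
  state = [random.uniform(-LIMIT, LIMIT) for _ in range(N)]

  def step(state, weights, damping=0.5):
      # Each unit's next value is a damped weighted sum of every unit's output.
      return [damping * sum(w * s for w, s in zip(weights[i], state))
              for i in range(N)]

  quiet = 0
  for t in range(10000):
      state = step(state, weights)
      disturbed = False
      for i, value in enumerate(state):
          if abs(value) > LIMIT:              # essential variable out of range
              weights[i] = random_weights()   # try a new random configuration
              state[i] = random.uniform(-LIMIT, LIMIT)
              disturbed = True
      quiet = 0 if disturbed else quiet + 1
      if quiet >= 200:                        # the field has stayed viable
          print("adapted after", t, "steps")
          break

Crude as the sketch is, the point survives: the viable
configuration it ends in was selected by nothing but the system's
own disturbances.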
That does not mean that one cannot advocate a computational theory
of the mind and be quite correct in doing so. It does mean that
there is at least one instance in which adaptive behavior works
just as well without it. In fact no computational model to date
has ever achieved anything on its own equal to that infinitesimal
step of the homeostat. Without external intervention, excluding
the creation of the devices and their interconnection, the
homeostat "exhibited" sentient behavior, i.e. something from
"within" itself, on its own.
Now I have to admit that my guide to the non-computational theory
of the mind comes from the writing of a second-generation
neurologist, Antonio R. Damasio, in his book "Descartes' Error:
Emotion, Reason, and the Human Brain". I won't bother you with W.
Grey Walter and the others who have also contributed to this
subject.
The point is that I am as guilty as anyone else in ascribing,
describing, and transcribing programmable (and programmed)
behavior exhibited in hardware under control of software.
However, when I do so it is with the constant reminder that what
is exhibited is not "intrinsic" to the software in that it was
constructed to do so. In fact if it does not work as we wished,
we treat it as an "error" and institute "corrective behavior
modification". We do so by "physically" changing the software
though we do not "physically" change the hardware. The difference
is that we can clearly separate the software from the hardware,
something which does not occur in the human brain.
In point of fact we can engage in mimicry in software to any level
of sophistication, or in the instance of reflection to any level
of reflection on reflection (recursive behavior). We can do so
without fear of stepping over the line separating sentient from
non-sentient behavior. As long as we choose a programmable means,
externally developed software, to initiate an activity (behavior)
on a software/hardware system, we will never produce sentient
behavior (by definition).
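To make "reflection on reflection" concrete, here is a toy Python
example of my own (nothing to do with Tunes' reflection model or
Ashby): a function that inspects its own source and is then turned
on the act of inspection itself, to any depth we choose.
Everything it "knows" about itself we put there; it is mimicry of
self-examination, not sentience.

  import inspect

  def describe(obj, depth):
      # Report a fact about obj, then turn the same scrutiny on
      # this very function -- reflection applied to the reflector.
      source_lines = inspect.getsource(obj).splitlines()
      print("  " * depth + obj.__name__ + ": " +
            str(len(source_lines)) + " lines of source")
      if depth > 0:
          describe(describe, depth - 1)

  # Reflect on the reflector, three levels deep.
  describe(describe, 3)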
I go through all this because it is important for you, for me, for
Pinker, and for anyone else to realise that a "computational
theory" can work without an external program (software) seed. In
fact for sentient behavior (non-mimicry) it's a requirement.
When we dig down into the negative reactions often expressed to
Ashby's work with the homeostat, we find that because no "deus ex
machina" was necessary, because something could exhibit adaptive
(survival) behavior on its own without external intervention, it
offered the possibility that God was unnecessary. The problem
with such fears is that those who hold them do not understand the
basic assumption underlying faith (the absence of evidence). The
larger problem, of course, is ascribing human limits to God.
With respect to software we are God, the deus ex machina. We can
make it dance, sing, and follow the pulling of our strings. It's
only when we remove ourselves, allowing the hardware (the
computer) to develop its behavior intrinsically, that sentience is
possible. For us that simply means a computer without an
intrinsic (fixed) instruction set, but one that it develops
dynamically, on "reflection", on its own.
Of course such a system is absolutely useless to us unless it
"agrees" to cooperate in some manner.<g> Understanding this, you
will not gain anything other than "feckless" venture capital to
mass-produce this for a market.<g> Of course, this probably
represents my limited marketing ability and vision.<vbg>
I have no "flame" to offer you. I just want to ensure that
whenever we delve into the higher abstractions and esoterics of
Tunes requirements and features, we also, in our thinking, do not
cross the line. Ashby dashed the hopes of men who would play God.
Only God can play that game.