Emergence of behavior through software
Thu, 05 Oct 2000 22:23:01 -0400
--On Wednesday, October 04, 2000 8:14 PM -0700 "Lynn H. Maxson" wrote:
> Alan Widge raises a number of issues that need addressing to his
That's *Alik*. Get it right, please.
> of its author(s). The author(s) have no means of transferring
> that creative property in them to make it intrinsic to their
> creation, i.e. that their creation can replicate the processes
> that occur within their author(s).
This is true. However, I think you owe us a "yet".
> Emotion, Reason, and the Human Brain". I think you need to spend
> some time "listening" to those who engage actively with the brain,
> how it works (in so far as we know), and when it doesn't: its
> disorders. The point here lies in how little we actually know of
> the brain and as a result even less our understanding of it.
It may interest you to know that one of my majors was cognitive science,
with a heavy neuroscience component. I am well aware of how little we
currently know about how the brain works. This does not mean that we can
never understand it.
> single-cell appear. Nothing logical, physical, or otherwise says
> that it isn't as timeless as the universe itself. We have used
This is not really true, at least not if one accepts current hypotheses
regarding the Big Bang. If we apply our concepts of time, the mathematics
seems to suggest that at least the first few minutes were utterly incapable
of supporting anything we would call an organism.
> death. Yet no organism begins from this state. It takes an
> organism to begat an organism. Now why we have yet to discover.
The basic problem is that we can't set up the biochemical clockwork and
then add the single push to get it all rolling. We need to start with a
running engine and cobble the parts onto it as we go. OTOH, there are
plenty of people trying to recreate the primordial soup, so perhaps someday
they will demonstrate spontaneous generation of self-sustaining processes.
There is nothing which says that we *need* an organism, but starting with
one is significantly simpler, so we do that.
> separations. It has no separate actors performing a separate
> action. No subject. No verb. Just a complete universe.
But it *does* have a beginning. There is a point in what we call time
(which, although it may have arbitrary divisions, is also considered to be
a physical dimension, and is therefore "real" in some sense) before which
there appears to have been nothing.
> When you attempt to impose your map on the territory that's when
> you engage in fiction.
Perhaps, but the very point of a map is to let you plan a route within the
territory, and it generally serves that function quite well.
> So far no one has even suggested creating silicon crystals using
> software and computer hardware. Why not. Here you have a pure
> non-organism of only one component type. Why is it that you have
> to grow them. Why can we not just crowd them together? The
> answer that you seem to sneer at is "embodiment", the means of
> construction. The means that occurs in nature we basically follow
> in the laboratory.
This seems rather disjointed. Crystals must be formed in a specific way,
yes. It may be true that minds require specific underlying patterns.
However, there is no evidence that those patterns cannot be implemented as
software or hardware.
> Now there is none of that in a neuron. No logic. No ands. No
> ors. No nots. What you have is a connection, an interconnection,
> unlike any in any computer. I would refer you to Ashby's
Except that a neuron can in fact compute in just that manner. I would refer
*you* to something as simple as the cells of your retina. Shine a light on
one, and it turns on. Remove the light, and it turns off. (Others turn off
by light and on by dark. Same principle.) The thresholding behavior of
neurons is not much different from a digital gate: if you're close enough
to +5V, you get 1, otherwise you get 0.
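The analogy can be sketched as a toy model. All of the thresholds below are illustrative numbers of my own choosing, not physiology; real neurons integrate inputs over time and fire spike trains. The point is only that "on above threshold, off below" is the same input-output shape in both systems:

```python
# Toy comparison of retinal ON/OFF cells and a digital input.
# Thresholds are illustrative only, not physiological values.

def on_cell(light_intensity, threshold=0.5):
    """Fires (1) when light exceeds threshold, silent (0) otherwise."""
    return 1 if light_intensity > threshold else 0

def off_cell(light_intensity, threshold=0.5):
    """Complementary cell: fires in darkness, silent in light."""
    return 1 if light_intensity <= threshold else 0

def ttl_input(voltage):
    """Digital input: close enough to +5V reads as 1, else 0."""
    return 1 if voltage > 2.0 else 0  # illustrative logic threshold

print(on_cell(0.9), off_cell(0.9))     # light on  -> 1 0
print(on_cell(0.1), off_cell(0.1))     # light off -> 0 1
print(ttl_input(4.8), ttl_input(0.3))  # -> 1 0
```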
> It's a feedback system. On the other hand constructing a
> non-programmed, but adaptive homeostatic unit means only that you
> have to connect it. Completely random. Not this to that nor that
> to this.
But that is *not* how the brain behaves. You will most likely get very poor
results if you rewire the optic nerves to auditory cortex and the auditory
nerves to visual cortex. Brains do not begin as randomly wired networks; the
DNA itself contains "bootstrap code" to organize structures and begin
development.
> The truth is that you can fake it out, have it in a simulated run
> while on the ground, never putting it into an actual plane until
> it has "learned", until it has become "stable". Now which one,
> which process, would you as an airline company use?
But this is exactly the point you try to deny later when I suggest that a
body need not be exactly a human body.
> brought home to me. When you write of faking out the brain by
> somehow switching it instantaneously or gradually from its natural
> system into an artificial one you are engaged in science fiction.
> You do not appreciate how intricate a system the human organism
> is. Having experienced a stroke, albeit a minor one, for just
> denying blood flow for an instant to the brain, and for a period
> not having your legs "obey" your orders, this is not a
> plug-and-play system.
I also happen to be a medical student. One of my particular areas of
interest is sensory prostheses. We can already replicate the cochlea to a
reasonable degree. There are more people working on the eye than I care to
count. It *can* be made plug-and-play (well, if "plug" is defined as "damn
tricky surgery") if you can decode the protocol.
> eye. The connection is not a cable. This is a completely
> non-logic-circuit-based system that you propose replicating with a
> completely logic-circuit-based one. It is one thing to have a
Yes? So? Our senses have very stereotyped signals. These can be generated
from standard digital and analog logic. Right now, the problem seems to be
a matter of getting sufficiently large arrays of sufficiently small
transducers and wiring them in properly. If a software brain existed, the
problem would barely exist for that system.
> circuitry. The brain is not a computer nor the computer a brain.
This is an assertion. You are entitled to it as opinion, but I do not
believe that you have proved it or can prove it.
> You see there are connections and what they connect. You can't
> replicate the connections or what they connect with a von Neumann
> machine operating under Turing rules. The brain is not a von
This is also an assertion, and one which I think has been at least
partially proven false. We can in fact produce the signals. We need to get
the physical wiring down, but that's a simple matter of engineering. Give
it two decades.
> To you rules are logic-based only. We have no reason to believe
> (or disbelieve) that the internal-working rules for the amoeba are
> based "strictly" in logic. That systems of logic can arise from
Do you claim, then, that physics does not derive from the eminently logical
system of mathematics?
> That's a big "if", you see. To function like the real one means
> embodying it within an organism that functions like the human
> organism. That's how the brain functions. It does not function
> in isolation nor does it operate on simply a subset of its
If this is true, explain those who have sensory deficits. Seems to me that
they're functioning quite nicely on a subset of their brain's capabilities.
> aside any thoughts of software and enjoy the magic. The answer
> here is strictly, no. There is no transference, no organism-based
> seed, in programming. If there were, programs would develop on
> their own without need for further assistance.
And you cannot show that this is impossible. We may not know how to do it,
but that does not mean it is impossible.
> Nope. There cannot be a software organism.
> I'll concede the point as it is theoretically true on the basis of
> probability theory (another human invention not present in the
> universe). However, take a look at the probabilities for a simple
> program like "hello, world". You get one right and umpteen
> zillion wrong. Whereas if you eliminate the random opcode picking
> and use logic, it comes more in balance. I'll leave it to your
> employer which he prefers you use.
That's not the point, though. If you accept this as true, you see how any
program could be created without anyone having the intent to create that
specific program. The process would need to be optimized, but I only
desired a proof of concept. This seems to at least partially refute your
claim that randomness can do nothing for an emergent system.
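A toy illustration of that "optimization" (the alphabet, mutation rate, and selection rule are all my own illustrative choices, not a model of any real system): pure random opcode picking is indeed umpteen-zillion-to-one against, but adding a simple "keep the child if it scores no worse" rule, in the style of Dawkins' well-known "weasel" demonstration, brings it far more in balance:

```python
import random

TARGET = "hello, world"
ALPHABET = "abcdefghijklmnopqrstuvwxyz ,"

# Blind random assembly: one chance in len(ALPHABET)**len(TARGET).
print(len(ALPHABET) ** len(TARGET))  # roughly 2.3e17 candidates

def evolve(target, rate=0.05, seed=1):
    """Random mutation plus a 'keep if no worse' selection rule.
    Purely illustrative; not a model of any biological process."""
    rng = random.Random(seed)
    score = lambda s: sum(a == b for a, b in zip(s, target))
    parent = "".join(rng.choice(ALPHABET) for _ in target)
    generations = 0
    while parent != target:
        child = "".join(rng.choice(ALPHABET) if rng.random() < rate else c
                        for c in parent)
        if score(child) >= score(parent):
            parent = child
        generations += 1
    return parent, generations

result, gens = evolve(TARGET)
print(result, gens)  # reaches the target in thousands of steps, not 1e17
```

Randomness alone gets you nowhere; randomness plus a feedback rule gets you the target string with no one ever intending that specific string at any intermediate step.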
> A Turing machine has no intrisic purpose, will, emotion, feeling,
> imagination, concept building, sense of the universe, or any of
> the other things which differentiate it from organisms in general
> and humans in particular. You are stuck with achieving your goals
You cannot prove that these things cannot be subfunctions of a TM. Again,
simply not knowing how to do something does not make it impossible. I do
not say that a TM "has emotion"; I am rather saying that emotion may simply
be the output of a particular computational process within the brain.
Each individual neuron has a definable input-output behavior. As such, it
computes a function, and as such, it is theoretically replaceable by a TM.
Chain enough of those together to replicate the limbic circuits and you may
well have artificial emotion. Until we get clever enough to try it, you
cannot claim that it is impossible.
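A minimal sketch of the replaceability claim, with idealized units rather than real neurons (the weights and biases are my own illustrative values): treat each neuron as nothing but its definable input-output function, and chaining such functions computes things no single unit can, such as XOR:

```python
def neuron(weights, bias):
    """An idealized neuron reduced to its input-output behavior:
    weighted sum followed by a hard threshold. A gross simplification,
    but each such unit is a computable function, hence TM-replaceable."""
    return lambda *inputs: (
        1 if sum(w * x for w, x in zip(weights, inputs)) + bias > 0 else 0)

# Chaining units yields functions no single unit computes: XOR.
h_or  = neuron([1, 1], -0.5)   # fires if either input is on
h_and = neuron([1, 1], -1.5)   # fires only if both are on
out   = neuron([1, -1], -0.5)  # fires on "OR and not AND"

def xor(a, b):
    return out(h_or(a, b), h_and(a, b))

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor(a, b))  # 0 0 0 / 0 1 1 / 1 0 1 / 1 1 0
```

Whether limbic circuits really decompose this cleanly is an open question, but nothing here requires anything beyond computation.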
> non-organism-based means of providing tools for their use. I
> suggest that Billy has the correct approach in terms of
> constructing software to support and extend human capabilities,
> something within our current ability.
I am making no argument that the Tunes project should try to build software
organisms. That is not possible based on current knowledge. However, you
are apparently arguing that it will never be possible, and I consider this
claim unproven.