Emergence of behavior through software

Lynn H. Maxson lmaxson@pacbell.net
Sun, 01 Oct 2000 19:17:48 -0700 (PDT)


Alik Widge wrote:

"[re volition]All right. That's a definition. Now I ask for a 
justification. Why can volition not arise within the constraints 
of a rule set?"

It depends upon who is executing the rule set.  If it is you or 
I deciding that we don't want to operate within those 
constraints, then we can choose otherwise.  I don't know what 
the process is within "living" organisms that allows this.  We 
have it; computers and software don't.  As Fare has admitted, 
all software executes in a manner consistent with its embedded 
logic.

No one knows how to program volition because no one knows the 
process from which it arises.  If we determine that it is a 
non-transferable property of cell-based organisms, we can never 
incorporate it in software regardless of how well we mimic it.

"I personally would put some requirements on that, such that a 
system which wished to be an organism must at least be capable of 
sustaining itself indefinitely, but otherwise, I do not see this 
as a problem."

The thing that stumps me most in communicating with Fare lies in 
his metaphors, which confuse the similar with the identical, as 
if the properties of one thing became those of another.  Of 
course they do not, or we would not need metaphors in the first 
place.

Here you posit an impossible situation, a system deciding that 
it can become an organism.  Organisms don't have that choice.  
Neither do non-organisms.  The truth is that we have no means of 
creating an organism without starting with one: procreation.  
For the record, cloning does not change that.  As one who 
gardens extensively and raises fruit trees, I can tell you that 
propagation always begins with an organism.  That is the problem 
with computers and software: neither starts as an organism.  No 
matter how you mix, mash, meld, and merge them, if you don't 
start with an organism, you don't end up with one.<g>

One passing note.  Artificial means not real.  AI means, now and 
forevermore, not real intelligence but something else 
altogether.  No matter what we do to it or with it, it will 
never cross the line: it will remain artificial.

"Careful. A parasite cannot survive on its own --- it must live in 
a host. In fact, all known species exist only as part of 
ecosystems. Being dependent on other parts of a system does not 
preclude being an organism."

I thought I exercised extreme care.  Software is not an 
organism.  Computers are not organisms.  We have only one means 
of producing organisms, and it follows a process we do not 
understand.  However, that does not change the fact that two 
non-organisms cannot join to form an organism.  Two "wrongs" 
cannot make a "right".<g>

" Why can software not procreate? What about [software] viruses?"

Software is not an organism.  That's the long and the short of it.  
Software viruses work because they receive control from the 
processor and execute a "behavior" consistent with their embedded 
rules.  Nothing changes.

"Again, be careful. You can't prove either of those. I have yet to 
find a task which a human can do and a Turing machine cannot. The 
brain and computer are superficially different, but that doesn't 
mean that they aren't just two implementations of a central 
theme."

I don't want to touch this one.  I am somewhat disappointed that 
your contact with other humans and organisms hasn't introduced 
you to processes not duplicable by a Turing machine.  I don't 
know what computer architecture you have been exposed to, but if 
it was von Neumann-based, the differences are not superficial.

We should hear more about what you think is the central theme 
common to both of them.  In biology it is survival.  In evolution 
it is survival of the fittest.

"But they're not non-existent. Have the program create them, then 
pass control to them. I see where you're coming from --- you're 
saying that this is still the programmer telling the program to 
make them. But did the programmer have no volition if someone else 
told him to write that program?
 
Seems like we're chasing a chain back to the Big Bang."

I think you're getting the idea of what must occur in software, 
which must execute in a manner consistent with its embedded 
logic.  It can never be free of the actions of the programmer, 
whether or not the programmer can predict or completely 
understand all possible outcomes.  It does no more than what the 
programmer told it to do.
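
To make the point concrete, here is a minimal sketch (my own 
illustration, not part of the original exchange) of how even an 
apparent "choice" in software traces back to rules the 
programmer embedded.  The names and values are invented for the 
example.

    # Even a "random" decision is fully determined by the embedded
    # rules plus the inputs the program is given.
    import random

    def choose_move(seed):
        # The "decision" below is entirely a consequence of the seed
        # and the rule set; rerunning with the same seed always
        # yields the same answer.
        rng = random.Random(seed)
        return rng.choice(["advance", "retreat", "wait"])

    # The same seed produces the same "decision" every time.
    assert choose_move(42) == choose_move(42)
    print(choose_move(42))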

"I'm trying to catch you in a contradiction here. If obeying some 
rules, any rules, which are mathematically expressible precludes 
volition, I argue that you must declare humans non-volitional. 
Since you don't seem willing to do that, I sense a contradiction."

Seems fair (or even Fare).<g>  Humans do not "obey" 
mathematically expressible rules.  Mathematics is one of a 
multitude of human systems, along with religion, politics, 
education, social and psychological systems, and all the rest.  
Whether we stay "within" the rules or stray outside them is a 
choice we can make at any time.  That choice is not available to 
software, nor have we any means of programming it in.

"All right... I'll set myself up a warehouse of old x86es and let 
them compute for as long as they can continue to run."

You'll do better with just one.  Unfortunately, software is a 
fragile beast, overly sensitive to "illogic" and prone to 
failure.  It is one thing to put monkeys in front of 
typewriters, where any order of letters is acceptable.  That's 
simply not true of software.
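
As a rough sketch of that fragility (my own illustration, with 
invented details), one can ask how many random character strings 
would even parse as source code, never mind run:

    # Random "monkey" text essentially never survives even the
    # syntax check, which is what makes software so fragile.
    import ast
    import random
    import string

    def random_text(length=40):
        return "".join(random.choice(string.printable)
                       for _ in range(length))

    valid = 0
    trials = 10000
    for _ in range(trials):
        try:
            ast.parse(random_text())  # syntax check only, no execution
            valid += 1
        except (SyntaxError, ValueError):
            pass
    print(valid, "of", trials, "random strings parsed as Python")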

"We keep coming back to this idea that following rules means 
you're not spontaneous. I suppose that as long as you're using 
that as an assumption, your argument is consistent."

If I have no choice but to follow them, then spontaneity is out.  
That is the situation with software, which must follow the rules 
set by some source other than itself.  On the other hand, I (or 
you) can be following rules and "suddenly" see a different path 
to pursue than the current one.  The difference (perhaps) is 
that as organisms we are aware that we are following rules.  
Software, being non-intelligent as well as a non-life form, is 
not even aware that it is following rules.

"All quite true, and I do not argue it. I merely challenge your 
further statement that this means software can never have 
volition."

If it doesn't have choice, or even a choice in its choices, it 
cannot have volition.  It is not aware that it does not have a 
choice.  That piece of "magic" which exists at least in human 
organisms does not exist in software, nor have we any means of 
putting it there.  At least not until we know what it is that we 
have to put.<g>  I suspect that we will find it intrinsic to the 
organism and therefore non-transferable.<g>

" All known neural pathways can be modeled in software. It's a 
statistical process, but so is the brain, from what we know."

"On the other hand, the process of brain-construction is by 
definition automatable, since the brain self-assembles from 
embryonic tissue."

You can simulate how a brain works down to the quantum detail 
and you still will not end up with a brain.  If you want to say 
that there is no difference between this "artificial" brain and 
a real one, then develop it in its entirety through procreation.
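
For concreteness, here is a minimal sketch (my own illustration, 
not part of the original exchange) of the kind of model the 
quoted claim has in mind: a simple "integrate and fire" neuron 
simulated in software, with invented parameters.

    # A toy leaky integrate-and-fire neuron: it accumulates input,
    # leaks a little each step, and "fires" when it crosses a
    # threshold.  It reproduces a firing pattern in software; the
    # argument above is that no amount of such detail makes it a
    # brain.
    def integrate_and_fire(inputs, threshold=1.0, leak=0.9):
        potential = 0.0
        spikes = []
        for t, current in enumerate(inputs):
            potential = potential * leak + current  # leaky integration
            if potential >= threshold:              # threshold crossing
                spikes.append(t)
                potential = 0.0                     # reset after a spike
        return spikes

    print(integrate_and_fire([0.3, 0.4, 0.5, 0.1, 0.6, 0.7]))

Scale that up as far as you like and you have, at best, the 
"artificial" brain in isolation.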

Now meld it with the remainder of the human system.  Without this 
remainder, without a system in place, the brain has no function 
and in fact can do nothing.  The brain, the nervous system, the 
blood system, the organs, the skeleton, the muscles, the skin--all 
exist as a single entity, all interconnected.

The fascination with the brain and with emulating it in software 
deliberately "abstracts" it from the system of which it is a 
part.  There's no way that such an abstraction, i.e. selectively 
leaving something out, implemented (if that were even possible) 
in software, will result in an artificial brain that "behaves" 
like a real one.

If you don't go for the abstraction, then you must go for the 
whole ball of wax, the human system.  The brain is not 
constructed according to the rules of logic.  Nothing so 
constructed can ever be a brain.  That is true even for the most 
sophisticated and detailed simulation.  There's no crossover 
point, no point at which the artificial acquires a "property" 
(or "properties") and becomes the real thing.

All computers are based 100% in pure logic.  No software that 
executes successfully can violate the rules of its host 
computer.  It's 100% logic-based.  No organism, from the 
single-celled amoeba on up, is so constructed.  Logic is our 
creation, not vice versa.

I should remind you of the difference between automatic and 
automated.  The self-assembly you refer to, which affects not 
the brain alone but the entire organism, is automatic.  If it 
were automated, then its source would have been another process 
different from the current one.