Emergence of behavior through software

Alik Widge aswst16+@pitt.edu
Wed, 04 Oct 2000 06:28:50 -0400


(Apologies to Lynn, who gets this twice because I screwed up the headers 
the first time.)

--On Sunday, October 01, 2000 7:17 PM -0700 "Lynn H. Maxson" 
<lmaxson@pacbell.net> wrote:

> It depends upon who is executing the rule set.  If it is you or I
> deciding that we don't want to operate within those constraints,
> then we can choose otherwise.  I don't know what the process is
> within "living" organisms that allows this.  We have it, computers
> and software don't.  As Fare has admitted all software executes in
> a manner consistent with its embedded logic.

I agree --- currently, humans have volition and software doesn't. However, 
given that we don't know what causes volition, why do you believe that it 
is guaranteed not to be generable algorithmically?

> Here you posit an impossible situation, a system deciding that it
> can become an organism.  Organisms don't have that choice.

I was being, in fact, metaphorical. :-) You can rephrase this as "If a 
programmer wants me to call his creation an organism, it should be 
self-sustaining."

> an organism.  It is a problem with a computer and software in that
> neither start as an organism.  No matter how you mix, mash, meld,
> and merge them if you don't start with an organism, you don't end
> up with one.<g>

However, we do start with an organism: the human programmer. I argue that 
whatever magical things are passed through sexual (or asexual) reproduction 
may also be passed through programming. After all, both are just an 
exchange of information.

Also, at some point, there were no organisms on Earth. An infinitesimal time 
later, there was at least one. This seems to contradict your statement that 
organisms may not spontaneously arise, unless you'd care to introduce God. 
(Even then, where the heck did he come from?)

> One passing note.  Artificial means not real.  AI means now and
> forever more not real intelligence, but something else altogether.
> No matter what we do to it or with it, it will never cross the
> line: it will remain artificial.

But the word "artificial" is used very loosely; it can also mean 
"manufactured" or "not naturally arising", as in the term "artificial color 
and flavor". (Or are those tastes and sights somehow not real?)

> understand.  However, it does not change the fact that two
> non-organisms cannot join to form an organism.  Two "wrongs"
> cannot make a "right".<g>

You were arguing, though, that software couldn't be an organism because 
it's dependent on its hardware. I'm saying that there could be a software 
organism, with the hardware playing the same role that the planet does for 
us.

> Software is not an organism.  That's the long and the short of it.

Well, if you're just going to assume that, I'm not exactly able to argue 
it, am I? :-)

> Software viruses work because they receive control from the
> processor and execute a "behavior" consistent with their embedded
> rules.  Nothing changes.

But why does that make them not an organism? Why can organisms not be 
algorithmic?

> I don't want to touch this one.  I am somewhat disappointed that
> your contact with other humans and organisms hasn't introduced you
> to processes not duplicable by a Turing machine.  I don't know

Please give me an example of something humans do that Turing machines 
don't, then. I've put this to my profs in both comp. theory and AI, and 
they couldn't provide an answer. (Note that emotions and volition and such 
definitely don't count --- since we don't know what causes these, we cannot 
prove that they are not reducible to a TM.)

> We should hear more about what you think is the central theme
> common to both of them.  In biology it is survival.  In evolution
> it is survival of the fittest.

And for a computer organism, would it not also be survival?

The "central theme" I'm alluding to is that the brain very well may be a 
TM. A very odd one, quite different from the state machines we think of, 
but it nonetheless may be one.

> It can never be free of the actions of the programmer, regardless
> of the programmer's ability to predict all possible outcomes or
> understand them completely.  It does no more than what the
> programmer told it.

All right, but we're never free of the laws of physics. So what?

> Seems fair (or even Fare).<g>  Humans do not "obey" mathematically
> expressible rules.  Mathematics is one of a multitude of human

Again, what else would you call physics? I haven't seen anyone who can 
choose to break that. Yes, some of those rules are statistical, but they 
are nonetheless mathematical. (Moreover, if the rules are statistical, then 
this simply means the system they govern is a nondeterministic state 
machine, and we already know that any NFA can be converted to an equivalent 
DFA if one is willing to suffer the potentially exponential blowup in 
states.)
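
To make that conversion concrete, here is a minimal sketch of the subset 
construction in Python. The NFA encoding (a dict mapping (state, symbol) 
pairs to sets of states) and all the names are my own, purely for 
illustration:

from itertools import chain

def nfa_to_dfa(alphabet, delta, start, accepting):
    """Convert an NFA (no epsilon moves) to an equivalent DFA.

    delta maps (state, symbol) -> set of next states. Each DFA state
    is a frozenset of NFA states; in the worst case there are 2^n of
    them, which is the blowup mentioned above.
    """
    start_set = frozenset([start])
    dfa_delta = {}
    worklist = [start_set]
    seen = {start_set}
    while worklist:
        current = worklist.pop()
        for sym in alphabet:
            # The DFA successor is the union of all NFA successors.
            nxt = frozenset(chain.from_iterable(
                delta.get((q, sym), set()) for q in current))
            dfa_delta[(current, sym)] = nxt
            if nxt not in seen:
                seen.add(nxt)
                worklist.append(nxt)
    return dfa_delta, start_set, {s for s in seen if s & accepting}

# Example: an NFA over {0, 1} accepting strings that end in "01".
delta = {("a", "0"): {"a", "b"}, ("a", "1"): {"a"}, ("b", "1"): {"c"}}
table, start, accept = nfa_to_dfa("01", delta, "a", {"c"})
print(len({s for s, _ in table}))  # 3 DFA states here; up to 2^n in general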

> You'll do better with just one.  Unfortunately software is a
> fragile beast, overly sensitive to "illogic", and prone to
> failure.  It is one thing to put monkeys in front of typewriters
> where whatever order of letters is acceptable.  That's simply not
> true of software.

I chose a few hundred because I wanted to make the probability come out 
right. Do you concede the point, then, that a program may be generated 
through random opcode-picking?
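
To make the thought experiment concrete, here is a toy sketch (entirely my 
own construction, not anything Lynn proposed): a tiny stack machine whose 
"monkeys" pick random opcodes until a program computes x + x. It also 
happens to illustrate Lynn's fragility point --- most random programs 
simply crash:

import random

OPS = ["DUP", "ADD", "SUB", "MUL", "PUSH1"]

def run(program, x):
    """Run a program on a toy stack machine; None means it crashed."""
    stack = [x]
    for op in program:
        try:
            if op == "DUP":     stack.append(stack[-1])
            elif op == "ADD":   stack.append(stack.pop() + stack.pop())
            elif op == "SUB":   stack.append(-stack.pop() + stack.pop())
            elif op == "MUL":   stack.append(stack.pop() * stack.pop())
            elif op == "PUSH1": stack.append(1)
        except IndexError:      # stack underflow: the "illogic" case
            return None
    return stack[-1] if stack else None

random.seed(0)
tries = 0
while True:
    tries += 1
    program = [random.choice(OPS) for _ in range(3)]
    if all(run(program, x) == 2 * x for x in range(5)):
        break
print(tries, program)  # finds a doubling program in a few dozen tries

Only about 2 of the 125 possible length-3 programs work, which is exactly 
why the number of monkeys (or tries) has to be chosen to make the 
probability come out right.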

> pursue than the current one.  The difference (perhaps) is that as
> an organism we are aware that we are following rules.  Software,
> being non-intelligent as well as a non-life form, is not aware
> that it is even following rules.

Ah. So where does awareness come from, and why is that also not algorithmic?

> If it doesn't have choice or even a choice in its choices, it
> cannot have volition.  It is not aware that it does not have a

But you're using a very narrow definition of choice. I don't agree that 
"conform or don't conform to the opcodes" is the only choice available. 
This is, to me, like saying that the only choices currently available to me 
are "shoot myself or don't".

> putting it there.  At least until we know what it is that we have
> to put.<g>  I suspect that we will find it intrinsic to the
> organism and therefore non-transferable.<g>

Hm. Is your argument, then, not so much that software with will/awareness 
is impossible, as that it is impossible right now? I certainly cannot argue 
that; this is why my heart sinks every time I see another "We're going to 
solve AI!" effort announced. I personally am not convinced that awareness 
is intrinsic to the human brain, but that again veers into my personal 
theology.

> You can simulate how a brain works down to the quantum detail and
> you still will not end up with a brain.  If you want to say that

Why not?

> there is no difference between this "artificial" brain and a real
> one then develop it in its entirety through procreation.

Why is procreation so key, if the artificial brain functions just like the 
real one?

> Now meld it with the remainder of the human system.  Without this
> remainder, without a system in place, the brain has no function
> and in fact can do nothing.  The brain, the nervous system, the

Ah, the embodiment hypothesis. I agree that a brain is obviously useless 
without I/O, but I don't think the I/O has to be a body as we know it. If 
we understand a brain well enough to make one, we also understand sensory 
coding well enough to let it see through cameras, hear through microphones, 
and so on. We could transduce the directory listing of the hard drive on 
which it resides directly to its optical inputs and go from there.
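
As a toy illustration of that transduction step (the names and the encoding 
are my own invention --- there is no brain model here, just the I/O idea):

import os

def listing_to_stimulus(path=".", width=64):
    """Encode a directory listing as rows of bits, one "scanline" at a
    time, as if painting it onto a retina of the given width."""
    data = "\n".join(sorted(os.listdir(path))).encode("utf-8")
    bits = [int(b) for byte in data for b in format(byte, "08b")]
    bits += [0] * (-len(bits) % width)   # pad to a full final row
    return [bits[i:i + width] for i in range(0, len(bits), width)]

frames = listing_to_stimulus()
print(len(frames), "scanlines of 64 'pixels' each")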

> There's no way that such an abstraction, i.e. selectively leaving
> something out, and implementing it (if it were possible) in
> software will result in an artificial brain that "behaves" like a
> real one.

Why not? If you remove the brain from the body and "fool" it by making 
sure that all its inputs are fed data (of any kind) and all its outputs are 
consumed, why is it not behaving properly?

> whole ball of wax, the human system.  The brain is not constructed
> according to the rules of logic.  Nothing so constructed can ever
> be a brain.  That is true for the most sophisticated and detailed

Again, why not? And what makes you say that the brain is not logically 
constructed? There are fairly rigidly defined systems of connection within 
and between all its subparts. These connections vary slightly between 
individuals, but we've seen that all humans share the same basic cognitive 
processes, so those variations are really just noise.

> computer.  It's 100% logic based.  No organism from the
> single-cell amoeba on up is so constructed.  Logic is our
> creation, not vice versa.

But an amoeba cannot choose to violate the rules of its own internal 
workings any more than I may grow wings or a program may start executing 
invalid opcodes.

> I should remind you of the difference between automatic and
> automated.  The self-assembly you refer to, which does not affect
> the brain alone but the entire organism, is automatic.  If it were
> automated, then its source would have been another process
> different from the current one.

Noted. I don't see how it makes a difference, though, as long as the end 
product is the same.