Are you serious?
Lynn H. Maxson
lmaxson@pacbell.net
Wed, 12 Jul 2000 00:54:21 -0700 (PDT)
"Are you saying we want to make machines think?
That's a very attractive proposal, to tell the
truth, but also one which doesn't seem to me to
offer much hope of success."
Look, I'm almost sorry I brought it up. My point
was that there was reflective programming (as
defined by Fare for Tunes) and then there was
reflective programming (as defined within
cybernetics). Having become thoroughly enraptured
by W. Ross Ashby's "Design for a Brain" and
"Introduction to Cybernetics", I'm prepared to go
where computer scientists fear to tread.<g>
While I slowly pore through some of Fare's
writings, the most recent being "Reflection,
Non-Determinism, and the lambda-Calculus", I am
coming more and more to the conclusion that he
should call it something other than "reflection".
He has yet to introduce an implementation which is
other than (externally) rule-based. He is very
careful to point this out.
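To make the distinction concrete (this is my own
illustration, not an example taken from Fare's
papers), what I mean by externally rule-based
reflection looks something like the hypothetical
Python fragment below: the program examines and
rewraps its own behavior, but only according to a
rule handed to it from outside.

    import inspect

    def greet(name):
        return "Hello, " + name

    # An externally supplied rule: wrap a function so
    # that it prints its own source text before running.
    # The program "reflects" on itself, but only because,
    # and only in the way, the rule tells it to.
    def apply_logging_rule(fn):
        def wrapped(*args, **kwargs):
            # the function inspects its own code
            print(inspect.getsource(fn))
            return fn(*args, **kwargs)
        return wrapped

    greet = apply_logging_rule(greet)
    print(greet("Tunes"))

The choice to reflect, and the range of that
reflection, both come from outside the system;
nothing here generates its own rules.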
For me the reading is like piercing a fog and thus
I may miss some important connection. I accept the
responsibility for any misstep. I may be all alone
in this. What he says and how he says it may be
absolutely clear to everyone else, in which case I
will pursue operational definitions on my own until
my understanding is somewhere close to that of
others.
To say that a system reflects when it has no choice
in the matter seems a stretch to me. I can
only reflect when it is my choice to do so. When
an external agent demands that I reflect on my
behavior, thinking, or whatever, I do so within a
range which they have defined. Otherwise they will
continue to make the same demand until I do.<g>
In my view, reflection to the degree that Fare
suggests lies beyond our rule-based means. I have
no more idea of what that would be than I do of
what thinking is, or whether anything corresponds
to it. We have no means that I am aware of by
which a system can establish its own rules of
reflection through spontaneous generation, as
occurs in humans. Neither computers nor software,
with their 100% basis in logic, have this
capability.
In that respect "reflection" as defined by Fare is
extremely limited: certainly not what is envisioned
by cybernetics, nor what is actually demonstrated
by Ashby. That's not a crime, but it is confusing
unless circumscribed by an operational definition,
a metric, which states not only what it does
pertain to but also what it does not.
I would say the same holds for the other
requirements. It's very difficult to evaluate any
Tunes HLL candidate in any "objective" (and thus
scientific) manner with exclusively qualitative
measures. We should have a quantitative basis so
that we can all agree on the measures, even if we
disagree on what we are measuring.<g>