some general comments on science and engineering (Re: What I've been doing)

Jano John Akim Franke j.j.a.franke@t-online.de
Mon, 8 Sep 1997 12:02:42 +0200


Dear TUNES-people,

Patrick Premont's email inspired me to post some general comments about the
way projects are done today.

We all know what we dislike in existing computing systems, but the solution
IMHO should not merely be to eliminate those flaws. This post is meant to
show why there may be another way.

Through many (man-)years of development of both hardware and software, new
things have almost always depended on the older ones, either staying
compatible with them or adding a few features and dropping fewer; the very
basic underlying technology has always stayed the same. This is reasonable,
because why should you think about how to do basic things that everyone
before you has done successfully? You can rely on long-proven methods to
get a fast start for your project, without wasting time on things that are
assumed optimal and therefore beyond change.

The point is: maybe they are not as optimal as you think. As time went by,
new things (features) were invented, so computing systems are not really
what they were in the 60s; but why are we still using the same principles
as our grandfathers? They worked fine for their computing machinery, but as
we can see, they cause problems in today's.

I see a threat in solutions that do not question the very basic roots of
this still-young science. If we proceed as we do, there is no question that
we may end up with a sub-optimal solution, because what we do works like a
heuristic: we fix problems of the current technology to get a better one,
leaving part of it as it was at the beginning of the process. This is (not
fully, but) like a heuristic search through a solution tree. Another field,
business science, will tell you that this - admittedly somewhat less
strict - procedure may yield a sub-optimal solution.
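To make the heuristic argument concrete, here is a tiny sketch (the
"quality" landscape and all names are invented purely for illustration, not
taken from any real project): a greedy fix-what-you-have search stops at
the first point where no local patch improves things, even when a far
better solution exists elsewhere.

```python
def hill_climb(f, x, neighbors):
    """Greedy improvement: repeatedly patch the current solution."""
    while True:
        better = [n for n in neighbors(x) if f(n) > f(x)]
        if not better:
            return x  # local optimum: no patch improves it further
        x = max(better, key=f)

# A toy "quality" landscape: a local peak at 2, the global peak at 7.
quality = {0: 1, 1: 3, 2: 4, 3: 2, 4: 1, 5: 3, 6: 8, 7: 9, 8: 5}
step = lambda x: [n for n in (x - 1, x + 1) if n in quality]

print(hill_climb(quality.get, 0, step))  # prints 2 - stuck, never reaches 7
```

Starting from the existing technology at 0, patching only ever reaches the
local peak at 2; the global peak at 7 is reachable only by questioning the
starting point itself.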

So what is the solution? It is _not_ to blindly depend on other things
without questioning them. Maybe they do not suit your use, because their
mostly undocumented requirements do not fit yours. I know this is very hard
to accomplish in science, as there is so much to consider - which is the
reason this system of building on prior work was invented - but if some
people do it, it will be worth it, especially if you think about forgery in
science (but that is another subject). When questioning things, the easiest
way out is to take the existing solution and say "this is optimal, there is
nothing else I can think of". That is the wrong way. IMHO there is almost
always another way.

----------
I would like to invite you to do a little experiment. The first point is
the most important(!):

	1 IMPORTANT: Forget everything you know about "computing systems".
Take a few minutes for this test.

	2 Determine what you would like to use this technology for. This does
not mean "I would like another button in my word processor that does
[...]". Go far beyond that.

	3 Think about ways to build technology that fulfills the result of
point 2.

	4 Compare the results of points 2 and 3 to current technology.
----------
My reaction to this is a radically different vision of a "computing
system". If you come to some "aha" effects, or if you have found something
new, please comment here.

For instance, you could question why we should use formalisms for our
computing system at all, as implicitly proposed by Patrick Premont:

----------
| from: Patrick Premont <premont@vis.toronto.edu>
| to: TUNES <tunes@ens.fr>
| subject: What I've been doing
| date: 1997-09-04T19:08+02:00
| 
| [...] logical formalisms to represent actions and their effect.
| [...] We can provide a way to logically represent world
| phenomena, and from that define programming languages to
| accommodate whatever needs we have.

IMHO, in their flat form, formalisms are abstract representations to be
interpreted by programs running on computing systems. Why aren't these
"interpretations" of the system-to-be abstracted as well? Why should they
be abstract without being communicated? Think about this.

Have you ever thought about fractals beyond the nice graphics calculated by
your computer? If we translate self-similarity to our subject, we may come
to the conclusion that the outward appearance of our proposed system should
be, in a way, a macroscopic form of its inner microscopic functions. Can
you see this in today's computing systems?
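As a purely illustrative aside (the code and figure are my own addition,
not from the TUNES project), self-similarity can be stated in a few lines:
a figure built so that the whole is literally composed of smaller copies of
itself, the same rule applied at every scale.

```python
def sierpinski(n):
    """Sierpinski triangle as text: level n is three half-scale
    copies of level n-1 - the same rule at every scale."""
    if n == 0:
        return ["*"]
    prev = sierpinski(n - 1)
    pad = " " * len(prev)                       # height of prev = half its width
    top = [pad + row + pad for row in prev]     # one copy, centered on top
    bottom = [row + " " + row for row in prev]  # two copies side by side
    return top + bottom

print("\n".join(sierpinski(3)))
```

The macroscopic shape directly mirrors the microscopic construction rule,
which is the property the paragraph above asks about.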

The American term "computer science" describes traditional science in this
field better than the German word "Informatik", because the field treats
not "information" but "computers" as its central subject. In my opinion, we
should focus on information first, since what we try to do with computers
is process, transport, and store it. But what have you heard about
information while studying (if you have)? Do a query in your favorite
library and you will find far more literature on computers, on software
technology, and - even worse - on specific implementations of it than on
the properties of information.
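The email does not name which "properties of information" it means; as one
classical example of such a property (my own addition), Shannon entropy
measures the information content of a message independently of any computer
that stores or transports it.

```python
from collections import Counter
from math import log2

def entropy(message):
    """Shannon entropy in bits per symbol: a property of the
    information itself, not of the machine holding it."""
    counts = Counter(message)
    total = len(message)
    return -sum(c / total * log2(c / total) for c in counts.values())

print(entropy("aaaaaaaa"))  # 0.0 - fully predictable, carries no information
print(entropy("abababab"))  # 1.0 - one bit of information per symbol
```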

So if you question computer science in order to arrive at another
"information science", think about the very basic foundations, like
"programming", "processor", or "memory" in their "traditional" form.

I hope I did not bore you with these comments, but everyone I told this to
was - after defending traditional computing for a while - impressed that
this had not been thought of before. I hope this stimulates you to think,
and - of course - comments are very welcome.

With kind regards

Jano John Akim Franke
----------------------------------------------------------------------
Franke & Wedegärtner GbR | Fon +49-511-16490-40; Fax -88 (G3)
Jano John Akim Franke    | mailto:j.j.a.franke@infology.int.eu.org
Schaufelder Str. 27      | http://home.t-online.de/home/j.j.a.franke
DE-30167 Hannover        | >>> reinventing computing project <<<