Lynn H. Maxson
Mon, 17 Jul 2000 20:00:05 -0700 (PDT)
"What strikes me as strange is that sentence. What
do you mean by "in construction"? Do you mean
"part by part"?"
I'm sorry. Normally I would just say
"programming". I borrow this terminology from a
large client, Southern California Edison, who uses
it in their training of contract programmers who
need to learn the "SCE" way of doing things. Where
you see "construction", substitute "programming".
However, the point again is that only in
"programming" do we not allow the processing of an
entire application system as a whole. We insist
instead on breaking it up into "compilable pieces",
i.e. programs, rather than allowing the compilation
of all programs in the application system as a
single unit of work from a single unit (set) of
specifications.
While I apologize for the "construction" instead of
"programming" designation, it does not change the
fact that implementers dictate the scope of
compilation, in most instances (as I don't know
all) to something that corresponds to a single
external procedure (in C and PL/I). If you only
want to compile and test a single statement, you
must somehow wrap it up into an acceptable "program
form". If you do not do so, chances are it will
terminate with a severe error, in effect chastising
you because it cannot adapt to your needs.<g>
"This almost makes sense to me -- would I be
correct in guessing that you made a typo here, and
you should have written "input and output
specifications"? Given that correction, this not
only makes sense, but even seems like a good
thing."
No typo. The only specifications are those written
by people, thus input specifications only. The
system does not generate output specifications of
any type. Nevertheless it is still a "good thing".
"You're being silly. People who work in Prolog are
still, by and large, called "programmers". This
practice will continue for the foreseeable future.
Furthermore, you're still going to need
programmers if you're ever going to implement your
specification system -- it won't specify itself
until it's written."
I can't be held responsible for the fundamental
error of those who developed Prolog.<g> That they
implemented it in the form of program development
instead of software development was a mistake. Had
they incorporated the results of analysis and
design as well as those of specification and
programming we might very well have C, C++, and
JAVA tools produced by "true" integrated software
development.
Further, had the Prolog people taken advantage of
their lack of block structure, i.e. external and
internal procedures, they could have easily allowed
the set of input specifications to determine the
scope of compilation. I am not responsible for the
mistakes of others.<g>
As far as the writing of my system is concerned it
will be entirely in SL/I. You are correct in that
the "initial" implementation will be in a different
language (probably PL/I). All further versions
will be SL/I only.
"I don't see how both problem analysis and solution
design can be produced from the specs; usually the
specs are produced at the end of the Analysis
stage."
Well, you and I use different CASE tools.<g> Even,
it appears, different software development
processes. In mine specifications only appear as
the output of the specification stage which
precedes the analysis and design stages.
What you may have with a set of specifications is
"chaos". To bring order out of chaos you add new
specifications. All the system does is reflect the
chaos and its continued progress to "perfect order"
based on the implications of the specifications.
If you remember way back in our discussion, all
machine-independent specifications used an explicit
means of "invoked name" references which determines
their hierarchical arrangement. While we cannot
speak to the internal processing within a process,
e.g. a functional reference, we certainly can speak
to the data interface which joins them.
This set of data interfaces and their
(hierarchical) flow order determine the dataflow
diagrams of analysis. In the same manner they
determine the functional hierarchy of structure
charts of design. Thus specifications serve as
proper input in the analysis and design stages as
they do for programming (construction).
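A minimal sketch of the idea may help. All the
names below (billing, lookup_rate, etc.) are
invented for illustration; the point is only that
one set of specifications, each listing the names
it explicitly invokes, yields both the dataflow
edges of analysis and the functional hierarchy of
design:

```python
# Each "specification" here is reduced to a process
# name plus the names it explicitly invokes.  Both
# views below are derived from this single set.
specs = {
    "billing":  ["read_usage", "compute_charge", "print_invoice"],
    "compute_charge": ["lookup_rate"],
    "read_usage": [],
    "lookup_rate": [],
    "print_invoice": [],
}

def dataflow_edges(specs):
    """Edges of the dataflow diagram: invoking process -> invoked process."""
    return [(p, q) for p, callees in specs.items() for q in callees]

def structure_chart(specs, root, depth=0):
    """Print the functional hierarchy implied by the invoked names."""
    print("  " * depth + root)
    for callee in specs.get(root, []):
        structure_chart(specs, callee, depth + 1)

edges = dataflow_edges(specs)
structure_chart(specs, "billing")
```

The same dictionary drives both functions; nothing
beyond the explicit name references is required.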
Now I'm sure you are still confused and thus a
slight pause might prove useful. The five-stage
software development process that I refer to
constantly has been with us since the beginning of
programming. As defined it is a seamless process
throughout. The sad fact is that in our entire IT
history we have never implemented it as defined.
More to the point (and to borrow one of your
favorite words) it would be "impossible" to
implement given any combination of existing tools.
Instead we live in a programmer-centric (and
dominated) IT universe. The most frequently
expressed feeling is that if you ain't programming,
you ain't doing anything. Nothing irritates a
programmer more than to have some "up front"
process try to dictate terms. Programmers regard
writing as a creative process and deeply resent
(and resist) anyone in any way impinging on their
creativity.
The result is I doubt if a complete set of formal
specifications (in the sense used here) has ever
occurred, regardless of the insistence, for
example, in certain federal, state, and local
contracts. It doesn't happen as no formal
specification language which differs visually from
a programming language exists. If you have such a
language, you don't need to rewrite it in a
programming language. That's the lesson of Prolog
(even though they botched the rest of it<g>).
Not only are specifications incomplete, but no
matter how complete the analysis (dataflow) and
design (structure chart) no effective enforcement
exists in IT staffs that requires compliance by
programmers. The programmer will tell you (as they
have told me on numerous occasions) that those
other people are know-nothings with respect to
programming and thus what they supply cannot be
used as such.
Even though we have ISO 9000, which is supposed to
place
greater emphasis on the up front stages and OO has
put UML into the up front arena, nobody but nobody
controls what a programmer does. Programmer
management per se is non-existent. More than once
in an emergency situation the programmer says,
"Just leave me alone and I will get it done."
I think very simply. I have two logic programming
products, Prolog and Trilogy. Both go directly
from specifications to executable. I don't doubt
your confusion when I talk about producing the
results of analysis and design from specifications
as well. If you have never experienced a complete
set of specifications, and chances are no one
reading this ever has, then you may have some
difficulty believing that they are necessary and
sufficient for the task.
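To make the "directly from specifications to
executable" point concrete, here is a toy sketch
(not Prolog itself, and the family facts are
invented): in logic programming the specification,
facts plus rules, is itself the thing that runs.
A naive one-step rule application stands in for
the inference engine:

```python
# Facts: the "specifications" as written.
facts = {("parent", "tom", "bob"), ("parent", "bob", "ann")}

def grandparent_rule(facts):
    # grandparent(X, Z) :- parent(X, Y), parent(Y, Z).
    derived = set()
    for (r1, x, y1) in facts:
        for (r2, y2, z) in facts:
            if r1 == r2 == "parent" and y1 == y2:
                derived.add(("grandparent", x, z))
    return derived

# "Executing" the specification adds its implications.
facts |= grandparent_rule(facts)
```

No separate programming step intervenes between
writing the rule and executing it.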
Specification is an ill-defined process, hardly
more than an informal description, in most IT
accounts. Having a formal specification language
into which to translate user requirements which
eventually get retranslated into a formal
programming language, would seem to anyone a
duplication of effort. Since you have to have the
programs anyway, up to now most specifications
have never been formally introduced except in the
form of program source.
Prolog changes that as it eliminates the
duplication of programming effort. In fact it
eliminates the programming effort entirely. It
does not eliminate the writing of specifications
which by and large programming (and programmers)
have discouraged as duplicate effort.
Now you may say that Prolog is a programming
language as well. I can only respond that it was a
basic error of the Prolog advocates. They chose
to ignore supporting the software development
process in its entirety in favor of "emulating" the
small piece (programming) assumed by their
predecessors.
Now a compiler does a number of different processes
like syntax analysis, semantic analysis, proof
theory, and meta theory. I borrow these particular
terms and their use from Peter Flach's "Simply
Logical: Intelligent Reasoning by Example". Once
semantic analysis has occurred all the "names" and
their "relationships" used are known. This permits
us in the proof theory to capture each root
assertion (set of main goals) and use it along with
the explicit name references invoked throughout to
create the hierarchy representing the logical
organization. In Prolog currently only one such
root assertion (set of goals) is allowed. The
system I propose allows multiple such roots and
thus creates
multiple separate hierarchies (possibly connected
through data stores).
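A sketch of the multi-root idea, with invented
names throughout: a root assertion is simply a
name no other specification invokes, so one
specification set can yield several independent
hierarchies:

```python
specs = {
    "payroll":  ["read_time", "compute_pay"],
    "billing":  ["read_usage", "compute_charge"],
    "read_time": [], "compute_pay": [],
    "read_usage": [], "compute_charge": [],
}

# Roots: names that appear in no invocation list.
invoked = {c for callees in specs.values() for c in callees}
roots = sorted(name for name in specs if name not in invoked)

def reachable(specs, root):
    """Collect the hierarchy reachable from one root."""
    seen, stack = set(), [root]
    while stack:
        n = stack.pop()
        if n not in seen:
            seen.add(n)
            stack.extend(specs.get(n, []))
    return seen

# One hierarchy per root assertion.
hierarchies = {r: reachable(specs, r) for r in roots}
```

Here two roots fall out of the same set, each with
its own complete hierarchy.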
Each hierarchy contains all the information (data)
needed to produce the dataflow diagrams and
structure charts as well as the executable code. I
hope that clears up how specifications can suffice
to produce those results.
"...How about programs? What do you mean by saying
that I claimed this "impossible"?"
Here I can only offer you your own words.
>"Er -- no, it won't. Any programmer you ask to
>code that algorithm for you will either drop to
>the floor laughing or will be coding until he's
>dead. Solving the problem that way is
>impossible."
It's the same quote you started this response off
with.
"I'm having trouble parsing this. None of what you
describe is unique to dataflow analysis; every
field of work uses division of effort and
encapsulation, including programming. I just don't
know what you're talking about."
Well, you see, it is the opposite of decomposition
and encapsulation. It is recomposing and processing
as a whole all its pieces simultaneously. A
dataflow diagram, for example, can contain all of
the programs associated with an application as a
data-connected whole. Current compilers do not
support this "global" view.
" I fail to see that as a problem to be fixed; on
the contrary, it's our livelihood."
I understand this view. I can understand that its
existence appears as a form of job security. For
our users, however, it puts them in a game of
impossible catch up to meet the demands of their
users. The proposed system when implemented will
devastate the IT profession, reducing its
population to just a fraction of its current size.
With luck it will come close to achieving Claude
Shannon's prediction that 10 system programmers
would supply all the code on this planet. That's
the significance of a minimum "50 times time
reduction, 200 times cost reduction". If you
depend on demand
exceeding supply and do not desire to compete for
the remaining business, you do have an interest in
it not happening.
"This is not the usual description of what has been
long known as "The Software Crisis". The usual
description is that we have too little control
over the quality of what we produce: e.g. we
promise an accounting system, but produce instead a
If you regard the problem of the backlog as
insoluble (the first crisis), then you might regard
the quality of what you actually produce as a
second crisis. The first drove the mass migration
to object-oriented technologies (which have not
solved it). OO in turn has shown that quality is not
its strong suit either.
The fact is that almost every IT failure is
commonly blamed in some manner on specifications,
some slip twixt the tongue and the lip. No one
comments that you can probably count the number of
IT accounts that do formal (and complete)
specifications worldwide on the fingers of one
hand. It's always easy to blame something you
didn't do as opposed to what you did.<g>
If anything as a profession, IT is excellent at
generating crises, exceedingly less so when it
comes to solving them.
"I've never once heard anyone suggest that
"distribution of processing" (or the more usual
term, which is "cohesion") was the sole or even
the major cause of our problems."
The problem in maintenance was having to make the
"same" change in multiple locations. If not
distribution of processing (where the changes had
to be made), then the distribution of required
changes. The truth is that OO does not solve this
and cannot until the scope of compilation includes
all the programs as a single unit of work.
I understand what you mean by "cohesion" and will
not dispute it here.
"Huh? Implementation? Do you mean "our process"?"
No, our tools. These are our implements. Thus our
"implementation".
"I don't recall defining the software development
process as seamless, and I don't understand how
using an editor would be equivalent to creating a
seam. Extreme Programming is pretty close to
seamless, but they use editors."
I could probably help your recollection by
referencing any text on software development. In
it you will see a diagram (a dataflow diagram)
something like this:

  specification -> analysis -> design ->
  construction (programming) -> testing

That flow is known as the "classical waterfall"
process and is defined (and has been since
inception) as seamless. Granted it has never been
implemented as such (and that is the problem).
Implement it as defined, produce a tool set that
supports it in a seamless manner, and today's
problems will disappear.
I know nothing of Extreme Programming except I
suspect they have no interest in the software
development process, only the programming
development one (the one that occurs entirely
within the programming stage). I would probably
not even look at it as it appears another attempt
in the belief that optimizing a sub-process will
automatically optimize the entire process.
Every time they have a "programming solution" I know
they do not understand the problem.<g>
"I think I understand this, and I think it sounds
good. I also think that when you say "input
specifications" you're not talking about
specifications which describe the input of the
user's desired program; you're actually
describing specifications which are about to be
input into your compiler. Am I at last correct?"
"I don't understand your point about being able to
introduce changes faster than they occur. Doesn't
mean anything to me. I intuit, though, that the
point is minor."
User change requests enter the system at a dynamic
rate. The goal (if I actually said it that way) is
not to introduce changes faster than they occur,
but to respond to change requests as fast as they
arrive. There is no sense in being stupid here; it
is enough to assume that in the aggregate for any
system there is an average arrival rate over a
given interval, say three months, six months, a
year. Thus the response rate only has to meet this
average to institute the changes over the same
interval.
In order to handle sporadic "bursts" in this
arrival rate it is necessary to have capacity in
excess of that needed for "average". This is
simply a matter of capacity planning that we engage
in with performance and tuning of computing
systems. The goal of having excess capacity means
not only servicing changes as they arrive, but at
the same time relieving the backlog that has
accumulated.
So it is good to be able to respond to changes
faster than their occurrence.
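A back-of-the-envelope version of that capacity
argument, with all the numbers invented: if the
response rate exceeds the average arrival rate,
the surplus steadily drains the accumulated
backlog.

```python
arrival_rate = 40   # change requests arriving per month (assumed)
service_rate = 50   # requests the staff can handle per month (assumed)
backlog = 120       # requests already waiting (assumed)

surplus = service_rate - arrival_rate   # capacity beyond the average
months_to_clear = backlog / surplus     # time to drain the backlog
```

With these figures the excess 10 requests per
month both keep pace with new arrivals and retire
the backlog in a year.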
"Oh, did you know that most users can't picture the
entire solution to their problem? (Most
programmers can't, either.) Your specification
language would have to solve that as well."
No. It is not a language problem. It is a tool
problem. In producing the results of analysis and
design from the input set of specifications, two
different views (pictures) are presented by the
tool. All this from one set of specifications.
"I'm lost here."
Well, you shouldn't be lost here. It's our old
buddy of "explicit" references invoked within
machine-independent specifications that determine
their hierarchical ordering. It's the same thing
that occurs in any block-structured language,
including C and PL/I.
"I understand this. Oddly enough, I just a month
ago wrote a program on a mailing list which did..."
Great. That means it is no longer impossible.<g>
"Processes into a process? Wouldn't this sentence
be better had it never been born?"
"When did I assign that impossibility?"
Again I reference the quote which begins your last
response.
In wrapping this up I am beginning to sense that we
are coming closer to an understanding, if not
agreement, on most issues. We have no way in a
linear text form of representing a gestalt
experience, of getting what I have in my head
properly into yours. Mind reading would save a lot
of time, but then again it would eliminate
disagreements. No fun if you can't argue.:-)