One language, no subset.
Lynn H. Maxson
Sat, 24 Jun 2000 23:23:24 -0700 (PDT)
Kyle Lahnakoski wrote: "The second part, I do not
see your objection because you repeated it: 'One
language which covered the hardware and software
domains from the lowest level (hardware encoded
logic)...'" The 'one language' being HLL,
and the 'subset of HLL' being that part which
applies only to hardware.
One language. No subset. Not because it is not
possible. But because it is not necessary.
It is a specification language. It makes no
distinction between a software specification and a
hardware specification. They have the same look.
What separates them is the "level of abstraction".
You have two basic machine architectures, CISC and
RISC. Of the two, RISC is the lower level. That
simply means that it is possible to implement a
CISC architecture with a RISC. In manufacturing
terms your RISC instructions are raw material and
your CISC instructions are assemblies, consisting
of one or more RISC instructions.
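The assembly/raw-material analogy can be sketched in Python. The mnemonics and the mapping below are invented for illustration and describe no real instruction set:

```python
# Illustrative decomposition of hypothetical CISC instructions into
# sequences of simpler RISC-style micro-operations. The mnemonics are
# made up for this example; no real ISA is being described.
CISC_TO_RISC = {
    # CISC "add memory to register" = an assembly of load + add
    "ADD R1, [addr]": ["LOAD Rtmp, [addr]", "ADD R1, R1, Rtmp"],
    # CISC "memory-to-memory move" = an assembly of load + store
    "MOV [dst], [src]": ["LOAD Rtmp, [src]", "STORE [dst], Rtmp"],
}

def expand(cisc_program):
    """Flatten a CISC program into its RISC raw material."""
    risc = []
    for instr in cisc_program:
        # Instructions without an entry are already raw material.
        risc.extend(CISC_TO_RISC.get(instr, [instr]))
    return risc

print(expand(["ADD R1, [addr]", "MOV [dst], [src]"]))
```

Each CISC instruction is literally an assembly consisting of one or more RISC instructions, in the manufacturing sense used above.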
These define the two lowest levels of abstraction.
You can, if you like, refer to the specifications
here as machine-dependent. But that is only due to
your understanding of their place within the
hierarchy of levels that exists.
At the level above CISC we have our first
machine-independent level, the raw material of all
software. It consists of control structures
(sequence, decision, and iteration) and primitive
operators. All (machine-independent) levels above
this level consist of assemblies which contain
other assemblies and raw material. Aside from the
software irregularities of recursion and
co-routines, it behaves identically to any other
manufacturing process.
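A minimal Python sketch of that raw material, assuming an invented tree representation: the program is an assembly built from nothing but the three control structures and primitive operators:

```python
# The "raw material" of machine-independent software: sequence,
# decision, and iteration, plus primitive operators. Higher levels
# are assemblies of these. The node format is invented for the sketch.
def interpret(node, env):
    kind = node[0]
    if kind == "seq":                      # sequence
        for child in node[1]:
            interpret(child, env)
    elif kind == "if":                     # decision
        _, cond, then_part, else_part = node
        interpret(then_part if cond(env) else else_part, env)
    elif kind == "while":                  # iteration
        _, cond, body = node
        while cond(env):
            interpret(body, env)
    elif kind == "prim":                   # primitive operator
        node[1](env)

# An "assembly": sum the integers 1..5 using only the three structures.
program = ("seq", [
    ("prim", lambda e: e.update(i=1, total=0)),
    ("while", lambda e: e["i"] <= 5,
        ("seq", [
            ("prim", lambda e: e.update(total=e["total"] + e["i"])),
            ("prim", lambda e: e.update(i=e["i"] + 1)),
        ])),
])

env = {}
interpret(program, env)
print(env["total"])  # the sum of 1..5
```

The inner "seq" node is itself an assembly contained by the "while" assembly, which the outer "seq" assembly contains in turn.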
Again there is no subset. The same language that
describes (specifies) the machine architecture
differs in no way from that used to specify
software. I do not argue against anyone's use of
multiple languages. I only argue that it is not
necessary.
Free yourself of the programming paradigm. The
fact of the matter is that all computer hardware
and software has a 100% basis in pure logic. We
produce no illogical machines or software. If they
behave illogically, we deem it something to be
corrected.
Every one of you who has taken plane geometry or
algebra or symbolic logic or mathematical logic has
been exposed to everything that can occur in a
computing machine or software. Anyone who has
learned it in a textbook manner and used that
learning in the specifying and resolution of a
logical problem, e.g. decomposition of a binomial
or proof of a theorem, has been exposed to the only
language necessary: that of logic.
I offer a specification language capable of
expressing any logical expression. Thus it is
capable of expressing machine architectures and any
software. It is no better than any other
specification language that does the same. Which a
given user prefers is of no concern to me. I want
to enable whatever preference he exhibits.
I have run into expressions of disgust when using a
declarative form as an example. The argument is
that the user should not be bothered with such
knowledge; the proposed answer is to offer the user
a non-declarative language. My answer is to offer
both within a
single language, leaving it to the user to decide
which in what circumstances he "prefers" to use.
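Python happens to illustrate the point of offering both forms in one language: the same requirement can be stated declaratively or non-declaratively, and the choice rests with the user:

```python
data = [1, 2, 3, 4, 5, 6]

# Declarative form: state what is wanted.
squares_decl = [n * n for n in data if n % 2 == 0]

# Non-declarative (imperative) form: state how to obtain it.
squares_imp = []
for n in data:
    if n % 2 == 0:
        squares_imp.append(n * n)

# Both forms specify the same result.
assert squares_decl == squares_imp == [4, 16, 36]
```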
That's the problem when you fall back on "leaving
it up to the computer". It implies that the
language used by the computer differs from that of
the user. While it certainly happens, again the
issue lies in its necessity.
I have this opposition to arrogance on the part of
a tool author who wants to impose his system on the
user, in effect deciding for the user. I feel as a
tool builder that I want to enable the user to find
and do things his way. I don't know what's best,
nor for how long something will remain top dog in
the field.
There is no arrogance in a specification language
capable of any logical expression. Given the
existing tools of logic programming, specifically
the two-stage proof process of its logic engine
(completeness proof and exhaustive true/false), we
have a relatively simple (and trusted) means of
dealing with incompleteness, ambiguity, and
contradiction.
The secret lies not in avoiding them nor dictating
the order of their resolution, but in simply noting
them (and their reasons). When we are in
development we are by definition "in process" in an
incomplete state. Actually we pass through a series
of such states until eventually we arrive at a
complete state (at least for this version).
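The two-stage process can be sketched in Python over toy propositional specifications. The representation and names here are invented, not those of any actual logic engine: stage one notes a missing declaration as an incompleteness rather than a fatal error; stage two exhaustively evaluates every true/false assignment:

```python
from itertools import product

def check(declared, rule_vars, rule):
    """Toy two-stage logic engine for a propositional specification.

    Stage 1 (completeness proof): every variable the rule references
    must be declared; missing ones are simply noted, with the spec
    marked incomplete -- the "in process" state of development.
    Stage 2 (exhaustive true/false): enumerate all assignments and
    report which of them satisfy the specification.
    """
    missing = sorted(set(rule_vars) - set(declared))
    if missing:
        return {"complete": False, "missing": missing, "models": None}
    models = [dict(zip(rule_vars, vals))
              for vals in product([False, True], repeat=len(rule_vars))
              if rule(dict(zip(rule_vars, vals)))]
    return {"complete": True, "missing": [], "models": models}

# Incomplete state during development: "c" is referenced, not declared.
print(check(["a", "b"], ["a", "b", "c"], lambda e: e["a"] and not e["c"]))
# Complete state: exhaustive evaluation of "a implies b".
print(check(["a", "b"], ["a", "b"], lambda e: (not e["a"]) or e["b"]))
```

The first call does not fail; it records what is missing and why, which is exactly the "noting them (and their reasons)" above.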
For Soma who prefers working at the assembly
language level I would suggest that he do it with a
specification language based on logic programming.
In that manner the logic engine will generate all
possible logically equivalent lower-level forms of
a higher-level assembly. This means the generation
of "all" logically equivalent sequences of machine
instructions (specifications). That's more than he
could ever construct (and test) in his lifetime, as
well as better than the best code generated by the
best assembly language programmer.
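In the spirit of that claim, a brute-force Python sketch (over an invented three-instruction set, nothing like a real machine) enumerates every instruction sequence up to a length bound and keeps those logically equivalent to a higher-level target on a set of sample inputs:

```python
from itertools import product

# An invented instruction set for the sketch: each "instruction"
# transforms a single accumulator value.
OPS = {
    "INC": lambda x: x + 1,   # add one
    "DBL": lambda x: x * 2,   # double
    "NEG": lambda x: -x,      # negate
}

def run(seq, x):
    """Execute an instruction sequence on initial value x."""
    for op in seq:
        x = OPS[op](x)
    return x

def equivalent_sequences(target, max_len, samples=range(-4, 5)):
    """Enumerate every sequence up to max_len matching target on samples."""
    found = []
    for length in range(1, max_len + 1):
        for seq in product(OPS, repeat=length):
            if all(run(seq, x) == target(x) for x in samples):
                found.append(seq)
    return found

# Target "assembly": f(x) = 2x + 2. The engine finds every equivalent
# lower-level form within the bound, shortest first.
for seq in equivalent_sequences(lambda x: 2 * x + 2, max_len=3):
    print(seq)
```

Generating "all" equivalent forms and then keeping the shortest (or fastest) is the essence of what the logic engine is being asked to do, here at toy scale.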
What I am asserting here is the "normal" production
of executables from a logic-programming-based
specification language that run as fast as or
faster than the best assembly language program
written by anyone. No performance hit regardless of
the software's level of abstraction.
If you want to completely rethink the process, then