8 bits..
Dr. J. Van Sckalkwyk (external)
SCHALKW@odie.ee.wits.ac.za
Mon, 19 Dec 1994 17:33:50 SAT
Dear Mike
About the 8 / 16 / 32 bit thing!
I've thought long and, well, fairly hard about it,
especially the text-processing side of things. (This
does not, of course, necessarily make my observations
any more valid.)
If we allow only 32 bits in our system architecture,
there is obviously an enormous performance hit on
a Z-80 or a 6502; somewhat less, but still huge,
on an 8088; less on a 386; and (correct me if I'm
wrong, I don't have a P5 [thank god?]) even less on
a Pentium. Can someone tell me about RISCs, the 68000
series, etc.?
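To illustrate the hit, here is a rough C sketch (my own,
not real Z-80 code) of what a single 32-bit add costs an
8-bit ALU: four byte-wide adds chained through the carry,
plus the surrounding loads and stores.

#include <stdint.h>

/* Add two little-endian 32-bit values one byte at a
 * time, the way an 8-bit CPU has to. */
void add32_bytewise(const uint8_t a[4], const uint8_t b[4],
                    uint8_t sum[4])
{
    unsigned carry = 0;
    for (int i = 0; i < 4; i++) {  /* low byte first */
        unsigned t = (unsigned)a[i] + (unsigned)b[i] + carry;
        sum[i] = (uint8_t)(t & 0xffu);
        carry  = t >> 8;           /* propagate the carry */
    }
}

A 32-bit ALU does all of that in a single instruction.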
One question is: is the added efficiency of allowing
8-bit and/or 16-bit "items" worth the added complexity?
My gut feeling is "NO", especially as things seem to
be getting more and more "32-bit streamlined" as we
"progress"! We may find that adding in 8- and 16-bit
"items" will actually _decrease_ our efficiency, due to
the added overhead of more testing for data types, etc.,
especially on the machines where we want optimum
performance, i.e. 32-bit machines.
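A hypothetical sketch in C of the overhead I mean (the
names and layout are my own invention): once "items" may
be 8, 16 or 32 bits wide, every generic access pays for
a width tag and a branch, where a uniform 32-bit item is
just a load.

#include <stdint.h>

enum width { W8, W16, W32 };

struct item {
    enum width w;      /* tag recording the payload width */
    uint32_t payload;  /* value held in the low-order bits */
};

/* Mixed widths: test the tag on every fetch. */
uint32_t fetch_mixed(const struct item *it)
{
    switch (it->w) {
    case W8:  return it->payload & 0xffu;
    case W16: return it->payload & 0xffffu;
    default:  return it->payload;
    }
}

/* Uniform 32 bits: no tag, no branch, just the load. */
uint32_t fetch_uniform(const uint32_t *p)
{
    return *p;
}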
Another problem involves memory, especially with word
processing. I have a massive dislike for using more
memory where less will do. The thought of using 4x as
much memory in text-processing applications (every
8-bit character stored as a 32-bit word) is _NOT_ very
appetising, but I really believe that this extra "waste"
is justified by the simpler coding. Also, memory is
becoming less and less expensive, and certainly, current
(ugly, big, clumsy) applications seem to be using more
of it anyway - I think mainly because of their clumsy
design! I would rather "design well" and end up using
similar amounts of memory, even though everything is
32 bits!!
((One question - a dream, perhaps - is: "how many gates
(etc.) could they have cut out of the 386 (or whatever)
if it had only dealt in 32-bit instructions?" Just a
RISCish thought!))
This may not be an entirely convincing argument, but I
would go for simplicity, and try things out. If we find
that the overheads associated with using exclusively
32 bits are too great, _then_ increase the complexity,
rather than starting with a more complex system.
Whichever way we do it, it will almost certainly be
a real _pain_ to go back and recode extensively, but
I think that making a simpler start will facilitate
re-coding if we (unhappily) have to go back and do so!
(With " exclusively 32 bits", I don't _believe_ we will
have to go back & introduce 8 / 16 bits)!!
In terms of data transfer, I do not see the 32-bit
coding as a big problem, because if we are transferring
"ASCII" where "A" is represented as hex 00000041 rather
than 41, it is easy (for example) to establish a
convention whereby we signal that <the following is all
"32-bit pseudo-ASCII", packed 4 characters per 32 bits>,
or "merely" apply compression to reduce the volume. I do
not see either as involving a massive overhead.
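A minimal C sketch of that packing convention (the
function names are mine): pack four 8-bit characters
into one 32-bit word before transfer, and unpack on
receipt, so "32-bit pseudo-ASCII" need not quadruple
the wire volume.

#include <stdint.h>

/* Pack c[0]..c[3] into one word, c[0] in the low byte. */
uint32_t pack4(const unsigned char c[4])
{
    return  (uint32_t)c[0]
         | ((uint32_t)c[1] << 8)
         | ((uint32_t)c[2] << 16)
         | ((uint32_t)c[3] << 24);
}

/* Recover the four characters from a packed word. */
void unpack4(uint32_t w, unsigned char c[4])
{
    c[0] = (unsigned char)( w        & 0xffu);
    c[1] = (unsigned char)((w >>  8) & 0xffu);
    c[2] = (unsigned char)((w >> 16) & 0xffu);
    c[3] = (unsigned char)((w >> 24) & 0xffu);
}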
Bye, JVS