Patterns and Compositionality

btanksley@hifn.com
Mon, 31 Jan 2000 13:46:53 -0800


> From: Francois-Rene Rideau [mailto:fare@tunes.org]

> > I didn't say it was impossible -- I said that it's not easy.

> I didn't say it was easy with lambda-like formal calculi.
> I said it was as easy/difficult with such calculi as with any 
> formal calculi, including sequent calculi that you like.

No, it's _not_ equally easy.  Other formalisms are easier or harder.  Be
realistic: all things are not equal.  Some proof systems are better at some
things than other proof systems.  Lambda calculus is really nice for
name-based systems; other formalisms are better for other systems.

It so happens that computers don't use names in any of their operations, so
lambda calculus is a very poor fit.  The only reason it's so popular, IMO,
is that most people mistake the programming language for the computer.
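To make the "no names" point concrete, here's a toy sketch (my own, in Python; not drawn from any of the calculi under discussion): the same function written lambda-style, where every value is reached through a bound name, and then as a name-free instruction sequence for a tiny stack machine, where values are reached only by position -- which is much closer to what the hardware actually does.

    # Name-based: lambda-calculus style; every intermediate value has a name.
    f = lambda x, y: (x + y) * x

    # Name-free: a toy stack machine; values are addressed by position only.
    def run(program, *args):
        stack = list(args)              # arguments are pushed left to right
        for op in program:
            if op == "over":            # copy the second item to the top
                stack.append(stack[-2])
            elif op == "add":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "mul":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
            else:                       # anything else is a literal to push
                stack.append(op)
        return stack.pop()

    # (x + y) * x with no variable names anywhere in the program.
    assert run(["over", "add", "mul"], 3, 4) == f(3, 4) == 21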

> > proofs involving the resulting messes are a nightmare.

> Not any _more_ than with any calculus for imperative programming.
> Imperative programming IS a nightmare, stop.

I wasn't only talking about imperative coding, but the problem _is_ typified
by it.  Computers operate imperatively, stop.  If that gives you nightmares,
change careers.

> Using an imperative calculus as a "primitive" calculus
> is an abstraction inversion.

In that case we need more abstraction inversions.  That is to say, as nice
as abstraction is, abstractions grow proportionally closer to useless as their
distance from the thing they're modelling increases.

> > Nonsense!  Computers and even turing machines handle those _easily_.

> Not easily. The semantics of computers and turing machines is 
> quite contorted.

When looked at from the POV of lambda calculus, yes.  This is why lambda
calculus is so inadequate.

> >> Friends have done proofs of imperative programs (heap sorting, and more)
> >> in Coq (a logic system based on pure typed lambda-calculus); it's possible,
> >> but it's certainly not straightforward.

> > Have you seen the proofs on similar systems written using abstract state
> > machines?  The proofs are readable by anyone with a reasonable amount of
> > perseverance.

> So are proof sketches in frameworks such as the one used by Coq.

I was afraid you'd say something like that, because different people have
different definitions of what's 'reasonable'.  My point was that proofs
using ASMs are a _lot_ easier to read than typical proofs using only lambda
calculus or combinators.  (Keep in mind that ASMs usually simply reduce the
problem to one which is easy to solve using other tools, such as
combinators.)
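To give a flavor of what I mean by readable, here's a toy sketch (my own, in Python; not taken from the ASM pages) of the classic sort-by-exchange Abstract State Machine.  The whole program is one guarded rule whose updates fire together, so the things you reason about in the proof -- the guard, and the fact that each step removes an inversion without changing the multiset of elements -- are visible right there in the rule.

    # Sort-by-exchange as an Abstract State Machine, roughly:
    #   if a(i) > a(i+1) for some i, then a(i) := a(i+1), a(i+1) := a(i)
    # (both updates applied simultaneously); the machine halts when no
    # rule is enabled.
    def asm_exchange_sort(a):
        a = list(a)
        while True:
            enabled = [i for i in range(len(a) - 1) if a[i] > a[i + 1]]
            if not enabled:                     # no rule enabled: halt
                return a
            i = enabled[0]                      # choose an enabled location
            a[i], a[i + 1] = a[i + 1], a[i]     # simultaneous update set

    print(asm_exchange_sort([3, 1, 2]))         # [1, 2, 3]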

> The part that's not straightforward is going from the proof sketch
> to the precise, machine-checkable proof. Having worked with proof checkers
> for an imperative framework (B, a dialect of Z), I can tell
> that the latter is certainly not simpler.

Using many modern calculi, certainly not.  Using ASMs it's a LOT easier;
take a look at any of the _many_ proofs on the ASM pages.

> > Fare', you're too hung up on formality.  Not everything has a formal
> > definition!
> Not everything. But everything that can be usefully used
> for technical collaboration over e-mail.

Do you know what the Visitor design pattern is?  If so, then you can easily
discuss a design which uses it.  We can do that because we have a general
pattern in common, given by a book.  (We could also mention that the Visitor
pattern is not needed in languages which support multiple dispatch -- but
they in turn make other design patterns possible which weren't possible in
C++ or Smalltalk.)
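In case it helps to have the shape of the thing in front of us, here's a minimal Visitor sketch (my own toy example in Python, not lifted from the book).  The accept/visit pair is hand-rolled double dispatch, which is exactly the machinery a language with multiple dispatch would make unnecessary.

    class Literal:
        def __init__(self, value):
            self.value = value
        def accept(self, visitor):              # dispatch on the node type...
            return visitor.visit_literal(self)

    class Add:
        def __init__(self, left, right):
            self.left, self.right = left, right
        def accept(self, visitor):              # ...then on the visitor type
            return visitor.visit_add(self)

    class Evaluator:
        """One visitor: evaluate the expression tree."""
        def visit_literal(self, node):
            return node.value
        def visit_add(self, node):
            return node.left.accept(self) + node.right.accept(self)

    class Printer:
        """Another visitor: print the tree; the node classes stay untouched."""
        def visit_literal(self, node):
            return str(node.value)
        def visit_add(self, node):
            return f"({node.left.accept(self)} + {node.right.accept(self)})"

    expr = Add(Literal(1), Add(Literal(2), Literal(3)))
    print(expr.accept(Evaluator()))             # 6
    print(expr.accept(Printer()))               # (1 + (2 + 3))

The point of the pattern is that adding a new operation (a new visitor) never touches the node classes; the cost is that adding a new node type touches every visitor.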

> > "Design Patterns" are like "sewing patterns". Opposing them is
> > like opposing sewing patterns because they're only shapes, 
> > and geometricians have known about shapes all along.
> In as much as there is a meaning in "Design Patterns", I do not oppose them
> (see below). What I oppose is the very approach of hyping around
> "Design Patterns", and presenting it as something that's new
> and the be all end all of programming methodology,

Then we are agreed.  I also hate that.  At the same time, I highly approve
of attempts to document, describe, and name existing patterns.  That ongoing
effort makes design with the help of a group possible, and gives some
assurance that the design has at least some quality to it.

> whereas it's plain old and has been
> the very basis of programming since the very beginning (long known as
> "pseudo-code").

Um...  No.  Not even close.  The old system this augments is simply
nomenclature.  (And it doesn't replace or imitate, it only augments.)

> Similarly, just because I oppose "OO",
> "programming by contract", "<FOO> Methodology", "Java",
> and the rest of the industrial hype doesn't mean that I reject
> the good things about them (that none of them brought about,
> but instead borrowed from better previous works).

I don't oppose any of them.  I accept them and use them.  Just because they
imitate something older doesn't automatically mean they do it any worse.

> > A realistic definition of a "pattern" in that sense is "how it's been done
> > before".  Perhaps the words "role model" serve better.
> No.

You offer your own definition below, but before I attack it ;-) I'd like to
hear why you think my definition stinks so bad you don't even bother to
rebut it.

> A realistic definition of a "pattern" is meta-information
> (that you can alternatively consider as snippets of metaprograms
> for an arbitrarily declarative language).
> Yes, meta-information is useful. But just giving it a brand new name
> and hyping around informally instead of building formal tools to take
> advantage of it, is not progress but a brake to progress.

Ah, Progress!  The magic word of the 18th century.  Well, I've been rebuked;
I'm an impediment to progress.  In that case, I'm obviously an impediment to
Good, because Progress is Good because Good is Progress.  (Okay, I got that
out of my system.)

Design Patterns are a _specific_ type of metainformation about a specific
type of information.  You're objecting to a perfectly good specific word
simply because a vague, general word exists?  If you object to the name
"Design Patterns" on that basis, you should also object to "Binomial
nomenclature" in zoology, because it's also metainformation.

> >>>  Hype is the entire basis of our devotion to Tunes,
> >>>  and Fare is responsible for that.
> >>>  So hype can be used for good as well as evil ;-).

> >> No comment.
> > It's one of the more important things I said.

> > Patterns are useful and expedient.  TUNES is also useful,
> > but not always expedient.  Hype claims to be useful, but is never
> > expedient.  TUNES is closer to hype than patterns are.

> Meta-comment: I am convinced that Tunes is not hype,
> but I admit I haven't given a proof of it yet;
> thus I reserve a comment for later (when I am no more convinced,
> or have given a suitable proof).

By the definition of Hype and the definition of Tunes, Tunes is hype: it's a
lot of talk about something which doesn't exist.

> And yes, I'm responsible for all
> of Tunes' shortcomings.

Perhaps.  I'd like to point out that you're responsible for all of Tunes'
good things, too.  And those cannot be minimized -- if it weren't for the
Tunes project, many good things would not exist.  These things were not
brought about by formality; they were brought about by a collection and
organization of information in a way which allowed humans to understand
them.  _That_ is the purpose of Design Patterns, and _that_ is also the area
in which Tunes has been useful.

But I just said that Tunes is hype.  How do I reconcile that with my belief
that the Tunes project is already useful?  It's simpler than it looks: in
reality, the best things a group can do start with hype, as the leader of
the group encourages the members to look ahead to the future, to what
amazing things are possible (but not yet real).  You're very good at it, and
without that no group can survive.


-Billy