Patterns and Compositionality

Massimo Dentico
Thu, 27 Jan 2000 20:08:30 +0100
> Massimo, why are all your messages sent with high importance?  ("X-Priority:
> 1" in the header.)

Because of the strange behavior of my ISP's POP3 server: sometimes
it delivers low-priority messages with days of delay, or does not
deliver them at all. If this causes problems, I can try to send
messages with normal priority (like this one).
> > From: Massimo Dentico []
> > Subject: Re: Patterns and Compositionality
> > Jim Little wrote:
> > The context is: "[..] If generalizing a pattern is lambda-
> > abstraction and inlining a pattern is beta-expansion [..]". So I
> > have made my premises explicit just to make an objection to my
> > reasoning feasible. But the intuitive definition of pattern
> > doesn't invalidate /a priori/ a more rigorous one.
> Lambda calculus is not even _close_ to powerful enough to handle design
> patterns -- it's not even powerful enough to easily handle function calls
> when the called function is permitted to modify variables (although it is
> sufficiently powerful to model either one, given some more complexity).  So
> you can't establish a one-to-one correspondence between operations in lambda
> calculus and pattern design or use.
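As an aside, the point above, that lambda calculus can model mutation only
with some added complexity, can be sketched in state-passing style. This is
my own minimal Python illustration, not something from the thread:

```python
# State-passing style: a "mutable" counter encoded purely functionally.
# Instead of modifying a variable, every operation takes the current
# state and returns a pair (result, new_state).

def incr(state):
    # "state = state + 1" becomes: return the new state explicitly.
    return (state + 1, state + 1)

def get(state):
    # Reading leaves the state unchanged.
    return (state, state)

s0 = 0
_, s1 = incr(s0)       # the "mutation" is threaded through by hand
_, s2 = incr(s1)
value, _ = get(s2)
assert value == 2      # two increments observed, no variable ever changed
```

The extra plumbing (threading the state through every call) is exactly the
"some more complexity" mentioned above.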

I am trying to say exactly what you have written, sorry for my
English: *if* the substitution process (generalizing/inlining) for
patterns in OOD is equivalent to the substitution process in
functional/logic languages (pattern matching) *and* Wegner is
right, *then* Lambda Calculus (LC) is an incomplete formalism even
for these languages (I want to recall that the substitution
process in LC becomes apparent in the laws of LC; thus they define
the calculus and *precede* it). At least, this is what I
understand about the matter: any misconceptions are attributable
only to me. The context is this:

[...]  Structured programming  for actions (verbs) can be formally
defined  by function composition,  while  structured  object-based
programming  for composite objects (nouns) is  modeled by **design
patterns** that have no compositional formal specifications. [...]
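The generalize/inline correspondence I started from can be made concrete
with a small sketch (my own Python illustration, assuming nothing beyond the
"generalizing = lambda-abstraction, inlining = beta-expansion" reading above):

```python
# Inlined form: the same "pattern" written out twice, with the
# varying parts (width, height) filled in directly.
area_floor = 3 * 4
area_wall  = 3 * 5

# Generalized form: lambda-abstraction over the varying parts.
area = lambda w, h: w * h

# Applying the abstraction (beta-reduction) recovers each inlined
# instance, so inlining is the inverse step: beta-expansion.
assert area(3, 4) == area_floor
assert area(3, 5) == area_wall
```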

> So far, patterns have not been successfully formalized.  One reason for this
> is that a true design pattern has to be commonly useful; if it's not, it may
> be a clever design, but it's not suitable for a pattern.
> > I suspect that Fare speaks about the "hype" (I'm not sure about
> > the meaning of this word) around the useful idea of pattern,
> > exactly because the lack of a precise definition prevents any
> > attempt at rigorous reasoning (for example, comparisons of
> > expressiveness with other frameworks).
> Fare senses most strongly other's crimes when he himself is guilty of the
> same thing.  Hype is the entire basis of our devotion to Tunes, and Fare is
> responsible for that.  So hype can be used for good as well as evil ;-).

OK, what does "hype" mean exactly?
> Patterns are useful.  This is one of the main parts of their definition, so
> they can't help it.
> The patterns used in C++ are not similar to the patterns used in C, and
> neither one looks like the design patterns used in Beta (although the C++
> patterns are mostly useful in Beta, the other way is not true; some C++
> patterns are useless in Beta because Beta can express the same concepts as
> part of the language).
> > Instead the problem is: does the rigorous definition catch the
> > whole intuitive definition? The paper of Wegner (as I understand
> > it) tries to answer this question relative to the notion of
> > computability, the intuitive compared to the formal notion. He
> > suggests that Multi-Stream Interaction Machines (MIMs) and
> > co-inductive  reasoning  give  a  stronger  model  of  computation
> > compared  to  Turing  Machines,   lambda  calculus  and  inductive
> > reasoning.
> A very interesting hypothesis.  I lost the link to it, I think; or I never
> read it.  Where may I look?

Home page of Peter Wegner:

I'm writing about this paper, cited in my original post that
originated this thread: "Interaction, Computability, and Church's
Thesis".
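As a loose illustration, my own sketch and not something from Wegner's
paper, the co-inductive flavor of interaction machines can be hinted at with
an infinite response stream: the machine is never a finished value, only
something you can keep observing, one interaction at a time.

```python
# An endless interaction, defined co-inductively: we never build the
# whole stream, we only observe a prefix of it.

def responses(state=0):
    while True:
        request = yield state    # environment sends a request...
        state = state + request  # ...machine updates its observable state

m = responses()
next(m)                  # start the machine (first observation: 0)
assert m.send(5) == 5    # each interaction depends on the whole history,
assert m.send(3) == 8    # not on a single precomputed input
```

A Turing machine maps a finite input to a finite output; here the meaning of
the machine lies in the unbounded dialogue itself, which is the intuition
behind the quoted claim.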
> > The intention of my original post was to highlight the fact that
> > if Wegner and my equation are right, then, even in the field of
> > functional/logic programming, the formal model of reference
> > (lambda calculus) doesn't catch the expressive power of these
> > languages.
> Which language?

For example, Haskell (functional) has pattern matching, and Prolog
(logic) has it as a fundamental mechanism that unifies procedure
invocation (procedural interpretation) with data construction
and deconstruction.
> Or were you trying to say that lambda calculus can't model everything a
> computer can compute?

This is what Wegner states. I want to highlight that even for the
languages that one supposes to be based on lambda calculus, this
formalism is not adequate to deal with important features of theirs
like pattern matching.

I hope that this post is understandable.

Massimo Dentico