Refactoring in Tunes

Francois-Rene Rideau fare@tunes.org
Mon, 10 Jan 2000 22:31:36 +0100


>> As I understand it, refactoring is just a very particular case of
>> metaprogramming, with lots of hype around, and meant to be 
>> done manually only
> No!  Emphatically and strongly NO.  Refactoring is indeed a subcase of
> metaprogramming, but that says nothing about whether we should discuss it;
> metaprogramming itself is a special case of computer science, and we
> certainly discuss THAT.
Yes, but in this case, the fact is that if you try to study refactoring
without studying metaprogramming in general, you just get a bunch of ad-hoc
rules. Metaprogramming (in the form of compiler theory) tells us that
the transformations involved in optimizing (or otherwise improving) a program
can take a very long detour before you arrive back at a "same-semantics"
program; imposing the short-sighted constraint that every intermediate
transformation preserve semantics will only get in the way. Oh, and once you
remove that local constraint, you have full metaprogramming. Again, inasmuch
as "refactoring" is any good, it is by teaching principles of (as general as
possible) metaprogramming. The rest is the usual OO hype.

> Refactoring is a useful discussion as compared to metaprogramming, because
> refactoring is about teaching people how to control their source code.
Metaprogramming is about teaching people how to transform their source code
into better source code, without introducing stupid arbitrary constraints like
"this code must keep the same interface as the previous one" that you must
later clumsily lift in no less arbitrary ways ("oh, after all, we can change
this interface"). Instead of giving people ad-hoc ways to go from program A to
program A, teach them general ways to go from program A to program B.

> Automatic refactoring will only replace manual refactoring when computers
> become as intelligent as humans.  Until then, all we'll have is _assisted_
> refactorings.
Optimizing compilers are arguably automatic refactoring machines.
I admit that, to begin with, I would be happy with a metaprogramming assistant.

>> Massimo already talked about lambda-abstraction and beta-expansion:
>> concepts indeed difficult to express, if at all, in languages such as
>> C, C++, Java, but trivial in functional programming languages such as
>> LISP, FORTH, ML, Haskell, etc.
> ...and irrelevant to computer science in general.
As Massimo said, generalizing a pattern IS lambda-abstraction,
and inlining a pattern IS beta-expansion. Perfectly relevant.
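
To make the correspondence concrete, here is a minimal Haskell sketch
(the names sumSquares, useA, etc. are made up for illustration):
abstracting a repeated pattern into a function is lambda-abstraction,
and inlining a call back into its body is beta-expansion.

  -- Two concrete expressions share a pattern:
  --   sum (map (\z -> z * z) [1,2,3])   and   sum (map (\z -> z * z) [4,5,6])

  -- Lambda-abstraction: generalize the repeated pattern into a function.
  sumSquares :: Num a => [a] -> a
  sumSquares zs = sum (map (\z -> z * z) zs)

  useA, useB :: Int
  useA = sumSquares [1, 2, 3]
  useB = sumSquares [4, 5, 6]

  -- Beta-expansion: inline the call, substituting the argument into the body.
  useAInlined :: Int
  useAInlined = sum (map (\z -> z * z) [1, 2, 3])

  main :: IO ()
  main = print (useA, useB, useAInlined)   -- (14,77,14)

Read right-to-left this is the usual "extract a function" refactoring;
read left-to-right it is inlining; both are the same beta rule applied
in opposite directions.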

> Lambda calculus has been
> a huge drag on computer science (I assert), and the sooner we get rid of it,
> the better.  I posted a link to the 'Joy' page a bit earlier; take a look
> for a study of computation theory completely without lambda calculus.
Joy looks like a (postfix-notation) combinator language, the kind we
know to be equivalent to the lambda-calculus. (Seen unlambda lately?)
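
For the flavor of that equivalence claim, here is a minimal Haskell sketch
of the classic combinatory-logic argument: the S and K combinators alone
already express lambda-terms, e.g. the identity function as S K K.

  -- The two basic combinators of combinatory logic.
  k :: a -> b -> a
  k x _ = x

  s :: (a -> b -> c) -> (a -> b) -> a -> c
  s f g x = f x (g x)

  -- The identity function, with no lambda written anywhere: i = S K K.
  i :: a -> a
  i = s k k

  main :: IO ()
  main = print (i 42)   -- prints 42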

>> The fact that such operations can be done before runtime leads us into
>> the realm of "partial-" or staged evaluation. Combining them leads us to
>> substitution, and finding the right substitution is unification (which
>> can get pretty difficult for higher-order terms).
>
> And none of this is relevant without lambda calculus.  Pretty neat, huh?
Of course they are relevant! Staged evaluation is the very principle
behind automatic metaprogramming. Partial evaluation has been done for
lots of languages, including C and Java (though mostly with LISP, Prolog,
Scheme, ML, and the like). Substitution is essential in all logic programming
and rewriting logic systems, not to mention unification.
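
As an illustration of the idea (the toy Expr type and the names are
hypothetical, not any particular partial evaluator), here is a minimal
Haskell sketch: specializing a power function on a statically known
exponent unrolls the recursion and leaves a residual program over the
still-dynamic base.

  -- A tiny expression type for residual (specialized) programs.
  data Expr = Var String | Lit Integer | Mul Expr Expr
    deriving Show

  -- power: the exponent is static, the base stays symbolic.
  power :: Integer -> Expr -> Expr
  power 0 _ = Lit 1
  power n x = Mul x (power (n - 1) x)

  -- Specializing on exponent 3 unrolls the recursion "at compile time":
  --   Mul (Var "x") (Mul (Var "x") (Mul (Var "x") (Lit 1)))
  powerOf3 :: Expr
  powerOf3 = power 3 (Var "x")

  main :: IO ()
  main = print powerOf3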

>> If you allow for more meta-level functionality (who talked about
>> differentiating functions to compute normal vectors for 3D shapes?),
>> then you can "refactor" the code in even simpler ways; etc.
> Could you clarify this?  I don't really see how differentiating a function
> could result in a refactoring, much less a simpler one.
If you can automatically differentiate functions, then instead of having
a pattern solve_by_newton_raphson_method(f,f',seed), you'll just have a
pattern solve_by_newton_raphson_method(f,seed). No more need for manual
differentiation (or for keeping f' up-to-date as f changes). Neat, huh?
And of course, you can have these patterns be first-class iff you have
higher-order lambda-abstraction in your language (i.e. something akin to
lambda-calculus).
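
Here is a minimal Haskell sketch of that (all names hypothetical), using
forward-mode automatic differentiation via dual numbers, so the solver
takes only f and a seed, with no separately maintained f':

  -- Dual numbers carry a value together with its derivative; arithmetic
  -- on them implements forward-mode automatic differentiation.
  data Dual = Dual Double Double

  instance Num Dual where
    Dual a a' + Dual b b' = Dual (a + b) (a' + b')
    Dual a a' - Dual b b' = Dual (a - b) (a' - b')
    Dual a a' * Dual b b' = Dual (a * b) (a * b' + a' * b)
    fromInteger n         = Dual (fromInteger n) 0
    abs (Dual a a')       = Dual (abs a) (a' * signum a)
    signum (Dual a _)     = Dual (signum a) 0

  -- Newton-Raphson needing only f: the derivative comes for free.
  solveByNewtonRaphson :: (Dual -> Dual) -> Double -> Double
  solveByNewtonRaphson f = go (20 :: Int)
    where
      go 0 x = x
      go k x =
        let Dual fx fx' = f (Dual x 1)     -- f and f' evaluated together
            next        = x - fx / fx'
        in if abs (next - x) < 1e-12 then next else go (k - 1) next

  main :: IO ()
  main = print (solveByNewtonRaphson (\x -> x * x - 2) 1.0)   -- ~ sqrt 2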

> Interesting.  I think I see your point here -- each person can write their
> own refactoring metacode which provides their favorite view of the program,
> and the programming tools update every person's metacode to stay up with
> all changes to the program itself.  So, for example, I have a
> class-and-procedures based view, and I use it to modify the code; Brian logs
> in tomorrow, and sees my changes in his types-and-arrows view.
You get the idea.

[ "Faré" | VN: Уng-Vû Bân | Join the TUNES project!   http://www.tunes.org/  ]
[ FR: François-René Rideau | TUNES is a Useful, Nevertheless Expedient System ]
[ Reflection&Cybernethics  | Project for  a Free Reflective  Computing System ]
Well, the fact is, God exists and I'm His prophet. And my prophecy is that
you shouldn't believe in God, lest it be a blasphemy against His gift of a
brain to you. So don't believe in Him, lest you go to Hell!