[gclist] Re: Articles expiring too fast!
Nick Barnes
nickb@harlequin.co.uk
Tue, 28 Jan 1997 15:27:39 +0000
> Newsfeeds look like a job for compression. Many times
> someone posts an article and gets a dozen replies in
> which large parts of the original article are repeated.
> Some specialization of L-Z would cut the size down
> dramatically. It needs a way of picking out only what you
> are looking at: when you start reading rec.humor.funny,
> the articles would be decompressed for you. Each article
> would point to what it descended from as its compression
> source; that would also preserve the savings on cross-posts.
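The quoted scheme amounts to delta compression: compress each reply against the article it descends from, so quoted text turns into back-references into the parent. A minimal sketch using zlib's preset-dictionary support (the function names and framing are illustrative, not from the original post):

```python
import zlib

def compress_reply(reply: bytes, parent: bytes) -> bytes:
    """Compress a reply using its parent article as a preset dictionary.

    Any text the reply quotes from the parent shrinks to short
    back-references into the dictionary instead of being stored again.
    """
    c = zlib.compressobj(level=9, zdict=parent)
    return c.compress(reply) + c.flush()

def decompress_reply(blob: bytes, parent: bytes) -> bytes:
    """Reverse of compress_reply; the reader must hold the parent too."""
    d = zlib.decompressobj(zdict=parent)
    return d.decompress(blob) + d.flush()
```

A reader following the pointer chain would fetch the parent first, then expand each descendant on demand; cross-posted copies sharing the same parent pay only the small delta.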
Except that almost all news (by volume) is binaries and pictures,
which have already been compressed (in the case of pictures, already
compressed by a domain-specific algorithm such as GIF or JPEG).
Now if you had an even more specific compressor, which identified
pornographic scenes and compressed them by content ("leg here, grimace
there"), and then reconstructed them from a small library of images,
you could probably cut down the bandwidth considerably.
2/3 :-)
More seriously, large ISPs need smarter caching algorithms. Most
news is almost certainly write-once, which makes it the easy case
for a cache.
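Because articles never change after posting, such a cache needs no invalidation logic at all; the eviction policy is the whole game. A toy least-recently-used cache for article bodies (the class and method names are hypothetical; a real spool would also weigh article size and group popularity):

```python
from collections import OrderedDict

class ArticleCache:
    """LRU cache for immutable news articles, keyed by message-id."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._store = OrderedDict()  # message-id -> article body

    def get(self, msgid):
        if msgid not in self._store:
            return None
        self._store.move_to_end(msgid)  # mark as most recently used
        return self._store[msgid]

    def put(self, msgid, body):
        if msgid in self._store:
            self._store.move_to_end(msgid)
        self._store[msgid] = body
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict least recently used
```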
More seriously still, unmoderated news is dying a slow death from
Sturgeon's Law. When some plausible mechanism for charging for
internet use is devised, this problem might go away. In the meantime,
I stopped reading unmoderated news some time ago. If I were running an
ISP, I would try charging less for accounts with limited news access
(no alt.*, for a start).
Nick B