Re: Websites consistently failing
- From: Katsumi Yamaoka <yamaoka@xxxxxxx>
- Date: Tue, 02 Sep 2003 08:31:36 +0900
- X-ml-name: emacs-w3m
- X-mail-count: 05688
- References: <E19tt0w-0002jJ-00@ethereal>
Hi,
>>>>> In [emacs-w3m : No.05687]
>>>>> "Nolan J. Darilek" <nolan@thewordnerd.info> wrote:
> There are certain websites (everything in the axkit wiki,
> http://www.axkit.org/wiki/view/AxKit/LiveSites for instance) which I
> can't visit using emacs-w3m 1.3.6 without receiving the following
> error:
> Reading http://www.axkit.org/wiki/view/AxKit/LiveSites...done
> Can't decode encoded contents: http://www.axkit.org/wiki/view/AxKit/LiveSites
> Cannot retrieve URL: http://www.axkit.org/wiki/view/AxKit/LiveSites (exit status: 0)
I got the same error for that page.  The contents are compressed
with gzip but seem to be corrupted: gzip returned the message
``gzip: stdin: unexpected end of file'' and exit code 1.
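If you want to check this yourself from Emacs, a minimal sketch
along these lines should do; /tmp/livesites.gz is just a placeholder
for a saved copy of the raw response body, not a real file:

;; Feed a saved copy of the compressed body to the same decoder
;; command and look at its exit status.
(with-temp-buffer
  (insert-file-contents-literally "/tmp/livesites.gz")
  (call-process-region (point-min) (point-max) "gzip" t t nil "-d"))
;; => 1 when the gzip stream is truncated, which is why emacs-w3m
;;    reports that it can't decode the encoded contents.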
> This also happens on several "normal" websites after a while; I'll
> get the above error on a site that worked fine only a few moments
> earlier, and until I kill and restart my Emacs session, the site
> refuses to load.  This is under w3m 0.4.1 and Emacs 21.3, if that
> matters at all.
> Any ideas about what might be happening?
Well, we can make gzip always return exit code 0 as follows,
though it isn't the right solution. ;-)
;; Wrap gzip in a shell so emacs-w3m always sees exit status 0,
;; even when the compressed data is truncated.
(setcdr (assq 'gzip w3m-decoder-alist)
        '("/bin/sh" ("-c" "gzip -d 2>/dev/null; exit 0")))
--
Katsumi Yamaoka <yamaoka@jpl.org>