* wget.texi (Recursive Retrieval Options): Explained that you need
to use -r -l1 -p to get the two levels of requisites for a
<FRAMESET> page. Also made a few other wording improvements.
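(An illustrative example, not from the manual text itself, with a made-up URL:
for a page built from a <FRAMESET>, something like `wget -r -l1 -p
http://host/frames.html' grabs the frameset page, the documents named in its
<FRAME> tags -- the one level of recursion -- and then -p pulls in the images,
stylesheets, and other requisites of those frame documents, giving the two
levels of requisites mentioned above.)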
me to believe it wasn't was exposing a different bug -- URLs specified on the
command line, as opposed to ones reached by recursion, don't always get
re-converted at the end of the Wget run.
appropriate -I, -L, and -R/-rpath flags in environment variables,
manually. Automated everything, including bundling libtool so we can
successfully link with the OpenSSL shared libraries on just about any
platform.
NetWinder ARM Linux system (among others). Updated to 2001-03-16 vers.
* config.sub: Hadn't been updated since 1996 -- didn't work for
NetWinder ARM Linux system (among others). Updated to 2001-03-12 vers.
get from a file over HTTP (FTP only supports ranges ending at the end of the
file, though forcibly disconnecting from the server at the desired endpoint
might be workable).
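(For context, a sketch of the protocol difference, not a change to Wget: HTTP
lets a client ask for a bounded range with a header like `Range:
bytes=1000-1999', whereas FTP's REST command only sets the offset at which the
transfer restarts, so data keeps coming until end-of-file unless the client
drops the data connection at the point where it wanted to stop.)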
multiple ChangeLog files (currently ./ChangeLog, doc/ChangeLog, and
src/ChangeLog), since this is unusual and people have complained their patches
hadn't been applied after checking only the top-level ChangeLog.
removed. Hopefully all the failures I was seeing were due to the fact that it
wasn't documented that non-globbing, non-recursive FTP downloads need -N for
the remote timestamp to be preserved.
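(For example, with a hypothetical host and path, and per the behavior described
above: a plain `wget ftp://host/dir/file' leaves the local copy stamped with
the download time, whereas `wget -N ftp://host/dir/file' retrieves it with
timestamping and sets the local file's timestamp to the remote one.)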
* wget.texi: Moved -nr from "Recursive Retrieval Options" to "FTP Options" and
gave it a @cindex entry. Alphabetized FTP options by long option name.
* main.c (print_help): -nr belongs in "FTP options" section of --help output,
not "Recursive retrieval" section. Alphabetized FTP options by long option
name.
* Makefile.in (install): Do install.man if we have pod2man.
* Makefile.in: Build the wget man page and install it if we have pod2man.
Added some missing '$(srcdir)/'s. Added missing dependencies on install
targets (allowing you to just run `make install' rather than forcing you to do
`make && make install'). Also, Makefile rules should always use an output file
parameter, if the tool provides one, rather than redirecting stdout with '>';
otherwise a missing or failing tool still creates the target and falsely
satisfies the dependency -- fixed a call of texi2pod.pl that did this wrong.
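(To illustrate the principle with a generic sketch, not the actual rule: a rule
body like `sometool $< > $@' creates an empty `$@' via the shell redirection
even when `sometool' is missing or dies, so the next make run considers the
target up to date; writing it as `sometool -o $@ $<', or whatever output-file
argument the tool accepts, means a failed run leaves no bogus target behind.)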
* texi2pod.pl: Removed from CVS. Now automatically generated.
* texi2pod.pl.in: This new file is processed into texi2pod.pl, getting the
appropriate path to the Perl 5+ executable on this system and becoming
executable (CVS files, by contrast, don't arrive executable).
number of bytes at the end of a file before resuming download. Apparently, some
stupid proxies insert a "transfer interrupted" string we need to get rid of.
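A minimal sketch of the idea (hypothetical helper, not Wget's actual code):
before resuming, chop a caller-specified number of bytes off the tail of the
partial file so that junk a proxy appended isn't treated as real data.

    /* Hypothetical illustration only.  Truncate the last JUNK bytes of
       FILENAME, e.g. to discard a proxy's "transfer interrupted" message
       appended to a partial download, before resuming the transfer.  */
    #include <sys/types.h>
    #include <sys/stat.h>
    #include <unistd.h>

    static int
    chop_tail (const char *filename, off_t junk)
    {
      struct stat st;

      if (stat (filename, &st) != 0 || st.st_size < junk)
        return -1;            /* can't stat, or file shorter than JUNK */
      return truncate (filename, st.st_size - junk);
    }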
you in a directory other than "/"? I don't see a src/ChangeLog entry for
it. In any case, my testing shows that it's fixed in 1.7-dev, but TODO and
a comment in src/ftp.c were not changed to reflect this.
looking at the dates would make you think that things went into
1.6 that actually just went into the 1.7-dev branch. Added "[Not
in 1.6 branch.]" where appropriate to clarify.
* NEWS: Released Wget version 1.6.
* po/*.po: 'Project-Id-Version's were very haphazard, saying either "wget" or
"GNU wget", with versions of 1.5.2-b[124], 1.5.3, the nonexistent 1.5.4, and
1.6-pre. Standardized them all to "GNU Wget 1.7-dev". Perhaps this is wrong to
do, since some of the translations haven't been updated since the versions
they state, but I know some of the files were updated specifically for 1.6,
and none of them used that version. In any case, the 'POT-Creation-Date's and
'PO-Revision-Date's remain the best indicators of whether a translation is out
of date.
- use mmap() to read whole files in core instead of allocating memory
  and read()ing into them (see the sketch after this list).
- use a new, more general, HTML parser (html-parse.c) and interface to
it from Wget (html-url.c).
- respect <meta name=robots content=nofollow> (easy with the new HTML
parser).
- use hash tables instead of linked lists in places where the lists
were used to facilitate mappings.
- rewrite the code in host.c to be more readable and faster (hash
  tables instead of home-grown lists).
- make convert_links properly convert partial URLs to complete ones
for those URLs that have *not* been downloaded.
- use HTTP persistent connections where available. Very simple-minded:
  caches the last connection to the server.
Published in <sxshf533d5r.fsf@florida.arsdigita.de>.
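As an aside, a minimal sketch of the mmap() idea above (illustrative only, not
the eventual implementation): map an entire file read-only instead of
malloc()+read()ing it.

    /* Illustrative sketch: map a whole file read-only into memory.
       Returns the mapping (caller munmap()s it) or NULL on failure.  */
    #include <fcntl.h>
    #include <stddef.h>
    #include <sys/mman.h>
    #include <sys/stat.h>
    #include <unistd.h>

    static const char *
    map_whole_file (const char *name, size_t *size)
    {
      struct stat st;
      void *p;
      int fd = open (name, O_RDONLY);

      if (fd < 0)
        return NULL;
      if (fstat (fd, &st) != 0 || st.st_size == 0)
        {
          close (fd);
          return NULL;
        }
      p = mmap (NULL, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
      close (fd);                     /* the mapping outlives the descriptor */
      if (p == MAP_FAILED)
        return NULL;
      *size = st.st_size;
      return (const char *) p;
    }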
* TODO: We need to check the HTTP spec w.r.t. simplification of absolute URLs.
* MAILING-LIST: I didn't realize <wget@sunsite.auc.dk> allowed posting by
non-subscribers. <bug-wget@gnu.org> soon to be an alias for it.
* NEWS: I always forget to update this file when making user-visible changes.
* ftp.c (ftp_retrieve_list): Use new INFINITE_RECURSION #define.
* html.c: htmlfindurl() now takes final `dash_p_leaf_HTML' parameter.
Wrapped some > 80-column lines. When -p is specified and we're at a
leaf node, do not traverse <A>, <AREA>, or <LINK> tags other than
<LINK REL="stylesheet">.
* html.h (htmlfindurl): Now takes final `dash_p_leaf_HTML' parameter.
* init.c: Added new -p / --page-requisites / page_requisites option.
* main.c (print_help): Clarified that -l inf and -l 0 both allow
infinite recursion. Changed the unhelpful --mirror description
to simply give the options it's equivalent to. Added new -p option.
(main): Added some comments; handle new -p / --page-requisites.
* options.h (struct options): Added new page_requisites field.
* recur.c: Changed "URL-s" to "URLs" and "HTML-s" to "HTMLs".
Calculate and pass down new `dash_p_leaf_HTML' parameter to
get_urls_html(). Use new INFINITE_RECURSION #define.
* retr.c: Changed "URL-s" to "URLs". get_urls_html() now takes
final `dash_p_leaf_HTML' parameter.
* url.c: get_urls_html() and htmlfindurl() now take final
`dash_p_leaf_HTML' parameter.
* url.h (get_urls_html): Now takes final `dash_p_leaf_HTML' parameter.
* wget.h: Added some comments and new INFINITE_RECURSION #define.
* wget.texi (Recursive Retrieval Options): Documented new -p option.
added missing company names, removed needless ^L, made AIX entry more general
to reflect my testing, removed the non-factual "this version of", and fixed
some grammatical errors.
However, Brian McMahon <bm@iucr.org> wants the old incorrect behavior to still
be available as an option, as he depends on it to allow mirrors of his site to
send CGI queries to his original site, but still get graphics off of the mirror
site. Perhaps this would be better dealt with by adding an option to tell -k
not to convert certain URL patterns?
download a single HTML document and all its constituents.
* po/*.{gmo,po,pot}: Regenerated after adding new options.
* po/hr.po: Hrvoje forgot '\n's on his translations of my altered messages,
causing msgfmt to balk and `make install' to fail.
* wget.texi (Recursive Retrieval Options): In -K description, added a link to
the discussion of interaction with -N.
(Recursive Accept/Reject Options): Did some alphabetizing and added descriptions
of new --follow-tags and -G / --ignore-tags options.
(Following Links): Changed "the loads of" to "loads of".
(Wgetrc Commands): Added descriptions of new follow_tags and ignore_tags
commands.
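(A hypothetical usage example, with URL as a placeholder: `wget -r -G a,area
URL' -- equivalently `--ignore-tags=a,area' -- recurses but ignores URLs found
in <A> and <AREA> tags, while `wget -r --follow-tags=img,link URL' restricts
link-following to just those tags; the same lists can go in .wgetrc as
`ignore_tags = a,area' and `follow_tags = img,link'.)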
* html.c (idmatch): Implemented checking of my new --follow-tags and
--ignore-tags options.
* init.c (commands): Added a comment reminding people who add new entries
that do allocation to add the corresponding freeing in cleanup().
(commands): Added new followtags and ignoretags commands.
(cleanup): Free storage for new followtags and ignoretags.
* main.c: Use of "comma-separated list" was random -- normalized it. Did some
alphabetization. Added comments pointing out "Options without arguments" and
"Options accepting an argument" sections of long_options[]. Added new options
--follow-tags and -G / --ignore-tags. Added comment that Damir's --referer is
currently undocumented. Added comment that Heiko's --waitretry is partially
undocumented (mentioned in --help but not in wget.texi). Moved improperly
sorted 24, 129, and 'G' cases.
* options.h (struct options): Added new fields follow_tags and ignore_tags.
* wget.h: Added "#define EQ 0" so we can say "strcmp(a, b) == EQ".
together, we compare local file X.orig (if extant) against server file X.
Previously -k and -N were worthless in combination because the local converted
files always differed from the server versions.
is available via anonymous CVS and desirable features are being added, it's
quite possible for end-users to be getting their hands on development versions.
They may report bugs, so if we don't change the version number, we'll have to
continually follow up the statement "I'm using version 1.5.3" with the question
"The FTP archive or the CVS source?" Better to just make this development
version have a unique number. Once we're ready to actually release the next
version, we can up the version from 1.5.3+dev to 1.5.4, or 1.6, or whatever it
turns out to be (depending on how much development gets done).
Also made minor updates (dates, email addresses) to wget.texi.