
Automated merge.

Micah Cowan 2008-12-01 07:05:29 -08:00
commit 1b4ed7dcb7
84 changed files with 1023 additions and 514 deletions


@ -1,3 +1,18 @@
2008-11-10 Micah Cowan <micah@cowan.name>
* MAILING-LIST: Mention Gmane, introduce subsections.
2008-11-05 Micah Cowan <micah@cowan.name>
* MAILING-LIST: Mention moderation for unsubscribed posts, and
archive location.
2008-10-31 Micah Cowan <micah@cowan.name>
* MAILING-LIST: Update information.
* NEWS: Add mention of mailing list move.
2008-08-01 Joao Ferreira <joao@joaoff.com>
* NEWS: Added option --default-page to support alternative


@ -1,33 +1,50 @@
-Mailing List
-================
+Mailing Lists
+=============

-There are several Wget-related mailing lists. The general discussion
-list is at <wget@sunsite.dk>. It is the preferred place for support
-requests and suggestions, as well as for discussion of development.
-You are invited to subscribe.
+Primary List
+------------

-To subscribe, simply send mail to <wget-subscribe@sunsite.dk> and
-follow the instructions. Unsubscribe by mailing to
-<wget-unsubscribe@sunsite.dk>. The mailing list is archived at
-`http://www.mail-archive.com/wget%40sunsite.dk/' and at
-`http://news.gmane.org/gmane.comp.web.wget.general'.
+The primary mailinglist for discussion, bug-reports, or questions about
+GNU Wget is at <bug-wget@gnu.org>. To subscribe, send an email to
+<bug-wget-join@gnu.org>, or visit
+`http://lists.gnu.org/mailman/listinfo/bug-wget'.

-Another mailing list is at <wget-patches@sunsite.dk>, and is used to
-submit patches for review by Wget developers. A "patch" is a textual
-representation of change to source code, readable by both humans and
-programs. The file `PATCHES' that comes with Wget covers the creation
-and submitting of patches in detail. Please don't send general
-suggestions or bug reports to `wget-patches'; use it only for patch
-submissions.
+You do not need to subscribe to send a message to the list; however,
+please note that unsubscribed messages are moderated, and may take a
+while before they hit the list--*usually around a day*. If you want
+your message to show up immediately, please subscribe to the list
+before posting. Archives for the list may be found at
+`http://lists.gnu.org/pipermail/bug-wget/'.

-Subscription is the same as above for <wget@sunsite.dk>, except that
-you send to <wget-patches-subscribe@sunsite.dk>, instead. The mailing
-list is archived at `http://news.gmane.org/gmane.comp.web.wget.patches'.
+An NNTP/Usenettish gateway is also available via Gmane
+(http://gmane.org/about.php). You can see the Gmane archives at
+`http://news.gmane.org/gmane.comp.web.wget.general'. Note that the
+Gmane archives conveniently include messages from both the current
+list, and the previous one. Messages also show up in the Gmane archives
+sooner than they do at `lists.gnu.org'.

-Finally, there is the <wget-notify@addictivecode.org> mailing list.
-This is a non-discussion list that receives commit notifications from
-the source repository, and also bug report-change notifications. This
-is the highest-traffic list for Wget, and is recommended only for
-people who are seriously interested in ongoing Wget development.
-Subscription is through the `mailman' interface at
+Bug Notices List
+----------------
+
+Additionally, there is the <wget-notify@addictivecode.org> mailing
+list. This is a non-discussion list that receives bug report
+notifications from the bug-tracker. To subscribe to this list, send an
+email to <wget-notify-join@addictivecode.org>, or visit
 `http://addictivecode.org/mailman/listinfo/wget-notify'.
+
+Obsolete Lists
+--------------
+
+Previously, the mailing list <wget@sunsite.dk> was used as the main
+discussion list, and another list, <wget-patches@sunsite.dk> was used
+for submitting and discussing patches to GNU Wget.
+
+Messages from <wget@sunsite.dk> are archived at
+`http://www.mail-archive.com/wget%40sunsite.dk/' and at
+`http://news.gmane.org/gmane.comp.web.wget.general' (which also
+continues to archive the current list, <bug-wget@gnu.org>).
+
+Messages from <wget-patches@sunsite.dk> are archived at
+`http://news.gmane.org/gmane.comp.web.wget.patches'.

NEWS

@ -8,6 +8,8 @@ Please send GNU Wget bug reports to <bug-wget@gnu.org>.
* Changes in Wget 1.12 (MAINLINE)

** Mailing list MOVED to bug-wget@gnu.org

** --default-page option added to support alternative default names for
index.html.

@ -27,6 +29,9 @@ support password prompts at the console.

** The --input-file option now also handles retrieving links from
an external file.

** Several previously existing, but undocumented .wgetrc options
are now documented: save_headers, spider, and user_agent.

* Changes in Wget 1.11.4
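The --default-page option noted above lets the implicit file name used for URLs ending in a slash be something other than index.html; a minimal usage sketch (the site and the name default.asp are only illustrative):

    wget --default-page=default.asp http://example.com/

The root document is then saved as default.asp instead of index.html.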


@ -1,3 +1,47 @@
2008-11-15 Steven Schubiger <stsc@members.fsf.org>
* sample.wgetrc: Comment the waitretry "default" value,
because there is a global one now.
* wget.texi (Download Options): Mention the global
default value.
2008-11-10 Micah Cowan <micah@cowan.name>
* Makefile.am (EXTRA_DIST): Removed no-longer-present
README.maint (shouldn't have been there in the first place).
* wget.texi (Mailing Lists): Added information about Gmane portal,
added subsection headings.
Update node pointers.
2008-11-05 Micah Cowan <micah@cowan.name>
* wget.texi: Move --no-http-keep-alive from FTP Options to HTTP
Options.
(Mailing List): Mention moderation for unsubscribed posts, and
archive location.
2008-11-04 Micah Cowan <micah@cowan.name>
* wget.texi, fdl.texi: Updated to FDL version 1.3.
2008-10-31 Micah Cowan <micah@cowan.name>
* wget.texi (Mailing List): Update info to reflect change to
bug-wget@gnu.org.
2008-09-30 Steven Schubiger <stsc@members.fsf.org>
* wget.texi (Wgetrc Commands): Add default_page, save_headers,
spider and user_agent to the list of recognized commands.
2008-09-10 Michael Kessler <kessler.michael@aon.at>
* wget.texi (Robot Exclusion): Fixed typo "downloads" ->
"download"
2008-08-03 Xavier Saint <wget@sxav.eu>
* wget.texi : Add option descriptions for the three new


@ -48,7 +48,8 @@ $(SAMPLERCTEXI): $(srcdir)/sample.wgetrc
info_TEXINFOS = wget.texi
wget_TEXINFOS = fdl.texi sample.wgetrc.munged_for_texi_inclusion
-EXTRA_DIST = README.maint sample.wgetrc $(SAMPLERCTEXI) \
+EXTRA_DIST = sample.wgetrc \
+             $(SAMPLERCTEXI) \
              texi2pod.pl

wget.pod: $(srcdir)/wget.texi $(srcdir)/version.texi


@ -1,130 +0,0 @@
TO RELEASE WGET X.Y.Z:
1) update PO files from the TP
cd po
../util/update_po_files.sh
2) generate tarball
from the trunk:
cd ~/tmp
~/code/svn/wget/trunk/util/dist-wget --force-version X.Y.Z
from a branch:
cd ~/tmp
~/code/svn/wget/branches/X.Y/util/dist-wget --force-version X.Y.Z -b branches/X.Y
3) test the tarball
4) set new version number "X.Y.Z" on the repository
5) tag the sources in subversion
from the trunk:
svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/trunk http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/
from a branch:
svn copy -m "Tagging release X.Y.Z" http://svn.dotsrc.org/repo/wget/branches/X.Y/ http://svn.dotsrc.org/repo/wget/tags/WGET_X_Y_Z/
6) upload the tarball on gnu.org
RELEASE=X.Y.Z
TARBALL=wget-${RELEASE}.tar.gz
gpg --default-key 7B2FD4B0 --detach-sign -b --output ${TARBALL}.sig $TARBALL
echo -e "version: 1.1\ndirectory: wget\nfilename: $TARBALL\ncomment: Wget release ${RELEASE}" > ${TARBALL}.directive
gpg --default-key 7B2FD4B0 --clearsign ${TARBALL}.directive
lftp ftp://ftp-upload.gnu.org/incoming/ftp
(use ftp://ftp-upload.gnu.org/incoming/alpha for pre-releases)
put wget-X.Y.Z.tar.gz
put wget-X.Y.Z.tar.gz.sig
put wget-X.Y.Z.tar.gz.directive.asc
7) update wget.sunsite.dk and gnu.org/software/wget
8) send announcement on wget@sunsite.dk:
hi to everybody,
i have just uploaded the wget X.Y.Z tarball on ftp.gnu.org:
ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz
you can find the GPG signature of the tarball at these URLs:
ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig
and the GPG key i have used for the signature at this URL:
http://www.tortonesi.com/GNU-GPG-Key.txt
the key fingerprint is:
pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer)
<mauro@ferrara.linux.it>
Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0
the MD5 checksum of the tarball is:
MD5 of tarball wget-X.Y.Z.tar.gz
{DESCRIPTION OF THE CHANGES}
9) send announcement on info-gnu@gnu.org
I'm very pleased to announce the availability of GNU Wget X.Y.Z.
GNU Wget is a non-interactive command-line tool for retrieving files using
HTTP, HTTPS and FTP, which may easily be called from scripts, cron jobs,
terminals without X-Windows support, etc.
For more information, please see:
http://www.gnu.org/software/wget
http://wget.sunsite.dk
Here are the compressed sources and the GPG detached signature:
ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz
ftp://ftp.gnu.org/gnu/wget/wget-X.Y.Z.tar.gz.sig
The MD5 checksums of the tarball is:
MD5 of tarball wget-X.Y.Z.tar.gz
The GPG key I have used for the tarball signature is available at this URL:
http://www.tortonesi.com/GNU-GPG-Key.txt
the key fingerprint is:
pub 1024D/7B2FD4B0 2005-06-02 Mauro Tortonesi (GNU Wget Maintainer)
<mauro@ferrara.linux.it>
Key fingerprint = 1E90 AEA8 D511 58F0 94E5 B106 7220 24E9 7B2F D4B0
{DESCRIPTION OF THE CHANGES}
10) post announcement on freshmeat.net
11) set new version number "X.Y.Z+devel" on the repository
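The procedure above publishes a detached GPG signature next to each tarball; as a side note, a minimal sketch of how a downloader can check it (assuming gpg is installed and the maintainer's public key has already been imported):

    gpg --verify wget-X.Y.Z.tar.gz.sig wget-X.Y.Z.tar.gz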


@ -1,13 +1,12 @@
+@c The GNU Free Documentation License.
+@center Version 1.3, 3 November 2008

-@node GNU Free Documentation License
-@appendixsec GNU Free Documentation License
-@cindex FDL, GNU Free Documentation License
+@c This file is intended to be included within another document,
+@c hence no sectioning command or @node.

-@center Version 1.2, November 2002

@display
-Copyright @copyright{} 2000,2001,2002 Free Software Foundation, Inc.
-51 Franklin St, Fifth Floor, Boston, MA 02110-1301, USA
+Copyright @copyright{} 2000, 2001, 2002, 2007, 2008 Free Software Foundation, Inc.
+@uref{http://fsf.org/}

Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
@ -112,6 +111,9 @@ formats which do not have any title page as such, ``Title Page'' means
the text near the most prominent appearance of the work's title,
preceding the beginning of the body of the text.

The ``publisher'' means any person or entity that distributes copies
of the Document to the public.

A section ``Entitled XYZ'' means a named subunit of the Document whose
title either is precisely XYZ or contains XYZ in parentheses following
text that translates XYZ in another language. (Here XYZ stands for a
@ -380,13 +382,30 @@ title.
@item
TERMINATION

-You may not copy, modify, sublicense, or distribute the Document except
-as expressly provided for under this License. Any other attempt to
-copy, modify, sublicense or distribute the Document is void, and will
-automatically terminate your rights under this License. However,
-parties who have received copies, or rights, from you under this
-License will not have their licenses terminated so long as such
-parties remain in full compliance.
+You may not copy, modify, sublicense, or distribute the Document
+except as expressly provided under this License. Any attempt
+otherwise to copy, modify, sublicense, or distribute it is void, and
+will automatically terminate your rights under this License.
+
+However, if you cease all violation of this License, then your license
+from a particular copyright holder is reinstated (a) provisionally,
+unless and until the copyright holder explicitly and finally
+terminates your license, and (b) permanently, if the copyright holder
+fails to notify you of the violation by some reasonable means prior to
+60 days after the cessation.
+
+Moreover, your license from a particular copyright holder is
+reinstated permanently if the copyright holder notifies you of the
+violation by some reasonable means, this is the first time you have
+received notice of violation of this License (for any work) from that
+copyright holder, and you cure the violation prior to 30 days after
+your receipt of the notice.
+
+Termination of your rights under this section does not terminate the
+licenses of parties who have received copies or rights from you under
+this License. If your rights have been terminated and not permanently
+reinstated, receipt of a copy of some or all of the same material does
+not give you any rights to use it.

@item
FUTURE REVISIONS OF THIS LICENSE
@ -404,7 +423,42 @@ following the terms and conditions either of that specified version or
of any later version that has been published (not as a draft) by the
Free Software Foundation. If the Document does not specify a version
number of this License, you may choose any version ever published (not
-as a draft) by the Free Software Foundation.
+as a draft) by the Free Software Foundation. If the Document
+specifies that a proxy can decide which future versions of this
+License can be used, that proxy's public statement of acceptance of a
+version permanently authorizes you to choose that version for the
+Document.
+
+@item
+RELICENSING
+
+``Massive Multiauthor Collaboration Site'' (or ``MMC Site'') means any
+World Wide Web server that publishes copyrightable works and also
+provides prominent facilities for anybody to edit those works. A
+public wiki that anybody can edit is an example of such a server. A
+``Massive Multiauthor Collaboration'' (or ``MMC'') contained in the
+site means any set of copyrightable works thus published on the MMC
+site.
+
+``CC-BY-SA'' means the Creative Commons Attribution-Share Alike 3.0
+license published by Creative Commons Corporation, a not-for-profit
+corporation with a principal place of business in San Francisco,
+California, as well as future copyleft versions of that license
+published by that same organization.
+
+``Incorporate'' means to publish or republish a Document, in whole or
+in part, as part of another Document.
+
+An MMC is ``eligible for relicensing'' if it is licensed under this
+License, and if all works that were first published under this License
+somewhere other than this MMC, and subsequently incorporated in whole
+or in part into the MMC, (1) had no cover texts or invariant sections,
+and (2) were thus incorporated prior to November 1, 2008.
+
+The operator of an MMC Site may republish an MMC contained in the site
+under CC-BY-SA on the same site at any time before August 1, 2009,
+provided the MMC is eligible for relicensing.

@end enumerate

@page
@ -418,7 +472,7 @@ license notices just after the title page:
@group
Copyright (C) @var{year} @var{your name}.
Permission is granted to copy, distribute and/or modify this document
-under the terms of the GNU Free Documentation License, Version 1.2
+under the terms of the GNU Free Documentation License, Version 1.3
or any later version published by the Free Software Foundation;
with no Invariant Sections, no Front-Cover Texts, and no Back-Cover
Texts. A copy of the license is included in the section entitled ``GNU

@ -427,7 +481,7 @@ license notices just after the title page:
@end smallexample

If you have Invariant Sections, Front-Cover Texts and Back-Cover Texts,
-replace the ``with...Texts.'' line with this:
+replace the ``with@dots{}Texts.'' line with this:

@smallexample
@group


@ -49,7 +49,7 @@
# downloads, set waitretry to maximum number of seconds to wait (Wget
# will use "linear backoff", waiting 1 second after the first failure
# on a file, 2 seconds after the second failure, etc. up to this max).
-waitretry = 10
+#waitretry = 10

##
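The sample setting is commented out above because the same value is now compiled in as a built-in default (see the init.c hunk later in this commit). A user who wants a different value can still set it in a personal ~/.wgetrc; a minimal sketch (the value 30 is only an example):

    # wait up to 30 seconds between successive retries of a failed download
    waitretry = 30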


@ -82,7 +82,7 @@ Info entry for @file{wget}.
@contents @contents
@ifnottex @ifnottex
@node Top @node Top, Overview, (dir), (dir)
@top Wget @value{VERSION} @top Wget @value{VERSION}
@insertcopying @insertcopying
@ -102,7 +102,7 @@ Info entry for @file{wget}.
* Concept Index:: Topics covered by this manual. * Concept Index:: Topics covered by this manual.
@end menu @end menu
@node Overview @node Overview, Invoking, Top, Top
@chapter Overview @chapter Overview
@cindex overview @cindex overview
@cindex features @cindex features
@ -211,7 +211,7 @@ Public License, as published by the Free Software Foundation (see the
file @file{COPYING} that came with GNU Wget, for details). file @file{COPYING} that came with GNU Wget, for details).
@end itemize @end itemize
@node Invoking @node Invoking, Recursive Download, Overview, Top
@chapter Invoking @chapter Invoking
@cindex invoking @cindex invoking
@cindex command line @cindex command line
@ -248,7 +248,7 @@ the command line.
* Recursive Accept/Reject Options:: * Recursive Accept/Reject Options::
@end menu @end menu
@node URL Format @node URL Format, Option Syntax, Invoking, Invoking
@section URL Format @section URL Format
@cindex URL @cindex URL
@cindex URL syntax @cindex URL syntax
@ -326,7 +326,7 @@ with your favorite browser, like @code{Lynx} or @code{Netscape}.
@c man begin OPTIONS @c man begin OPTIONS
@node Option Syntax @node Option Syntax, Basic Startup Options, URL Format, Invoking
@section Option Syntax @section Option Syntax
@cindex option syntax @cindex option syntax
@cindex syntax of options @cindex syntax of options
@ -401,7 +401,7 @@ the default. For instance, using @code{follow_ftp = off} in
using @samp{--no-follow-ftp} is the only way to restore the factory using @samp{--no-follow-ftp} is the only way to restore the factory
default from the command line. default from the command line.
@node Basic Startup Options @node Basic Startup Options, Logging and Input File Options, Option Syntax, Invoking
@section Basic Startup Options @section Basic Startup Options
@table @samp @table @samp
@ -429,7 +429,7 @@ instances of @samp{-e}.
@end table @end table
@node Logging and Input File Options @node Logging and Input File Options, Download Options, Basic Startup Options, Invoking
@section Logging and Input File Options @section Logging and Input File Options
@table @samp @table @samp
@ -517,7 +517,7 @@ Prepends @var{URL} to relative links read from the file specified with
the @samp{-i} option. the @samp{-i} option.
@end table @end table
@node Download Options @node Download Options, Directory Options, Logging and Input File Options, Invoking
@section Download Options @section Download Options
@table @samp @table @samp
@ -863,8 +863,7 @@ file, up to the maximum number of @var{seconds} you specify. Therefore,
a value of 10 will actually make Wget wait up to (1 + 2 + ... + 10) = 55
seconds per file.

-Note that this option is turned on by default in the global
-@file{wgetrc} file.
+By default, Wget will assume a value of 10 seconds.

@cindex wait, random
@cindex random wait
@ -1038,7 +1037,7 @@ Prompt for a password for each connection established. Cannot be specified
when @samp{--password} is being used, because they are mutually exclusive. when @samp{--password} is being used, because they are mutually exclusive.
@end table @end table
@node Directory Options @node Directory Options, HTTP Options, Download Options, Invoking
@section Directory Options @section Directory Options
@table @samp @table @samp
@ -1110,7 +1109,7 @@ i.e. the top of the retrieval tree. The default is @samp{.} (the
current directory). current directory).
@end table @end table
@node HTTP Options @node HTTP Options, HTTPS (SSL/TLS) Options, Directory Options, Invoking
@section HTTP Options @section HTTP Options
@table @samp @table @samp
@ -1170,6 +1169,19 @@ For more information about security issues with Wget, @xref{Security
Considerations}.
@end iftex

@cindex Keep-Alive, turning off
@cindex Persistent Connections, disabling
@item --no-http-keep-alive
Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget
asks the server to keep the connection open so that, when you download
more than one document from the same server, they get transferred over
the same TCP connection. This saves time and at the same time reduces
the load on the server.

This option is useful when, for some reason, persistent (keep-alive)
connections don't work for you, for example due to a server bug or due
to the inability of server-side scripts to cope with the connections.

@cindex proxy
@cindex cache
@item --no-cache
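The @samp{--no-http-keep-alive} option moved into this section can be exercised straight from the command line; a minimal usage sketch (the URL is only illustrative):

    wget --no-http-keep-alive -r http://example.com/docs/

Each document is then fetched over its own TCP connection instead of reusing one persistent connection.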
@ -1444,7 +1456,7 @@ form-based authentication.
@end table @end table
@node HTTPS (SSL/TLS) Options @node HTTPS (SSL/TLS) Options, FTP Options, HTTP Options, Invoking
@section HTTPS (SSL/TLS) Options @section HTTPS (SSL/TLS) Options
@cindex SSL @cindex SSL
@ -1569,7 +1581,7 @@ not used), EGD is never contacted. EGD is not needed on modern Unix
systems that support @file{/dev/random}. systems that support @file{/dev/random}.
@end table @end table
@node FTP Options @node FTP Options, Recursive Retrieval Options, HTTPS (SSL/TLS) Options, Invoking
@section FTP Options @section FTP Options
@table @samp @table @samp
@ -1672,22 +1684,9 @@ Note that when retrieving a file (not a directory) because it was
specified on the command-line, rather than because it was recursed to,
this option has no effect. Symbolic links are always traversed in this
case.

-@cindex Keep-Alive, turning off
-@cindex Persistent Connections, disabling
-@item --no-http-keep-alive
-Turn off the ``keep-alive'' feature for HTTP downloads. Normally, Wget
-asks the server to keep the connection open so that, when you download
-more than one document from the same server, they get transferred over
-the same TCP connection. This saves time and at the same time reduces
-the load on the server.
-
-This option is useful when, for some reason, persistent (keep-alive)
-connections don't work for you, for example due to a server bug or due
-to the inability of server-side scripts to cope with the connections.

@end table
@node Recursive Retrieval Options @node Recursive Retrieval Options, Recursive Accept/Reject Options, FTP Options, Invoking
@section Recursive Retrieval Options @section Recursive Retrieval Options
@table @samp @table @samp
@ -1892,7 +1891,7 @@ If, for whatever reason, you want strict comment parsing, use this
option to turn it on. option to turn it on.
@end table @end table
@node Recursive Accept/Reject Options @node Recursive Accept/Reject Options, , Recursive Retrieval Options, Invoking
@section Recursive Accept/Reject Options @section Recursive Accept/Reject Options
@table @samp @table @samp
@ -1987,7 +1986,7 @@ This is a useful option, since it guarantees that only the files
@c man end @c man end
@node Recursive Download @node Recursive Download, Following Links, Invoking, Top
@chapter Recursive Download @chapter Recursive Download
@cindex recursion @cindex recursion
@cindex retrieving @cindex retrieving
@ -2055,7 +2054,7 @@ about this.
Recursive retrieval should be used with care. Don't say you were not Recursive retrieval should be used with care. Don't say you were not
warned. warned.
@node Following Links @node Following Links, Time-Stamping, Recursive Download, Top
@chapter Following Links @chapter Following Links
@cindex links @cindex links
@cindex following links @cindex following links
@ -2079,7 +2078,7 @@ links it will follow.
* FTP Links:: Following FTP links. * FTP Links:: Following FTP links.
@end menu @end menu
@node Spanning Hosts @node Spanning Hosts, Types of Files, Following Links, Following Links
@section Spanning Hosts @section Spanning Hosts
@cindex spanning hosts @cindex spanning hosts
@cindex hosts, spanning @cindex hosts, spanning
@ -2136,7 +2135,7 @@ wget -rH -Dfoo.edu --exclude-domains sunsite.foo.edu \
@end table @end table
@node Types of Files @node Types of Files, Directory-Based Limits, Spanning Hosts, Following Links
@section Types of Files @section Types of Files
@cindex types of files @cindex types of files
@ -2241,7 +2240,7 @@ local filenames, and so @emph{do} contribute to filename matching.
This behavior, too, is considered less-than-desirable, and may change This behavior, too, is considered less-than-desirable, and may change
in a future version of Wget. in a future version of Wget.
@node Directory-Based Limits @node Directory-Based Limits, Relative Links, Types of Files, Following Links
@section Directory-Based Limits @section Directory-Based Limits
@cindex directories @cindex directories
@cindex directory limits @cindex directory limits
@ -2325,7 +2324,7 @@ directory, while in @samp{http://foo/bar} (no trailing slash),
meaningless, as its parent is @samp{/}). meaningless, as its parent is @samp{/}).
@end table @end table
@node Relative Links @node Relative Links, FTP Links, Directory-Based Limits, Following Links
@section Relative Links @section Relative Links
@cindex relative links @cindex relative links
@ -2354,7 +2353,7 @@ to ``just work'' without having to convert links.
This option is probably not very useful and might be removed in a future This option is probably not very useful and might be removed in a future
release. release.
@node FTP Links @node FTP Links, , Relative Links, Following Links
@section Following FTP Links @section Following FTP Links
@cindex following ftp links @cindex following ftp links
@ -2374,7 +2373,7 @@ effect on such downloads. On the other hand, domain acceptance
Also note that followed links to @sc{ftp} directories will not be Also note that followed links to @sc{ftp} directories will not be
retrieved recursively further. retrieved recursively further.
@node Time-Stamping @node Time-Stamping, Startup File, Following Links, Top
@chapter Time-Stamping @chapter Time-Stamping
@cindex time-stamping @cindex time-stamping
@cindex timestamping @cindex timestamping
@ -2424,7 +2423,7 @@ say.
* FTP Time-Stamping Internals:: * FTP Time-Stamping Internals::
@end menu @end menu
@node Time-Stamping Usage @node Time-Stamping Usage, HTTP Time-Stamping Internals, Time-Stamping, Time-Stamping
@section Time-Stamping Usage @section Time-Stamping Usage
@cindex time-stamping usage @cindex time-stamping usage
@cindex usage, time-stamping @cindex usage, time-stamping
@ -2480,7 +2479,7 @@ gives a timestamp. For @sc{http}, this depends on getting a
directory listing with dates in a format that Wget can parse directory listing with dates in a format that Wget can parse
(@pxref{FTP Time-Stamping Internals}). (@pxref{FTP Time-Stamping Internals}).
@node HTTP Time-Stamping Internals @node HTTP Time-Stamping Internals, FTP Time-Stamping Internals, Time-Stamping Usage, Time-Stamping
@section HTTP Time-Stamping Internals @section HTTP Time-Stamping Internals
@cindex http time-stamping @cindex http time-stamping
@ -2512,7 +2511,7 @@ with @samp{-N}, server file @samp{@var{X}} is compared to local file
Arguably, @sc{http} time-stamping should be implemented using the Arguably, @sc{http} time-stamping should be implemented using the
@code{If-Modified-Since} request. @code{If-Modified-Since} request.
@node FTP Time-Stamping Internals @node FTP Time-Stamping Internals, , HTTP Time-Stamping Internals, Time-Stamping
@section FTP Time-Stamping Internals @section FTP Time-Stamping Internals
@cindex ftp time-stamping @cindex ftp time-stamping
@ -2541,7 +2540,7 @@ that is supported by some @sc{ftp} servers (including the popular
@code{wu-ftpd}), which returns the exact time of the specified file. @code{wu-ftpd}), which returns the exact time of the specified file.
Wget may support this command in the future. Wget may support this command in the future.
@node Startup File @node Startup File, Examples, Time-Stamping, Top
@chapter Startup File @chapter Startup File
@cindex startup file @cindex startup file
@cindex wgetrc @cindex wgetrc
@ -2569,7 +2568,7 @@ commands.
* Sample Wgetrc:: A wgetrc example. * Sample Wgetrc:: A wgetrc example.
@end menu @end menu
@node Wgetrc Location @node Wgetrc Location, Wgetrc Syntax, Startup File, Startup File
@section Wgetrc Location @section Wgetrc Location
@cindex wgetrc location @cindex wgetrc location
@cindex location of wgetrc @cindex location of wgetrc
@ -2590,7 +2589,7 @@ means that in case of collision user's wgetrc @emph{overrides} the
system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default). system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default).
Fascist admins, away! Fascist admins, away!
@node Wgetrc Syntax @node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File
@section Wgetrc Syntax @section Wgetrc Syntax
@cindex wgetrc syntax @cindex wgetrc syntax
@cindex syntax of wgetrc @cindex syntax of wgetrc
@ -2617,7 +2616,7 @@ global @file{wgetrc}, you can do it with:
reject = reject =
@end example @end example
@node Wgetrc Commands @node Wgetrc Commands, Sample Wgetrc, Wgetrc Syntax, Startup File
@section Wgetrc Commands @section Wgetrc Commands
@cindex wgetrc commands @cindex wgetrc commands
@ -2710,6 +2709,9 @@ Ignore @var{n} remote directory components. Equivalent to
@item debug = on/off
Debug mode, same as @samp{-d}.

@item default_page = @var{string}
Default page name---the same as @samp{--default-page=@var{string}}.

@item delete_after = on/off
Delete after download---the same as @samp{--delete-after}.
@ -3002,6 +3004,9 @@ this off.
Save cookies to @var{file}. The same as @samp{--save-cookies
@var{file}}.

@item save_headers = on/off
Same as @samp{--save-headers}.

@item secure_protocol = @var{string}
Choose the secure protocol to be used. Legal values are @samp{auto}
(the default), @samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. The same
@ -3014,6 +3019,9 @@ responses---the same as @samp{-S}.
@item span_hosts = on/off
Same as @samp{-H}.

@item spider = on/off
Same as @samp{--spider}.

@item strict_comments = on/off
Same as @samp{--strict-comments}.
@ -3037,6 +3045,10 @@ Specify username @var{string} for both @sc{ftp} and @sc{http} file retrieval.
This command can be overridden using the @samp{ftp_user} and
@samp{http_user} command for @sc{ftp} and @sc{http} respectively.

@item user_agent = @var{string}
User agent identification sent to the HTTP Server---the same as
@samp{--user-agent=@var{string}}.

@item verbose = on/off
Turn verbose on/off---the same as @samp{-v}/@samp{-nv}.
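All of the newly documented commands above can be combined in a startup file; a minimal illustrative ~/.wgetrc sketch (the values are only examples):

    default_page = default.html
    save_headers = off
    spider = off
    user_agent = Mozilla/5.0 (compatible; example-mirror-bot)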
@ -3050,7 +3062,7 @@ only---the same as @samp{--waitretry=@var{n}}. Note that this is
turned on by default in the global @file{wgetrc}. turned on by default in the global @file{wgetrc}.
@end table @end table
@node Sample Wgetrc @node Sample Wgetrc, , Wgetrc Commands, Startup File
@section Sample Wgetrc @section Sample Wgetrc
@cindex sample wgetrc @cindex sample wgetrc
@ -3067,7 +3079,7 @@ its line.
@include sample.wgetrc.munged_for_texi_inclusion @include sample.wgetrc.munged_for_texi_inclusion
@end example @end example
@node Examples @node Examples, Various, Startup File, Top
@chapter Examples @chapter Examples
@cindex examples @cindex examples
@ -3081,7 +3093,7 @@ complexity.
* Very Advanced Usage:: The hairy stuff. * Very Advanced Usage:: The hairy stuff.
@end menu @end menu
@node Simple Usage @node Simple Usage, Advanced Usage, Examples, Examples
@section Simple Usage @section Simple Usage
@itemize @bullet @itemize @bullet
@ -3134,7 +3146,7 @@ links index.html
@end example @end example
@end itemize @end itemize
@node Advanced Usage @node Advanced Usage, Very Advanced Usage, Simple Usage, Examples
@section Advanced Usage @section Advanced Usage
@itemize @bullet @itemize @bullet
@ -3270,7 +3282,7 @@ wget -O - http://cool.list.com/ | wget --force-html -i -
@end example @end example
@end itemize @end itemize
@node Very Advanced Usage @node Very Advanced Usage, , Advanced Usage, Examples
@section Very Advanced Usage @section Very Advanced Usage
@cindex mirroring @cindex mirroring
@ -3319,7 +3331,7 @@ wget -m -k -K -E http://www.gnu.org/ -o /home/me/weeklog
@end itemize @end itemize
@c man end @c man end
@node Various @node Various, Appendices, Examples, Top
@chapter Various @chapter Various
@cindex various @cindex various
@ -3329,14 +3341,14 @@ This chapter contains all the stuff that could not fit anywhere else.
* Proxies:: Support for proxy servers.
* Distribution:: Getting the latest version.
* Web Site:: GNU Wget's presence on the World Wide Web.
-* Mailing List:: Wget mailing list for announcements and discussion.
+* Mailing Lists:: Wget mailing list for announcements and discussion.
* Internet Relay Chat:: Wget's presence on IRC.
* Reporting Bugs:: How and where to report bugs.
* Portability:: The systems Wget works on.
* Signals:: Signal-handling performed by Wget.

@end menu
@node Proxies @node Proxies, Distribution, Various, Various
@section Proxies @section Proxies
@cindex proxies @cindex proxies
@ -3412,7 +3424,7 @@ Alternatively, you may use the @samp{proxy-user} and
settings @code{proxy_user} and @code{proxy_password} to set the proxy settings @code{proxy_user} and @code{proxy_password} to set the proxy
username and password. username and password.
@node Distribution @node Distribution, Web Site, Proxies, Various
@section Distribution @section Distribution
@cindex latest version @cindex latest version
@ -3421,7 +3433,7 @@ master GNU archive site ftp.gnu.org, and its mirrors. For example,
Wget @value{VERSION} can be found at Wget @value{VERSION} can be found at
@url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz} @url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz}
@node Web Site @node Web Site, Mailing Lists, Distribution, Various
@section Web Site @section Web Site
@cindex web site @cindex web site
@ -3430,43 +3442,64 @@ The official web site for GNU Wget is at
information resides at ``The Wget Wgiki'',
@url{http://wget.addictivecode.org/}.

-@node Mailing List
-@section Mailing List
+@node Mailing Lists, Internet Relay Chat, Web Site, Various
+@section Mailing Lists
@cindex mailing list
@cindex list

-There are several Wget-related mailing lists. The general discussion
-list is at @email{wget@@sunsite.dk}. It is the preferred place for
-support requests and suggestions, as well as for discussion of
-development. You are invited to subscribe.
+@unnumberedsubsec Primary List
+
+The primary mailinglist for discussion, bug-reports, or questions
+about GNU Wget is at @email{bug-wget@@gnu.org}. To subscribe, send an
+email to @email{bug-wget-join@@gnu.org}, or visit
+@url{http://lists.gnu.org/mailman/listinfo/bug-wget}.

-To subscribe, simply send mail to @email{wget-subscribe@@sunsite.dk}
-and follow the instructions. Unsubscribe by mailing to
-@email{wget-unsubscribe@@sunsite.dk}. The mailing list is archived at
-@url{http://www.mail-archive.com/wget%40sunsite.dk/} and at
-@url{http://news.gmane.org/gmane.comp.web.wget.general}.
+You do not need to subscribe to send a message to the list; however,
+please note that unsubscribed messages are moderated, and may take a
+while before they hit the list---@strong{usually around a day}. If
+you want your message to show up immediately, please subscribe to the
+list before posting. Archives for the list may be found at
+@url{http://lists.gnu.org/pipermail/bug-wget/}.
+
+An NNTP/Usenettish gateway is also available via
+@uref{http://gmane.org/about.php,Gmane}. You can see the Gmane
+archives at
+@url{http://news.gmane.org/gmane.comp.web.wget.general}. Note that the
+Gmane archives conveniently include messages from both the current
+list, and the previous one. Messages also show up in the Gmane
+archives sooner than they do at @url{lists.gnu.org}.
+
+@unnumberedsubsec Bug Notices List
+
+Additionally, there is the @email{wget-notify@@addictivecode.org} mailing
+list. This is a non-discussion list that receives bug report
+notifications from the bug-tracker. To subscribe to this list,
+send an email to @email{wget-notify-join@@addictivecode.org},
+or visit @url{http://addictivecode.org/mailman/listinfo/wget-notify}.
+
+@unnumberedsubsec Obsolete Lists
+
+Previously, the mailing list @email{wget@@sunsite.dk} was used as the
+main discussion list, and another list,
+@email{wget-patches@@sunsite.dk} was used for submitting and
+discussing patches to GNU Wget.
+
+Messages from @email{wget@@sunsite.dk} are archived at
+@itemize @tie{}
+@item
 @url{http://www.mail-archive.com/wget%40sunsite.dk/} and at
+@item
+@url{http://news.gmane.org/gmane.comp.web.wget.general} (which also
+continues to archive the current list, @email{bug-wget@@gnu.org}).
+@end itemize

-Another mailing list is at @email{wget-patches@@sunsite.dk}, and is
-used to submit patches for review by Wget developers. A ``patch'' is
-a textual representation of change to source code, readable by both
-humans and programs. The
-@url{http://wget.addictivecode.org/PatchGuidelines} page
-covers the creation and submitting of patches in detail. Please don't
-send general suggestions or bug reports to @samp{wget-patches}; use it
-only for patch submissions.
-
-Subscription is the same as above for @email{wget@@sunsite.dk}, except
-that you send to @email{wget-patches-subscribe@@sunsite.dk}, instead.
-The mailing list is archived at
+Messages from @email{wget-patches@@sunsite.dk} are archived at
+@itemize @tie{}
+@item
 @url{http://news.gmane.org/gmane.comp.web.wget.patches}.
+@end itemize

-Finally, there is the @email{wget-notify@@addictivecode.org} mailing
-list. This is a non-discussion list that receives bug report-change
-notifications from the bug-tracker. Unlike for the other mailing lists,
-subscription is through the @code{mailman} interface at
-@url{http://addictivecode.org/mailman/listinfo/wget-notify}.
-
-@node Internet Relay Chat
+@node Internet Relay Chat, Reporting Bugs, Mailing Lists, Various
@section Internet Relay Chat
@cindex Internet Relay Chat @cindex Internet Relay Chat
@cindex IRC @cindex IRC
@ -3475,7 +3508,7 @@ subscription is through the @code{mailman} interface at
In addition to the mailinglists, we also have a support channel set up In addition to the mailinglists, we also have a support channel set up
via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out! via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out!
@node Reporting Bugs @node Reporting Bugs, Portability, Internet Relay Chat, Various
@section Reporting Bugs @section Reporting Bugs
@cindex bugs @cindex bugs
@cindex reporting bugs @cindex reporting bugs
@ -3495,7 +3528,7 @@ Wget crashes, it's a bug. If Wget does not behave as documented,
it's a bug. If things work strange, but you are not sure about the way
they are supposed to work, it might well be a bug, but you might want to
double-check the documentation and the mailing lists (@pxref{Mailing
-List}).
+Lists}).
@item @item
Try to repeat the bug in as simple circumstances as possible. E.g. if Try to repeat the bug in as simple circumstances as possible. E.g. if
@ -3534,7 +3567,7 @@ safe to try.
@end enumerate @end enumerate
@c man end @c man end
@node Portability @node Portability, Signals, Reporting Bugs, Various
@section Portability @section Portability
@cindex portability @cindex portability
@cindex operating systems @cindex operating systems
@ -3567,7 +3600,7 @@ Support for building on MS-DOS via DJGPP has been contributed by Gisle
Vanem; a port to VMS is maintained by Steven Schweda, and is available Vanem; a port to VMS is maintained by Steven Schweda, and is available
at @url{http://antinode.org/}. at @url{http://antinode.org/}.
@node Signals @node Signals, , Portability, Various
@section Signals @section Signals
@cindex signal handling @cindex signal handling
@cindex hangup @cindex hangup
@ -3588,7 +3621,7 @@ SIGHUP received, redirecting output to `wget-log'.
Other than that, Wget will not try to interfere with signals in any way. Other than that, Wget will not try to interfere with signals in any way.
@kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike. @kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike.
@node Appendices @node Appendices, Copying this manual, Various, Top
@chapter Appendices @chapter Appendices
This chapter contains some references I consider useful. This chapter contains some references I consider useful.
@ -3599,7 +3632,7 @@ This chapter contains some references I consider useful.
* Contributors:: People who helped. * Contributors:: People who helped.
@end menu @end menu
@node Robot Exclusion @node Robot Exclusion, Security Considerations, Appendices, Appendices
@section Robot Exclusion @section Robot Exclusion
@cindex robot exclusion @cindex robot exclusion
@cindex robots.txt @cindex robots.txt
@ -3638,7 +3671,7 @@ avoid. To be found by the robots, the specifications must be placed in
download and parse.

Although Wget is not a web robot in the strictest sense of the word, it
-can downloads large parts of the site without the user's intervention to
+can download large parts of the site without the user's intervention to
download an individual page. Because of that, Wget honors RES when
downloading recursively. For instance, when you issue:
@ -3682,7 +3715,7 @@ robot exclusion, set the @code{robots} variable to @samp{off} in your
@file{.wgetrc}. You can achieve the same effect from the command line @file{.wgetrc}. You can achieve the same effect from the command line
using the @code{-e} switch, e.g. @samp{wget -e robots=off @var{url}...}. using the @code{-e} switch, e.g. @samp{wget -e robots=off @var{url}...}.
@node Security Considerations @node Security Considerations, Contributors, Robot Exclusion, Appendices
@section Security Considerations @section Security Considerations
@cindex security @cindex security
@ -3713,7 +3746,7 @@ being careful when you send debug logs (yes, even when you send them to
me). me).
@end enumerate @end enumerate
@node Contributors @node Contributors, , Security Considerations, Appendices
@section Contributors @section Contributors
@cindex contributors @cindex contributors
@ -4058,17 +4091,21 @@ Kristijan Zimmer.
Apologies to all who I accidentally left out, and many thanks to all the
subscribers of the Wget mailing list.

-@node Copying this manual
+@node Copying this manual, Concept Index, Appendices, Top
@appendix Copying this manual

@menu
* GNU Free Documentation License:: Licnse for copying this manual.
@end menu

+@node GNU Free Documentation License, , Copying this manual, Copying this manual
+@appendixsec GNU Free Documentation License
+@cindex FDL, GNU Free Documentation License
+
@include fdl.texi

-@node Concept Index
+@node Concept Index, , Copying this manual, Top
@unnumbered Concept Index
@printindex cp


@ -1,3 +1,91 @@
2008-11-13 Micah Cowan <micah@cowan.name>
* http.c (gethttp): Don't do anything when content-length >= our
requested range.
2008-11-16 Steven Schubiger <stsc@members.fsf.org>
* main.c: Declare and initialize the numurls counter.
* ftp.c, http.c: Make the counter visible here and use it.
* options.h: Remove old declaration from options struct.
2008-11-15 Steven Schubiger <stsc@members.fsf.org>
* init.c (defaults): Set default waitretry value.
2008-11-14 Steven Schubiger <stsc@members.fsf.org>
* main.c (format_and_print_line): Use a custom format
string for printing leading spaces.
2008-11-12 Micah Cowan <micah@cowan.name>
* ftp-ls.c (ftp_index): HTML-escape dir name in title, h1, a:href.
2008-11-12 Alexander Belopolsky <alexander.belopolsky@gmail.com>
* url.c, url.h (url_escape_unsafe_and_reserved): Added.
* ftp-ls.c (ftp_index): URL-escape, rather than HTML-escape, the
filename appearing in the link.
2008-11-12 Steven Schubiger <stsc@members.fsf.org>
* main.c (print_version): Hand the relevant
xstrdup/xfree calls back to format_and_print_line().
2008-11-11 Steven Schubiger <stsc@members.fsf.org>
* main.c (format_and_print_line): Move both the memory
allocating and freeing bits upwards to print_version().
2008-11-10 Saint Xavier <wget@sxav.eu>
* http.c: Make --auth-no-challenge works with user:pass@ in URLs.
2008-11-05 Micah Cowan <micah@cowan.name>
* ftp.c (print_length): Should print humanized "size remaining"
only when it's at least 1k.
2008-10-31 Micah Cowan <micah@cowan.name>
* main.c (print_version): Add information about the mailing list.
2008-10-31 Alexander Drozdov <dzal_mail@mtu-net.ru>
* retr.c (fd_read_hunk): Make assert deal with maxsize == 0.
* ftp-ls.c (clean_line): Prevent underflow on empty lines.
2008-10-26 Gisle Vanem <gvanem@broadpark.no>
* main.c (format_and_print_line): Put variables on top of
blocks (not all compilers are C99). Add an extra '\n' if
SYSTEM_WGETRC isn't defined and printed.
2008-09-09 Gisle Vanem <gvanem@broadpark.no>
* url.c (url_error): Use aprintf, not asprintf.
2008-09-09 Micah Cowan <micah@cowan.name>
* init.c (home_dir): Save the calculated value for home,
to avoid duplicated work on repeated calls.
(wgetrc_file_name) [WINDOWS]: Define and initialize home var.
* build_info.c, main.c: Remove unnecessary extern vars
system_wgetrc and locale_dir.
* main.c: Define program_name for lib/error.c.
2008-09-02 Gisle Vanem <gvanem@broadpark.no>
* mswindows.h: Must ensure <stdio.h> is included before
we redefine ?vsnprintf().
2008-08-08 Steven Schubiger <stsc@members.fsf.org>
* main.c, utils.h: Removed some dead conditional DEBUG_MALLOC code.


@ -33,9 +33,6 @@ as that of the covered work. */
#include "wget.h"
#include <stdio.h>

-char *system_wgetrc = SYSTEM_WGETRC;
-char *locale_dir = LOCALEDIR;

const char* (compiled_features[]) =
{


@ -75,6 +75,7 @@ clean_line(char *line)
  if (!len) return 0;
  if (line[len - 1] == '\n')
    line[--len] = '\0';
+ if (!len) return 0;
  if (line[len - 1] == '\r')
    line[--len] = '\0';
  for ( ; *line ; line++ ) if (*line == '\t') *line = ' ';
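The extra length check added above guards against reading one byte before the buffer when a listing line is empty; a short illustrative C fragment of the failure it prevents (not taken from the commit, just a sketch of the scenario):

    char line[] = "\n";          /* an empty line as read from an FTP listing */
    size_t len = strlen (line);  /* len == 1 */
    if (line[len - 1] == '\n')
      line[--len] = '\0';        /* len is now 0 */
    /* Without the added "if (!len) return 0;", the following test would
       evaluate line[len - 1], i.e. line[-1] -- a read before the buffer. */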
@ -849,7 +850,9 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
{
  FILE *fp;
  char *upwd;
+ char *htcldir;                /* HTML-clean dir name */
  char *htclfile;               /* HTML-clean file name */
+ char *urlclfile;              /* URL-clean file name */

  if (!output_stream)
    {

@ -877,12 +880,16 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
    }
  else
    upwd = xstrdup ("");

+ htcldir = html_quote_string (u->dir);
+
  fprintf (fp, "<!DOCTYPE HTML PUBLIC \"-//IETF//DTD HTML 2.0//EN\">\n");
  fprintf (fp, "<html>\n<head>\n<title>");
- fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port);
+ fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port);
  fprintf (fp, "</title>\n</head>\n<body>\n<h1>");
- fprintf (fp, _("Index of /%s on %s:%d"), u->dir, u->host, u->port);
+ fprintf (fp, _("Index of /%s on %s:%d"), htcldir, u->host, u->port);
  fprintf (fp, "</h1>\n<hr>\n<pre>\n");
  while (f)
    {
      fprintf (fp, " ");

@ -922,13 +929,18 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
          break;
        }
      htclfile = html_quote_string (f->name);
+     urlclfile = url_escape_unsafe_and_reserved (f->name);
+
      fprintf (fp, "<a href=\"ftp://%s%s:%d", upwd, u->host, u->port);
      if (*u->dir != '/')
        putc ('/', fp);
-     fprintf (fp, "%s", u->dir);
+     /* XXX: Should probably URL-escape dir components here, rather
+      * than just HTML-escape, for consistency with the next bit where
+      * we use urlclfile for the file component. Anyway, this is safer
+      * than what we had... */
+     fprintf (fp, "%s", htcldir);
      if (*u->dir)
        putc ('/', fp);
-     fprintf (fp, "%s", htclfile);
+     fprintf (fp, "%s", urlclfile);
      if (f->type == FT_DIRECTORY)
        putc ('/', fp);
      fprintf (fp, "\">%s", htclfile);

@ -941,9 +953,11 @@ ftp_index (const char *file, struct url *u, struct fileinfo *f)
        fprintf (fp, "-> %s", f->linkto ? f->linkto : "(nil)");
      putc ('\n', fp);
      xfree (htclfile);
+     xfree (urlclfile);
      f = f->next;
    }
  fprintf (fp, "</pre>\n</body>\n</html>\n");
+ xfree (htcldir);
  xfree (upwd);
  if (!output_stream)
    fclose (fp);
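The hunk above separates two kinds of quoting: HTML-quoting for text that ends up in the page body, and URL-escaping for the file component placed inside the href. A rough sketch of the difference for a hypothetical entry named "foo &bar.txt" (the escaped forms shown are assumptions for illustration, not output captured from Wget):

    htclfile  = html_quote_string (f->name);              /* e.g. "foo &amp;bar.txt" -- safe as visible link text */
    urlclfile = url_escape_unsafe_and_reserved (f->name); /* e.g. "foo%20%26bar.txt" -- safe inside the href */

Using a single escaping for both spots yields either broken markup or a broken link, which is what this change avoids.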


@ -69,6 +69,7 @@ typedef struct
  struct url *proxy;            /* FTWK-style proxy */
} ccon;

+extern int numurls;

/* Look for regexp "( *[0-9]+ *byte" (literal parenthesis) anywhere in
   the string S, and return the number converted to wgint, if found, 0

@ -216,7 +217,7 @@ print_length (wgint size, wgint start, bool authoritative)
    logprintf (LOG_VERBOSE, " (%s)", human_readable (size));
  if (start > 0)
    {
-     if (start >= 1024)
+     if (size - start >= 1024)
        logprintf (LOG_VERBOSE, _(", %s (%s) remaining"),
                   number_to_static_string (size - start),
                   human_readable (size - start));

@ -1295,7 +1296,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con)
     number of bytes and files downloaded. */
  {
    total_downloaded_bytes += len;
-   opt.numurls++;
+   numurls++;
  }

  /* Deletion of listing files is not controlled by --delete-after, but

@ -1310,7 +1311,7 @@ ftp_loop_internal (struct url *u, struct fileinfo *f, ccon *con)
     for instance, may want to know how many bytes and files they've
     downloaded through it. */
  total_downloaded_bytes += len;
- opt.numurls++;
+ numurls++;

  if (opt.delete_after)
    {


@ -142,6 +142,8 @@ struct request {
  int hcount, hcapacity;
};

+extern int numurls;

/* Create a new, empty request. At least request_set_method must be
   called before the request can be used. */

@ -1496,9 +1498,10 @@ gethttp (struct url *u, struct http_stat *hs, int *dt, struct url *proxy,
  user = user ? user : (opt.http_user ? opt.http_user : opt.user);
  passwd = passwd ? passwd : (opt.http_passwd ? opt.http_passwd : opt.passwd);

- if (user && passwd
-     && !u->user) /* We only do "site-wide" authentication with "global"
-                     user/password values; URL user/password info overrides. */
+ /* We only do "site-wide" authentication with "global" user/password
+  * values unless --auth-no-challange has been requested; URL user/password
+  * info overrides. */
+ if (user && passwd && (!u->user || opt.auth_without_challenge))
    {
      /* If this is a host for which we've already received a Basic
       * challenge, we'll go ahead and send Basic authentication creds. */
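The reworked condition above lets --auth-no-challenge send Basic credentials preemptively even when they are embedded in the URL itself; an illustrative invocation (host and credentials are made up):

    wget --auth-no-challenge 'http://user:secret@intranet.example.com/report.pdf'

Without the option, credentials taken from the URL are only sent after the server answers with a 401 challenge.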
@ -2159,11 +2162,15 @@ File %s already there; not retrieving.\n\n"), quote (hs->local_file));
        }
    }

- if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE)
+ if (statcode == HTTP_STATUS_RANGE_NOT_SATISFIABLE
+     || (hs->restval > 0 && statcode == HTTP_STATUS_OK
+         && contrange == 0 && hs->restval >= contlen)
+    )
    {
      /* If `-c' is in use and the file has been fully downloaded (or
         the remote file has shrunk), Wget effectively requests bytes
-        after the end of file and the server response with 416.
+        after the end of file and the server response with 416
+        (or 200 with a <= Content-Length. */
      logputs (LOG_VERBOSE, _("\
\n The file is already fully retrieved; nothing to do.\n\n"));
      /* In case the caller inspects. */
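The widened test above also covers servers that answer a ranged re-request for an already complete file with a plain 200 and the full Content-Length instead of 416; an illustrative re-run (the URL is made up):

    wget -c http://example.com/pub/wget-1.12.tar.gz

If the local file already holds all contlen bytes, Wget now reports that the file is fully retrieved and stops, whichever of the two status codes the server returns.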
@ -2773,7 +2780,7 @@ Remote file exists.\n\n"));
                 number_to_static_string (hstat.contlen),
                 hstat.local_file, count);
    }
- ++opt.numurls;
+ ++numurls;
  total_downloaded_bytes += hstat.len;

  /* Remember that we downloaded the file for later ".orig" code. */

@ -2801,7 +2808,7 @@ Remote file exists.\n\n"));
                 tms, u->url, number_to_static_string (hstat.len),
                 hstat.local_file, count);
    }
- ++opt.numurls;
+ ++numurls;
  total_downloaded_bytes += hstat.len;

  /* Remember that we downloaded the file for later ".orig" code. */


@ -335,6 +335,8 @@ defaults (void)
opt.max_redirect = 20; opt.max_redirect = 20;
opt.waitretry = 10;
#ifdef ENABLE_IRI #ifdef ENABLE_IRI
opt.enable_iri = true; opt.enable_iri = true;
#else #else
@ -349,35 +351,41 @@ defaults (void)
char * char *
home_dir (void) home_dir (void)
{ {
char *home = getenv ("HOME"); static char buf[PATH_MAX];
static char *home;
if (!home) if (!home)
{ {
home = getenv ("HOME");
if (!home)
{
#if defined(MSDOS) #if defined(MSDOS)
/* Under MSDOS, if $HOME isn't defined, use the directory where /* Under MSDOS, if $HOME isn't defined, use the directory where
`wget.exe' resides. */ `wget.exe' resides. */
const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */ const char *_w32_get_argv0 (void); /* in libwatt.a/pcconfig.c */
char *p, buf[PATH_MAX]; char *p;
strcpy (buf, _w32_get_argv0 ()); strcpy (buf, _w32_get_argv0 ());
p = strrchr (buf, '/'); /* djgpp */ p = strrchr (buf, '/'); /* djgpp */
if (!p) if (!p)
p = strrchr (buf, '\\'); /* others */ p = strrchr (buf, '\\'); /* others */
assert (p); assert (p);
*p = '\0'; *p = '\0';
home = buf; home = buf;
#elif !defined(WINDOWS) #elif !defined(WINDOWS)
/* If HOME is not defined, try getting it from the password /* If HOME is not defined, try getting it from the password
file. */ file. */
struct passwd *pwd = getpwuid (getuid ()); struct passwd *pwd = getpwuid (getuid ());
if (!pwd || !pwd->pw_dir) if (!pwd || !pwd->pw_dir)
return NULL; return NULL;
home = pwd->pw_dir; strcpy (buf, pwd->pw_dir);
home = buf;
#else /* !WINDOWS */ #else /* !WINDOWS */
/* Under Windows, if $HOME isn't defined, use the directory where /* Under Windows, if $HOME isn't defined, use the directory where
`wget.exe' resides. */ `wget.exe' resides. */
home = ws_mypath (); home = ws_mypath ();
#endif /* WINDOWS */ #endif /* WINDOWS */
}
} }
return home ? xstrdup (home) : NULL; return home ? xstrdup (home) : NULL;
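(Sketch, not part of the patch: even with the new static cache, home_dir() still hands every caller its own xstrdup()'d copy, so the calling convention is unchanged; a typical call site, mirroring wgetrc_user_file_name() below, would be
    char *home = home_dir ();                      /* heap-allocated copy, or NULL */
    char *file = home ? aprintf ("%s/.wgetrc", home) : NULL;
    xfree_null (home);                             /* caller frees its copy; FILE is freed later */
)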
@ -403,12 +411,13 @@ wgetrc_env_file_name (void)
} }
return NULL; return NULL;
} }
/* Check for the existence of '$HOME/.wgetrc' and return its path /* Check for the existence of '$HOME/.wgetrc' and return its path
if it exists and is set. */ if it exists and is set. */
char * char *
wgetrc_user_file_name (void) wgetrc_user_file_name (void)
{ {
char *home = home_dir(); char *home = home_dir ();
char *file = NULL; char *file = NULL;
if (home) if (home)
file = aprintf ("%s/.wgetrc", home); file = aprintf ("%s/.wgetrc", home);
@ -422,6 +431,7 @@ wgetrc_user_file_name (void)
} }
return file; return file;
} }
/* Return the path to the user's .wgetrc. This is either the value of /* Return the path to the user's .wgetrc. This is either the value of
`WGETRC' environment variable, or `$HOME/.wgetrc'. `WGETRC' environment variable, or `$HOME/.wgetrc'.
@ -430,6 +440,7 @@ wgetrc_user_file_name (void)
char * char *
wgetrc_file_name (void) wgetrc_file_name (void)
{ {
char *home = NULL;
char *file = wgetrc_env_file_name (); char *file = wgetrc_env_file_name ();
if (file && *file) if (file && *file)
return file; return file;
@ -441,6 +452,7 @@ wgetrc_file_name (void)
`wget.ini' in the directory where `wget.exe' resides; we do this for `wget.ini' in the directory where `wget.exe' resides; we do this for
backward compatibility with previous versions of Wget. backward compatibility with previous versions of Wget.
SYSTEM_WGETRC should not be defined under WINDOWS. */ SYSTEM_WGETRC should not be defined under WINDOWS. */
home = home_dir ();
if (!file || !file_exists_p (file)) if (!file || !file_exists_p (file))
{ {
xfree_null (file); xfree_null (file);
@ -449,6 +461,7 @@ wgetrc_file_name (void)
if (home) if (home)
file = aprintf ("%s/wget.ini", home); file = aprintf ("%s/wget.ini", home);
} }
xfree_null (home);
#endif /* WINDOWS */ #endif /* WINDOWS */
if (!file) if (!file)


@ -72,8 +72,6 @@ extern char *system_getrc;
extern char *link_string; extern char *link_string;
/* defined in build_info.c */ /* defined in build_info.c */
extern char *compiled_features[]; extern char *compiled_features[];
extern char *system_wgetrc;
extern char *locale_dir;
/* Used for --version output in print_version */ /* Used for --version output in print_version */
static const int max_chars_per_line = 72; static const int max_chars_per_line = 72;
@ -82,6 +80,9 @@ static void redirect_output_signal (int);
#endif #endif
const char *exec_name; const char *exec_name;
/* Number of successfully downloaded URLs */
int numurls = 0;
#ifndef TESTING #ifndef TESTING
/* Initialize I18N/L10N. That amounts to invoking setlocale, and /* Initialize I18N/L10N. That amounts to invoking setlocale, and
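(Cross-reference: this definition, together with the `extern int numurls;' added near the `struct request' declaration earlier on this page, replaces the old opt.numurls field; the hunks above that turn `opt.numurls++' and `++opt.numurls' into the bare global all update this one counter, and options.h drops the field further down.)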
@ -711,20 +712,26 @@ prompt_for_password (void)
and an appropriate number of spaces are added on subsequent and an appropriate number of spaces are added on subsequent
lines.*/ lines.*/
static void static void
format_and_print_line (char* prefix, char* line, format_and_print_line (const char *prefix, const char *line,
int line_length) int line_length)
{ {
int leading_spaces;
int remaining_chars;
char *line_dup, *token;
assert (prefix != NULL); assert (prefix != NULL);
assert (line != NULL); assert (line != NULL);
line_dup = xstrdup (line);
if (line_length <= 0) if (line_length <= 0)
line_length = max_chars_per_line; line_length = max_chars_per_line;
const int leading_spaces = strlen (prefix); leading_spaces = strlen (prefix);
printf ("%s", prefix); printf ("%s", prefix);
int remaining_chars = line_length - leading_spaces; remaining_chars = line_length - leading_spaces;
/* We break on spaces. */ /* We break on spaces. */
char* token = strtok (line, " "); token = strtok (line_dup, " ");
while (token != NULL) while (token != NULL)
{ {
/* If however a token is much larger than the maximum /* If however a token is much larger than the maximum
@ -732,12 +739,7 @@ format_and_print_line (char* prefix, char* line,
token on the next line. */ token on the next line. */
if (remaining_chars <= strlen (token)) if (remaining_chars <= strlen (token))
{ {
printf ("\n"); printf ("\n%*c", leading_spaces, ' ');
int j = 0;
for (j = 0; j < leading_spaces; j++)
{
printf (" ");
}
remaining_chars = line_length - leading_spaces; remaining_chars = line_length - leading_spaces;
} }
printf ("%s ", token); printf ("%s ", token);
@ -746,8 +748,8 @@ format_and_print_line (char* prefix, char* line,
} }
printf ("\n"); printf ("\n");
xfree (prefix);
xfree (line); xfree (line_dup);
} }
static void static void
@ -760,13 +762,15 @@ print_version (void)
const char *link_title = "Link : "; const char *link_title = "Link : ";
const char *prefix_spaces = " "; const char *prefix_spaces = " ";
const int prefix_space_length = strlen (prefix_spaces); const int prefix_space_length = strlen (prefix_spaces);
char *line;
char *env_wgetrc, *user_wgetrc;
int i;
printf ("GNU Wget %s\n", version_string); printf ("GNU Wget %s\n", version_string);
printf (options_title); printf (options_title);
/* compiled_features is a char*[]. We limit the characters per /* compiled_features is a char*[]. We limit the characters per
line to max_chars_per_line and prefix each line with a constant line to max_chars_per_line and prefix each line with a constant
number of spaces for proper alignment. */ number of spaces for proper alignment. */
int i =0;
for (i = 0; compiled_features[i] != NULL; ) for (i = 0; compiled_features[i] != NULL; )
{ {
int line_length = max_chars_per_line - prefix_space_length; int line_length = max_chars_per_line - prefix_space_length;
@ -785,31 +789,36 @@ print_version (void)
/* Handle the case when $WGETRC is unset and $HOME/.wgetrc is /* Handle the case when $WGETRC is unset and $HOME/.wgetrc is
absent. */ absent. */
printf (wgetrc_title); printf (wgetrc_title);
char *env_wgetrc = wgetrc_env_file_name (); env_wgetrc = wgetrc_env_file_name ();
if (env_wgetrc && *env_wgetrc) if (env_wgetrc && *env_wgetrc)
{ {
printf ("%s (env)\n%s", env_wgetrc, prefix_spaces); printf ("%s (env)\n%s", env_wgetrc, prefix_spaces);
xfree (env_wgetrc); xfree (env_wgetrc);
} }
char *user_wgetrc = wgetrc_user_file_name (); user_wgetrc = wgetrc_user_file_name ();
if (user_wgetrc) if (user_wgetrc)
{ {
printf ("%s (user)\n%s", user_wgetrc, prefix_spaces); printf ("%s (user)\n%s", user_wgetrc, prefix_spaces);
xfree (user_wgetrc); xfree (user_wgetrc);
} }
printf ("%s (system)\n", system_wgetrc); #ifdef SYSTEM_WGETRC
printf ("%s (system)\n", SYSTEM_WGETRC);
#else
putchar ('\n');
#endif
format_and_print_line (strdup (locale_title), format_and_print_line (locale_title,
strdup (locale_dir), LOCALEDIR,
max_chars_per_line); max_chars_per_line);
format_and_print_line (strdup (compile_title), format_and_print_line (compile_title,
strdup (compilation_string), compilation_string,
max_chars_per_line); max_chars_per_line);
format_and_print_line (strdup (link_title), format_and_print_line (link_title,
strdup (link_string), link_string,
max_chars_per_line); max_chars_per_line);
printf ("\n"); printf ("\n");
/* TRANSLATORS: When available, an actual copyright character /* TRANSLATORS: When available, an actual copyright character
(circle-c) should be used in preference to "(C)". */ (circle-c) should be used in preference to "(C)". */
@ -826,9 +835,13 @@ There is NO WARRANTY, to the extent permitted by law.\n"), stdout);
stdout); stdout);
fputs (_("Currently maintained by Micah Cowan <micah@cowan.name>.\n"), fputs (_("Currently maintained by Micah Cowan <micah@cowan.name>.\n"),
stdout); stdout);
fputs (_("Please send bug reports and questions to <bug-wget@gnu.org>.\n"),
stdout);
exit (0); exit (0);
} }
char *program_name; /* Needed by lib/error.c. */
int int
main (int argc, char **argv) main (int argc, char **argv)
{ {
@ -837,6 +850,8 @@ main (int argc, char **argv)
int nurl, status; int nurl, status;
bool append_to_log = false; bool append_to_log = false;
program_name = argv[0];
i18n_initialize (); i18n_initialize ();
/* Construct the name of the executable, without the directory part. */ /* Construct the name of the executable, without the directory part. */
@ -1249,7 +1264,7 @@ WARNING: Can't reopen standard output in binary mode;\n\
logprintf (LOG_NOTQUIET, logprintf (LOG_NOTQUIET,
_("FINISHED --%s--\nDownloaded: %d files, %s in %s (%s)\n"), _("FINISHED --%s--\nDownloaded: %d files, %s in %s (%s)\n"),
datetime_str (time (NULL)), datetime_str (time (NULL)),
opt.numurls, numurls,
human_readable (total_downloaded_bytes), human_readable (total_downloaded_bytes),
secs_to_human_time (total_download_time), secs_to_human_time (total_download_time),
retr_rate (total_downloaded_bytes, total_download_time)); retr_rate (total_downloaded_bytes, total_download_time));


@ -78,6 +78,8 @@ as that of the covered work. */
# define strncasecmp strnicmp # define strncasecmp strnicmp
#endif #endif
#include <stdio.h>
/* The same for snprintf() and vsnprintf(). */ /* The same for snprintf() and vsnprintf(). */
#define snprintf _snprintf #define snprintf _snprintf
#define vsnprintf _vsnprintf #define vsnprintf _vsnprintf

View File

@ -124,10 +124,6 @@ struct options
SUM_SIZE_INT quota; /* Maximum file size to download and SUM_SIZE_INT quota; /* Maximum file size to download and
store. */ store. */
int numurls; /* Number of successfully downloaded
URLs #### should be removed because
it's not a setting, but a global var */
bool server_response; /* Do we print server response? */ bool server_response; /* Do we print server response? */
bool save_headers; /* Do we save headers together with bool save_headers; /* Do we save headers together with
file? */ file? */

View File

@ -393,7 +393,7 @@ fd_read_hunk (int fd, hunk_terminator_t terminator, long sizehint, long maxsize)
char *hunk = xmalloc (bufsize); char *hunk = xmalloc (bufsize);
int tail = 0; /* tail position in HUNK */ int tail = 0; /* tail position in HUNK */
assert (maxsize >= bufsize); assert (!maxsize || maxsize >= bufsize);
while (1) while (1)
{ {


@ -252,6 +252,15 @@ url_escape (const char *s)
return url_escape_1 (s, urlchr_unsafe, false); return url_escape_1 (s, urlchr_unsafe, false);
} }
/* URL-escape the unsafe and reserved characters (see urlchr_table) in
a given string, returning a freshly allocated string. */
char *
url_escape_unsafe_and_reserved (const char *s)
{
return url_escape_1 (s, urlchr_unsafe|urlchr_reserved, false);
}
/* URL-escape the unsafe characters (see urlchr_table) in a given /* URL-escape the unsafe characters (see urlchr_table) in a given
string. If no characters are unsafe, S is returned. */ string. If no characters are unsafe, S is returned. */
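(Usage sketch for the new helper above, assuming the usual wget memory conventions: like url_escape(), it always returns a freshly allocated string (its passthrough flag is false), so the caller frees the result unconditionally.
    char *escaped = url_escape_unsafe_and_reserved ("one path/with spaces&odd?chars");
    /* ... embed the percent-encoded text in a URL ... */
    xfree (escaped);
)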
@ -929,9 +938,9 @@ url_error (const char *url, int error_code)
if ((p = strchr (scheme, ':'))) if ((p = strchr (scheme, ':')))
*p = '\0'; *p = '\0';
if (!strcasecmp (scheme, "https")) if (!strcasecmp (scheme, "https"))
asprintf (&error, _("HTTPS support not compiled in")); error = aprintf (_("HTTPS support not compiled in"));
else else
asprintf (&error, _(parse_errors[error_code]), quote (scheme)); error = aprintf (_(parse_errors[error_code]), quote (scheme));
xfree (scheme); xfree (scheme);
return error; return error;


@ -83,6 +83,7 @@ struct url
/* Function declarations */ /* Function declarations */
char *url_escape (const char *); char *url_escape (const char *);
char *url_escape_unsafe_and_reserved (const char *);
struct url *url_parse (const char *, int *, struct iri *iri, bool percent_encode); struct url *url_parse (const char *, int *, struct iri *iri, bool percent_encode);
char *url_error (const char *, int); char *url_error (const char *, int);


@ -1,3 +1,84 @@
2008-11-26 Micah Cowan <micah@cowan.name> (not copyrightable)
* Test-ftp-iri-disabled.px, Test-ftp-iri-fallback.px,
Test-ftp-iri.px, Test-idn-cmd.px, Test-idn-headers.px,
Test-idn-meta.px, Test-iri-disabled.px,
Test-iri-forced-remote.px, Test-iri-list.px, Test-iri.px: More
module-scope warnings.
2008-11-25 Steven Schubiger <stsc@members.fsf.org>
* WgetTest.pm.in: Remove the magic interpreter line;
replace -w with lexical warnings.
2008-11-13 Steven Schubiger <stsc@members.fsf.org>
* FTPServer.pm, FTPTest.pm, HTTPServer.pm, HTTPTest.pm,
WgetTest.pm.in: Clean up leftover whitespace.
2008-11-12 Steven Schubiger <stsc@members.fsf.org>
* Test-auth-basic.px, Test-auth-no-challenge.px,
Test-auth-no-challenge-url.px, Test-c-full.px,
Test-c-partial.px, Test-c.px, Test-c-shorter.px,
Test-E-k-K.px, Test-E-k.px, Test-ftp.px,
Test-HTTP-Content-Disposition-1.px,
Test-HTTP-Content-Disposition-2.px,
Test-HTTP-Content-Disposition.px, Test-N-current.px,
Test-N-HTTP-Content-Disposition.px,
Test-N--no-content-disposition.px,
Test-N--no-content-disposition-trivial.px,
Test-N-no-info.px, Test--no-content-disposition.px,
Test--no-content-disposition-trivial.px, Test-N-old.px,
Test-nonexisting-quiet.px, Test-noop.px, Test-np.px,
Test-N.px, Test-N-smaller.px,
Test-O-HTTP-Content-Disposition.px, Test-O-nc.px,
Test-O--no-content-disposition.px,
Test-O--no-content-disposition-trivial.px,
Test-O-nonexisting.px, Test-O.px,
Test-proxy-auth-basic.px, Test-Restrict-Lowercase.px,
Test-Restrict-Uppercase.px,
Test--spider-fail.pxm, Test--spider.px,
Test--spider-r-HTTP-Content-Disposition.px,
Test--spider-r--no-content-disposition.px,
Test--spider-r--no-content-disposition-trivial.px,
Test--spider-r.px: Enforce lexically scoped warnings.
* Test-proxied-https-auth.px, run-px: Place use strict
before use warnings.
2008-11-12 Steven Schubiger <stsc@members.fsf.org>
* FTPServer.pm, FTPTest.pm, HTTPServer.pm, HTTPTest.pm:
Remove the magic interpreter line, because it cannot be
used fully. Substitute -w with use warnings.
2008-11-11 Micah Cowan <micah@cowan.name>
* HTTPServer.pm (handle_auth): Allow testing of
--auth-no-challenge.
* Test-auth-no-challenge.px, Test-auth-no-challenge-url.px:
Added.
* run-px: Add Test-auth-no-challenge.px,
Test-auth-no-challenge-url.px.
2008-11-07 Steven Schubiger <stsc@members.fsf.org>
* run-px: Use some colors for the summary part of the test
output to strengthen the distinction between a successful
or failing run.
2008-11-06 Steven Schubiger <stsc@members.fsf.org>
* run-px: When executing test scripts, invoke them with the
current perl executable name as determined by env.
2008-11-06 Micah Cowan <micah@cowan.name>
* run-px: Use strict (thanks Steven Schubiger!).
2008-09-09 Micah Cowan <micah@cowan.name> 2008-09-09 Micah Cowan <micah@cowan.name>
* Test-idn-cmd.px: Added. * Test-idn-cmd.px: Added.


@ -1,11 +1,10 @@
#!/usr/bin/perl -w
# Part of this code was borrowed from Richard Jones's Net::FTPServer # Part of this code was borrowed from Richard Jones's Net::FTPServer
# http://www.annexia.org/freeware/netftpserver # http://www.annexia.org/freeware/netftpserver
package FTPServer; package FTPServer;
use strict; use strict;
use warnings;
use Cwd; use Cwd;
use Socket; use Socket;
@ -354,12 +353,12 @@ sub _RETR_command
unless (defined $filename && length $filename) { unless (defined $filename && length $filename) {
print {$conn->{socket}} "550 File or directory not found.\r\n"; print {$conn->{socket}} "550 File or directory not found.\r\n";
return; return;
} }
if ($filename eq "." || $filename eq "..") { if ($filename eq "." || $filename eq "..") {
print {$conn->{socket}} "550 RETR command is not supported on directories.\r\n"; print {$conn->{socket}} "550 RETR command is not supported on directories.\r\n";
return; return;
} }
my $fullname = $conn->{rootdir} . $dir . $filename; my $fullname = $conn->{rootdir} . $dir . $filename;
@ -517,12 +516,12 @@ sub _SIZE_command
unless (defined $filename && length $filename) { unless (defined $filename && length $filename) {
print {$conn->{socket}} "550 File or directory not found.\r\n"; print {$conn->{socket}} "550 File or directory not found.\r\n";
return; return;
} }
if ($filename eq "." || $filename eq "..") { if ($filename eq "." || $filename eq "..") {
print {$conn->{socket}} "550 SIZE command is not supported on directories.\r\n"; print {$conn->{socket}} "550 SIZE command is not supported on directories.\r\n";
return; return;
} }
my $fullname = $conn->{rootdir} . $dir . $filename; my $fullname = $conn->{rootdir} . $dir . $filename;


@ -1,8 +1,7 @@
#!/usr/bin/perl -w
package FTPTest; package FTPTest;
use strict; use strict;
use warnings;
use FTPServer; use FTPServer;
use WgetTest; use WgetTest;


@ -1,8 +1,7 @@
#!/usr/bin/perl -w
package HTTPServer; package HTTPServer;
use strict; use strict;
use warnings;
use HTTP::Daemon; use HTTP::Daemon;
use HTTP::Status; use HTTP::Status;
@ -145,8 +144,7 @@ sub handle_auth {
my $authhdr = $req->header('Authorization'); my $authhdr = $req->header('Authorization');
# Have we sent the challenge yet? # Have we sent the challenge yet?
unless (defined $url_rec->{auth_challenged} unless ($url_rec->{auth_challenged} || $url_rec->{auth_no_challenge}) {
&& $url_rec->{auth_challenged}) {
# Since we haven't challenged yet, we'd better not # Since we haven't challenged yet, we'd better not
# have received authentication (for our testing purposes). # have received authentication (for our testing purposes).
if ($authhdr) { if ($authhdr) {
@ -167,6 +165,9 @@ sub handle_auth {
# failed it. # failed it.
$code = 400; $code = 400;
$msg = "You didn't send auth after I sent challenge"; $msg = "You didn't send auth after I sent challenge";
if ($url_rec->{auth_no_challenge}) {
$msg = "--auth-no-challenge but no auth sent."
}
} else { } else {
my ($sent_method) = ($authhdr =~ /^(\S+)/g); my ($sent_method) = ($authhdr =~ /^(\S+)/g);
unless ($sent_method eq $url_rec->{'auth_method'}) { unless ($sent_method eq $url_rec->{'auth_method'}) {


@ -1,8 +1,7 @@
#!/usr/bin/perl -w
package HTTPTest; package HTTPTest;
use strict; use strict;
use warnings;
use HTTPServer; use HTTPServer;
use WgetTest; use WgetTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -0,0 +1,50 @@
#!/usr/bin/perl
use strict;
use warnings;
use HTTPTest;
###############################################################################
my $wholefile = "You're all authenticated.\n";
# code, msg, headers, content
my %urls = (
'/needs-auth.txt' => {
auth_no_challenge => 1,
auth_method => 'Basic',
user => 'fiddle-dee-dee',
passwd => 'Dodgson',
code => "200",
msg => "You want fries with that?",
headers => {
"Content-type" => "text/plain",
},
content => $wholefile,
},
);
my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge "
. "http://fiddle-dee-dee:Dodgson\@localhost:{{port}}/needs-auth.txt";
my $expected_error_code = 0;
my %expected_downloaded_files = (
'needs-auth.txt' => {
content => $wholefile,
},
);
###############################################################################
my $the_test = HTTPTest->new (name => "Test-auth-no-challenge-url",
input => \%urls,
cmdline => $cmdline,
errcode => $expected_error_code,
output => \%expected_downloaded_files);
exit $the_test->run();
# vim: et ts=4 sw=4

tests/Test-auth-no-challenge.px (new executable file, 51 lines)

@ -0,0 +1,51 @@
#!/usr/bin/perl
use strict;
use warnings;
use HTTPTest;
###############################################################################
my $wholefile = "You're all authenticated.\n";
# code, msg, headers, content
my %urls = (
'/needs-auth.txt' => {
auth_no_challenge => 1,
auth_method => 'Basic',
user => 'fiddle-dee-dee',
passwd => 'Dodgson',
code => "200",
msg => "You want fries with that?",
headers => {
"Content-type" => "text/plain",
},
content => $wholefile,
},
);
my $cmdline = $WgetTest::WGETPATH . " --auth-no-challenge"
. " --user=fiddle-dee-dee --password=Dodgson"
. " http://localhost:{{port}}/needs-auth.txt";
my $expected_error_code = 0;
my %expected_downloaded_files = (
'needs-auth.txt' => {
content => $wholefile,
},
);
###############################################################################
my $the_test = HTTPTest->new (name => "Test-auth-no-challenge",
input => \%urls,
cmdline => $cmdline,
errcode => $expected_error_code,
output => \%expected_downloaded_files);
exit $the_test->run();
# vim: et ts=4 sw=4


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use FTPTest; use FTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use FTPTest; use FTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use FTPTest; use FTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use FTPTest; use FTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,6 +1,7 @@
#!/usr/bin/perl #!/usr/bin/perl
use warnings;
use strict; use strict;
use warnings;
use WgetTest; # For $WGETPATH. use WgetTest; # For $WGETPATH.


@ -1,6 +1,7 @@
#!/usr/bin/perl -w #!/usr/bin/perl
use strict; use strict;
use warnings;
use HTTPTest; use HTTPTest;


@ -1,5 +1,3 @@
#!/usr/bin/perl -w
# WARNING! # WARNING!
# WgetTest.pm is a generated file! Do not edit! Edit WgetTest.pm.in # WgetTest.pm is a generated file! Do not edit! Edit WgetTest.pm.in
# instead. # instead.
@ -8,6 +6,7 @@ package WgetTest;
$VERSION = 0.01; $VERSION = 0.01;
use strict; use strict;
use warnings;
use Cwd; use Cwd;
use File::Path; use File::Path;


@ -1,11 +1,19 @@
#!/usr/bin/env perl #!/usr/bin/env perl
use 5.006;
use strict;
use warnings; use warnings;
use Term::ANSIColor ':constants';
$Term::ANSIColor::AUTORESET = 1;
die "Please specify the top source directory.\n" if (!@ARGV); die "Please specify the top source directory.\n" if (!@ARGV);
my $top_srcdir = shift @ARGV; my $top_srcdir = shift @ARGV;
my @tests = ( my @tests = (
'Test-auth-basic.px', 'Test-auth-basic.px',
'Test-auth-no-challenge.px',
'Test-auth-no-challenge-url.px',
'Test-proxy-auth-basic.px', 'Test-proxy-auth-basic.px',
'Test-proxied-https-auth.px', 'Test-proxied-https-auth.px',
'Test-N-HTTP-Content-Disposition.px', 'Test-N-HTTP-Content-Disposition.px',
@ -57,26 +65,56 @@ my @tests = (
'Test--spider-r.px', 'Test--spider-r.px',
); );
my @results; my @tested;
for my $test (@tests) { foreach my $test (@tests) {
print "Running $test\n\n"; print "Running $test\n\n";
system("$top_srcdir/tests/$test"); system("$^X $top_srcdir/tests/$test");
push @results, $?; push @tested, { name => $test, result => $? };
}
for (my $i=0; $i != @tests; ++$i) {
if ($results[$i] == 0) {
print "pass: ";
} else {
print "FAIL: ";
}
print "$tests[$i]\n";
} }
print "\n"; print "\n";
print scalar(@results) . " tests were run\n"; foreach my $test (@tested) {
print scalar(grep $_ == 0, @results) . " PASS\n"; ($test->{result} == 0)
print scalar(grep $_ != 0, @results) . " FAIL\n"; ? print GREEN 'pass: '
: print RED 'FAIL: ';
print $test->{name}, "\n";
}
exit scalar (grep $_ != 0, @results); my $count = sub
{
return {
pass => sub { scalar grep $_->{result} == 0, @tested },
fail => sub { scalar grep $_->{result} != 0, @tested },
}->{$_[0]}->();
};
my $summary = sub
{
my @lines = (
"${\scalar @tested} tests were run",
"${\$count->('pass')} PASS, ${\$count->('fail')} FAIL",
);
my $len_longest = sub
{
local $_ = 0;
foreach my $line (@lines) {
if (length $line > $_) {
$_ = length $line;
}
}
return $_;
}->();
return join "\n",
'=' x $len_longest,
@lines,
'=' x $len_longest;
}->();
print "\n";
print $count->('fail')
? RED $summary
: GREEN $summary;
print "\n";
exit $count->('fail');
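(Aside, not part of the patch: the $count closure above is just two greps over @tested kept behind one name; an equivalent, more conventional spelling would be
    my $fail = grep { $_->{result} != 0 } @tested;   # non-zero exit status => FAIL
    my $pass = @tested - $fail;
which may help when reading the $summary block that follows it.)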

util/freeopts (new executable file, 48 lines)

@ -0,0 +1,48 @@
#!/usr/bin/perl -n
# NOTE the use of -n above; this script is called in a loop.
use warnings;
use strict;
our $scanning;
our %used_chars;
BEGIN {
$scanning = 0;
%used_chars = ();
open STDIN, "../src/main.c" or die "main.c: $!\n";
}
if (/^static struct cmdline_option option_data/) {
$scanning = 1;
}
elsif (/[}];/) {
$scanning = 0;
}
elsif (
$scanning &&
/^[\t ]*\{ "[^"]*", '(.)', OPT_[A-Z0-9_]*, /
) {
$used_chars{$1} = 1;
}
END {
my $cols = 0;
my $max_cols = 13;
my $opt_chars =
"ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789";
print "Free chars:\n\t";
for (my $i = 0; $i < length $opt_chars; ++$i, ++$cols) {
if ($cols == $max_cols) {
$cols = 0;
print "\n\t";
}
my $opt = substr($opt_chars,$i,1);
print ' ';
if (!$used_chars{ $opt }) {
print "-$opt";
} else {
print ' ';
}
}
print "\n";
}
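(Usage note, inferred from the script itself rather than documented anywhere: because STDIN is re-opened on the hard-coded path "../src/main.c", freeopts is meant to be run from inside util/, e.g. `cd util && ./freeopts'; it then prints which single-character option letters are still unused by the option_data table in main.c.)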


@ -1,3 +1,11 @@
2008-09-09 Gisle Vanem <gvanem@broadpark.no>
* config-compiler.h: MinGW does have <stdint.h>; added HAVE_STDINT_H.
Added _CRT_SECURE_NO_WARNINGS to suppress warnings in MSVC8+ about
using "old" ANSI functions.
* config.h: config-post.h is gone. SIZEOF_LONG_LONG is 8.
2008-01-25 Micah Cowan <micah@cowan.name> 2008-01-25 Micah Cowan <micah@cowan.name>
* Makefile.am, Makefile.doc, Makefile.src, Makefile.top, * Makefile.am, Makefile.doc, Makefile.src, Makefile.top,


@ -83,6 +83,7 @@ as that of the covered work. */
/* MinGW and GCC support some POSIX and C99 features. */ /* MinGW and GCC support some POSIX and C99 features. */
#define HAVE_INTTYPES_H 1 #define HAVE_INTTYPES_H 1
#define HAVE_STDINT_H 1
#define HAVE__BOOL 1 #define HAVE__BOOL 1
#undef SIZEOF_LONG_LONG /* avoid redefinition warning */ #undef SIZEOF_LONG_LONG /* avoid redefinition warning */
@ -128,6 +129,7 @@ as that of the covered work. */
#if _MSC_VER >= 1400 #if _MSC_VER >= 1400
#pragma warning ( disable : 4996 ) #pragma warning ( disable : 4996 )
#define _CRT_SECURE_NO_DEPRECATE #define _CRT_SECURE_NO_DEPRECATE
#define _CRT_SECURE_NO_WARNINGS
#endif #endif


@ -158,7 +158,7 @@
#define SIZEOF_LONG 4 #define SIZEOF_LONG 4
/* The size of a `long long', as computed by sizeof. */ /* The size of a `long long', as computed by sizeof. */
#define SIZEOF_LONG_LONG 0 #define SIZEOF_LONG_LONG 8
/* The size of a `off_t', as computed by sizeof. */ /* The size of a `off_t', as computed by sizeof. */
#define SIZEOF_OFF_T 4 #define SIZEOF_OFF_T 4
@ -214,5 +214,3 @@
/* Include compiler-specific defines. */ /* Include compiler-specific defines. */
#include "config-compiler.h" #include "config-compiler.h"
#include "config-post.h"