\input texinfo @c -*-texinfo-*-

@c %**start of header
@setfilename wget.info
@include version.texi
@settitle GNU Wget @value{VERSION} Manual
@c Disable the monstrous rectangles beside overfull hbox-es.
@finalout
@c Use `odd' to print double-sided.
@setchapternewpage on
@c %**end of header

@iftex
@c Remove this if you don't use A4 paper.
@afourpaper
@end iftex

@c Title for man page. The weird way texi2pod.pl is written requires
@c the preceding @set.
@set Wget Wget
@c man title Wget The non-interactive network downloader.

@dircategory Network applications
@direntry
* Wget: (wget). Non-interactive network downloader.
@end direntry

@copying
This file documents the GNU Wget utility for downloading network
data.

@c man begin COPYRIGHT
Copyright @copyright{} 1996, 1997, 1998, 1999, 2000, 2001, 2002, 2003,
2004, 2005, 2006, 2007, 2008, 2009, 2010, 2011 Free Software Foundation,
Inc.

@iftex
Permission is granted to make and distribute verbatim copies of
this manual provided the copyright notice and this permission notice
are preserved on all copies.
@end iftex

@ignore
Permission is granted to process this file through TeX and print the
results, provided the printed document carries a copying permission
notice identical to this one except for the removal of this paragraph
(this paragraph not being relevant to the printed manual).
@end ignore

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.3 or
any later version published by the Free Software Foundation; with no
Invariant Sections, with no Front-Cover Texts, and with no Back-Cover
Texts. A copy of the license is included in the section entitled
``GNU Free Documentation License''.
@c man end
@end copying

@titlepage
@title GNU Wget @value{VERSION}
@subtitle The non-interactive download utility
@subtitle Updated for Wget @value{VERSION}, @value{UPDATED}
@author by Hrvoje Nik@v{s}i@'{c} and others

@ignore
@c man begin AUTHOR
Originally written by Hrvoje Niksic <hniksic@@xemacs.org>.
@c man end
@c man begin SEEALSO
This is @strong{not} the complete manual for GNU Wget.
For more complete information, including more detailed explanations of
some of the options, and a number of commands available
for use with @file{.wgetrc} files and the @samp{-e} option, see the GNU
Info entry for @file{wget}.
@c man end
@end ignore

@page
@vskip 0pt plus 1filll
@insertcopying
@end titlepage

@contents

@ifnottex
@node Top, Overview, (dir), (dir)
@top Wget @value{VERSION}

@insertcopying
@end ifnottex

@menu
* Overview:: Features of Wget.
* Invoking:: Wget command-line arguments.
* Recursive Download:: Downloading interlinked pages.
* Following Links:: The available methods of chasing links.
* Time-Stamping:: Mirroring according to time-stamps.
* Startup File:: Wget's initialization file.
* Examples:: Examples of usage.
* Various:: The stuff that doesn't fit anywhere else.
* Appendices:: Some useful references.
* Copying this manual:: You may give out copies of this manual.
* Concept Index:: Topics covered by this manual.
@end menu

@node Overview, Invoking, Top, Top
@chapter Overview
@cindex overview
@cindex features

@c man begin DESCRIPTION
GNU Wget is a free utility for non-interactive download of files from
the Web. It supports @sc{http}, @sc{https}, and @sc{ftp} protocols, as
well as retrieval through @sc{http} proxies.

@c man end
This chapter is a partial overview of Wget's features.

@itemize @bullet
@item
@c man begin DESCRIPTION
Wget is non-interactive, meaning that it can work in the background,
while the user is not logged on. This allows you to start a retrieval
and disconnect from the system, letting Wget finish the work. By
contrast, most Web browsers require the user's constant presence,
which can be a great hindrance when transferring a lot of data.
@c man end

@item
@ignore
@c man begin DESCRIPTION

@c man end
@end ignore
@c man begin DESCRIPTION
Wget can follow links in @sc{html}, @sc{xhtml}, and @sc{css} pages, to
create local versions of remote web sites, fully recreating the
directory structure of the original site. This is sometimes referred to
as ``recursive downloading.'' While doing that, Wget respects the Robot
Exclusion Standard (@file{/robots.txt}). Wget can be instructed to
convert the links in downloaded files to point at the local files, for
offline viewing.
@c man end

@item
File name wildcard matching and recursive mirroring of directories are
available when retrieving via @sc{ftp}. Wget can read the time-stamp
information given by both @sc{http} and @sc{ftp} servers, and store it
locally. Thus Wget can see if the remote file has changed since last
retrieval, and automatically retrieve the new version if it has. This
makes Wget suitable for mirroring of @sc{ftp} sites, as well as home
pages.

@item
@ignore
@c man begin DESCRIPTION

@c man end
@end ignore
@c man begin DESCRIPTION
Wget has been designed for robustness over slow or unstable network
connections; if a download fails due to a network problem, it will
keep retrying until the whole file has been retrieved. If the server
supports regetting, it will instruct the server to continue the
download from where it left off.
@c man end

@item
Wget supports proxy servers, which can lighten the network load, speed
up retrieval and provide access behind firewalls. Wget uses passive
@sc{ftp} downloading by default, active @sc{ftp} being an option.

@item
Wget supports IP version 6, the next generation of IP. IPv6 is
autodetected at compile-time, and can be disabled at either build or
run time. Binaries built with IPv6 support work well in both
IPv4-only and dual family environments.

@item
Built-in features offer mechanisms to tune which links you wish to follow
(@pxref{Following Links}).

@item
The progress of individual downloads is traced using a progress gauge.
Interactive downloads are tracked using a ``thermometer''-style gauge,
whereas non-interactive ones are traced with dots, each dot
representing a fixed amount of data received (1KB by default). Either
gauge can be customized to your preferences.

@item
Most of the features are fully configurable, either through command line
options, or via the initialization file @file{.wgetrc} (@pxref{Startup
File}). Wget allows you to define @dfn{global} startup files
(@file{/usr/local/etc/wgetrc} by default) for site settings. You can also
specify the location of a startup file with the @samp{--config} option.

@ignore
@c man begin FILES
@table @samp
@item /usr/local/etc/wgetrc
Default location of the @dfn{global} startup file.

@item .wgetrc
User startup file.
@end table
@c man end
@end ignore

@item
Finally, GNU Wget is free software. This means that everyone may use
it, redistribute it and/or modify it under the terms of the GNU General
Public License, as published by the Free Software Foundation (see the
file @file{COPYING} that came with GNU Wget, for details).
@end itemize

@node Invoking, Recursive Download, Overview, Top
@chapter Invoking
@cindex invoking
@cindex command line
@cindex arguments
@cindex nohup

By default, Wget is very simple to invoke. The basic syntax is:

@example
@c man begin SYNOPSIS
wget [@var{option}]@dots{} [@var{URL}]@dots{}
@c man end
@end example

Wget will simply download all the @sc{url}s specified on the command
line. @var{URL} is a @dfn{Uniform Resource Locator}, as defined below.

However, you may wish to change some of the default parameters of
Wget. You can do it two ways: permanently, adding the appropriate
command to @file{.wgetrc} (@pxref{Startup File}), or specifying it on
the command line.

@menu
* URL Format::
* Option Syntax::
* Basic Startup Options::
* Logging and Input File Options::
* Download Options::
* Directory Options::
* HTTP Options::
* HTTPS (SSL/TLS) Options::
* FTP Options::
* Recursive Retrieval Options::
* Recursive Accept/Reject Options::
* Exit Status::
@end menu

@node URL Format, Option Syntax, Invoking, Invoking
@section URL Format
@cindex URL
@cindex URL syntax

@dfn{URL} is an acronym for Uniform Resource Locator. A uniform
resource locator is a compact string representation for a resource
available via the Internet. Wget recognizes the @sc{url} syntax as per
@sc{rfc1738}. This is the most widely used form (square brackets denote
optional parts):

@example
http://host[:port]/directory/file
ftp://host[:port]/directory/file
@end example

You can also encode your username and password within a @sc{url}:

@example
ftp://user:password@@host/path
http://user:password@@host/path
@end example

Either @var{user} or @var{password}, or both, may be left out. If you
leave out either the @sc{http} username or password, no authentication
will be sent. If you leave out the @sc{ftp} username, @samp{anonymous}
will be used. If you leave out the @sc{ftp} password, your email
address will be supplied as a default password.@footnote{If you have a
@file{.netrc} file in your home directory, password will also be
searched for there.}

@strong{Important Note}: if you specify a password-containing @sc{url}
on the command line, the username and password will be plainly visible
to all users on the system, by way of @code{ps}. On multi-user systems,
this is a big security risk. To work around it, use @code{wget -i -}
and feed the @sc{url}s to Wget's standard input, each on a separate
line, terminated by @kbd{C-d}.
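
For example, a minimal sketch of this approach (the host name and
credentials below are placeholders, not taken from this manual):

@example
printf 'ftp://user:password@@ftp.example.com/file.txt\n' | wget -i -
@end example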

You can encode unsafe characters in a @sc{url} as @samp{%xy}, @code{xy}
being the hexadecimal representation of the character's @sc{ascii}
value. Some common unsafe characters include @samp{%} (quoted as
@samp{%25}), @samp{:} (quoted as @samp{%3A}), and @samp{@@} (quoted as
@samp{%40}). Refer to @sc{rfc1738} for a comprehensive list of unsafe
characters.

Wget also supports the @code{type} feature for @sc{ftp} @sc{url}s. By
default, @sc{ftp} documents are retrieved in the binary mode (type
@samp{i}), which means that they are downloaded unchanged. Another
useful mode is the @samp{a} (@dfn{ASCII}) mode, which converts the line
delimiters between the different operating systems, and is thus useful
for text files. Here is an example:

@example
ftp://host/directory/file;type=a
@end example

Two alternative variants of @sc{url} specification are also supported,
because of historical (hysterical?) reasons and their widespread use.

@sc{ftp}-only syntax (supported by @code{NcFTP}):
@example
host:/dir/file
@end example

@sc{http}-only syntax (introduced by @code{Netscape}):
@example
host[:port]/dir/file
@end example

These two alternative forms are deprecated, and may cease being
supported in the future.

If you do not understand the difference between these notations, or do
not know which one to use, just use the plain ordinary format you use
with your favorite browser, like @code{Lynx} or @code{Netscape}.

@c man begin OPTIONS

@node Option Syntax, Basic Startup Options, URL Format, Invoking
@section Option Syntax
@cindex option syntax
@cindex syntax of options

Since Wget uses GNU getopt to process command-line arguments, every
option has a long form along with the short one. Long options are
more convenient to remember, but take time to type. You may freely
mix different option styles, or specify options after the command-line
arguments. Thus you may write:

@example
wget -r --tries=10 http://fly.srk.fer.hr/ -o log
@end example

The space between the option accepting an argument and the argument may
be omitted. Instead of @samp{-o log} you can write @samp{-olog}.

You may put several options that do not require arguments together,
like:

@example
wget -drc @var{URL}
@end example

This is completely equivalent to:

@example
wget -d -r -c @var{URL}
@end example

Since the options can be specified after the arguments, you may
terminate them with @samp{--}. So the following will try to download
@sc{url} @samp{-x}, reporting failure to @file{log}:

@example
wget -o log -- -x
@end example

The options that accept comma-separated lists all respect the convention
that specifying an empty list clears its value. This can be useful to
clear the @file{.wgetrc} settings. For instance, if your @file{.wgetrc}
sets @code{exclude_directories} to @file{/cgi-bin}, the following
example will first reset it, and then set it to exclude @file{/~nobody}
and @file{/~somebody}. You can also clear the lists in @file{.wgetrc}
(@pxref{Wgetrc Syntax}).

@example
wget -X '' -X /~nobody,/~somebody
@end example

Most options that do not accept arguments are @dfn{boolean} options,
so named because their state can be captured with a yes-or-no
(``boolean'') variable. For example, @samp{--follow-ftp} tells Wget
to follow FTP links from HTML files and, on the other hand,
@samp{--no-glob} tells it not to perform file globbing on FTP URLs. A
boolean option is either @dfn{affirmative} or @dfn{negative}
(beginning with @samp{--no}). All such options share several
properties.

Unless stated otherwise, it is assumed that the default behavior is
the opposite of what the option accomplishes. For example, the
documented existence of @samp{--follow-ftp} assumes that the default
is to @emph{not} follow FTP links from HTML pages.

Affirmative options can be negated by prepending @samp{--no-} to
the option name; negative options can be negated by omitting the
@samp{--no-} prefix. This might seem superfluous---if the default for
an affirmative option is to not do something, then why provide a way
to explicitly turn it off? But the startup file may in fact change
the default. For instance, using @code{follow_ftp = on} in
@file{.wgetrc} makes Wget @emph{follow} FTP links by default, and
using @samp{--no-follow-ftp} is the only way to restore the factory
default from the command line.
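
For instance, a minimal sketch of overriding such a startup-file
default for a single run (the URL is a placeholder):

@example
wget --no-follow-ftp -r http://www.example.com/
@end example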

@node Basic Startup Options, Logging and Input File Options, Option Syntax, Invoking
@section Basic Startup Options

@table @samp
@item -V
@itemx --version
Display the version of Wget.

@item -h
@itemx --help
Print a help message describing all of Wget's command-line options.

@item -b
@itemx --background
Go to background immediately after startup. If no output file is
specified via @samp{-o}, output is redirected to @file{wget-log}.

@cindex execute wgetrc command
@item -e @var{command}
@itemx --execute @var{command}
Execute @var{command} as if it were a part of @file{.wgetrc}
(@pxref{Startup File}). A command thus invoked will be executed
@emph{after} the commands in @file{.wgetrc}, thus taking precedence over
them. If you need to specify more than one wgetrc command, use multiple
instances of @samp{-e}.
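
As a sketch, the wgetrc command shown in the previous section could be
supplied on the command line like this (the URL is a placeholder):

@example
wget -e "follow_ftp = on" -r http://www.example.com/
@end example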

@end table

@node Logging and Input File Options, Download Options, Basic Startup Options, Invoking
@section Logging and Input File Options

@table @samp
@cindex output file
@cindex log file
@item -o @var{logfile}
@itemx --output-file=@var{logfile}
Log all messages to @var{logfile}. The messages are normally reported
to standard error.

@cindex append to log
@item -a @var{logfile}
@itemx --append-output=@var{logfile}
Append to @var{logfile}. This is the same as @samp{-o}, only it appends
to @var{logfile} instead of overwriting the old log file. If
@var{logfile} does not exist, a new file is created.

@cindex debug
@item -d
@itemx --debug
Turn on debug output, meaning various information important to the
developers of Wget if it does not work properly. Your system
administrator may have chosen to compile Wget without debug support, in
which case @samp{-d} will not work. Please note that compiling with
debug support is always safe---Wget compiled with the debug support will
@emph{not} print any debug info unless requested with @samp{-d}.
@xref{Reporting Bugs}, for more information on how to use @samp{-d} for
sending bug reports.

@cindex quiet
@item -q
@itemx --quiet
Turn off Wget's output.

@cindex verbose
@item -v
@itemx --verbose
Turn on verbose output, with all the available data. The default output
is verbose.

@item -nv
@itemx --no-verbose
Turn off verbose without being completely quiet (use @samp{-q} for
that), which means that error messages and basic information still get
printed.

@item --report-speed=@var{type}
Output bandwidth as @var{type}. The only accepted value is @samp{bits}.

@cindex input-file
@item -i @var{file}
@itemx --input-file=@var{file}
Read @sc{url}s from a local or external @var{file}. If @samp{-} is
specified as @var{file}, @sc{url}s are read from the standard input.
(Use @samp{./-} to read from a file literally named @samp{-}.)

If this function is used, no @sc{url}s need be present on the command
line. If there are @sc{url}s both on the command line and in an input
file, those on the command lines will be the first ones to be
retrieved. If @samp{--force-html} is not specified, then @var{file}
should consist of a series of URLs, one per line.

However, if you specify @samp{--force-html}, the document will be
regarded as @samp{html}. In that case you may have problems with
relative links, which you can solve either by adding @code{<base
href="@var{url}">} to the documents or by specifying
@samp{--base=@var{url}} on the command line.

If the @var{file} is an external one, the document will be automatically
treated as @samp{html} if the Content-Type matches @samp{text/html}.
Furthermore, the @var{file}'s location will be implicitly used as base
href if none was specified.

@cindex force html
@item -F
@itemx --force-html
When input is read from a file, force it to be treated as an @sc{html}
file. This enables you to retrieve relative links from existing
@sc{html} files on your local disk, by adding @code{<base
href="@var{url}">} to @sc{html}, or using the @samp{--base} command-line
option.

@cindex base for relative links in input file
@item -B @var{URL}
@itemx --base=@var{URL}
Resolves relative links using @var{URL} as the point of reference,
when reading links from an HTML file specified via the
@samp{-i}/@samp{--input-file} option (together with
@samp{--force-html}, or when the input file was fetched remotely from
a server describing it as @sc{html}). This is equivalent to the
presence of a @code{BASE} tag in the @sc{html} input file, with
@var{URL} as the value for the @code{href} attribute.

For instance, if you specify @samp{http://foo/bar/a.html} for
@var{URL}, and Wget reads @samp{../baz/b.html} from the input file, it
would be resolved to @samp{http://foo/baz/b.html}.
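
Putting the pieces together, a sketch of the invocation described
above (the input file name here is hypothetical):

@example
wget --force-html --base=http://foo/bar/a.html -i links.html
@end example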

@cindex specify config
@item --config=@var{FILE}
Specify the location of a startup file you wish to use.
@end table

@node Download Options, Directory Options, Logging and Input File Options, Invoking
@section Download Options

@table @samp
@cindex bind address
@cindex client IP address
@cindex IP address, client
@item --bind-address=@var{ADDRESS}
When making client TCP/IP connections, bind to @var{ADDRESS} on
the local machine. @var{ADDRESS} may be specified as a hostname or IP
address. This option can be useful if your machine is bound to multiple
IPs.

@cindex retries
@cindex tries
@cindex number of tries
@item -t @var{number}
@itemx --tries=@var{number}
Set number of tries to @var{number}. Specify 0 or @samp{inf} for
infinite retrying. The default is to retry 20 times, with the exception
of fatal errors like ``connection refused'' or ``not found'' (404),
which are not retried.
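
For example, a sketch that limits Wget to three attempts per file (the
URL is a placeholder):

@example
wget --tries=3 http://www.example.com/file.zip
@end example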

@item -O @var{file}
@itemx --output-document=@var{file}
The documents will not be written to the appropriate files, but all
will be concatenated together and written to @var{file}. If @samp{-}
is used as @var{file}, documents will be printed to standard output,
disabling link conversion. (Use @samp{./-} to print to a file
literally named @samp{-}.)

Use of @samp{-O} is @emph{not} intended to mean simply ``use the name
@var{file} instead of the one in the URL;'' rather, it is
analogous to shell redirection:
@samp{wget -O file http://foo} is intended to work like
@samp{wget -O - http://foo > file}; @file{file} will be truncated
immediately, and @emph{all} downloaded content will be written there.

For this reason, @samp{-N} (for timestamp-checking) is not supported
in combination with @samp{-O}: since @var{file} is always newly
created, it will always have a very new timestamp. A warning will be
issued if this combination is used.

Similarly, using @samp{-r} or @samp{-p} with @samp{-O} may not work as
you expect: Wget won't just download the first file to @var{file} and
then download the rest to their normal names: @emph{all} downloaded
content will be placed in @var{file}. This was disabled in version
1.11, but has been reinstated (with a warning) in 1.11.2, as there are
some cases where this behavior can actually have some use.

Note that a combination with @samp{-k} is only permitted when
downloading a single document, as in that case it will just convert
all relative URIs to external ones; @samp{-k} makes no sense for
multiple URIs when they're all being downloaded to a single file;
@samp{-k} can be used only when the output is a regular file.
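
As a sketch of the redirection analogy above, the two commands below
are intended to behave the same way (the URL is a placeholder):

@example
wget -O page.html http://www.example.com/
wget -O - http://www.example.com/ > page.html
@end example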

@cindex clobbering, file
@cindex downloading multiple times
@cindex no-clobber
@item -nc
@itemx --no-clobber
If a file is downloaded more than once in the same directory, Wget's
behavior depends on a few options, including @samp{-nc}. In certain
cases, the local file will be @dfn{clobbered}, or overwritten, upon
repeated download. In other cases it will be preserved.

When running Wget without @samp{-N}, @samp{-nc}, @samp{-r}, or
@samp{-p}, downloading the same file in the same directory will result
in the original copy of @var{file} being preserved and the second copy
being named @samp{@var{file}.1}. If that file is downloaded yet
again, the third copy will be named @samp{@var{file}.2}, and so on.
(This is also the behavior with @samp{-nd}, even if @samp{-r} or
@samp{-p} are in effect.) When @samp{-nc} is specified, this behavior
is suppressed, and Wget will refuse to download newer copies of
@samp{@var{file}}. Therefore, ``@code{no-clobber}'' is actually a
misnomer in this mode---it's not clobbering that's prevented (as the
numeric suffixes were already preventing clobbering), but rather the
multiple version saving that's prevented.

When running Wget with @samp{-r} or @samp{-p}, but without @samp{-N},
@samp{-nd}, or @samp{-nc}, re-downloading a file will result in the
new copy simply overwriting the old. Adding @samp{-nc} will prevent
this behavior, instead causing the original version to be preserved
and any newer copies on the server to be ignored.

When running Wget with @samp{-N}, with or without @samp{-r} or
@samp{-p}, the decision as to whether or not to download a newer copy
of a file depends on the local and remote timestamp and size of the
file (@pxref{Time-Stamping}). @samp{-nc} may not be specified at the
same time as @samp{-N}.

Note that when @samp{-nc} is specified, files with the suffixes
@samp{.html} or @samp{.htm} will be loaded from the local disk and
parsed as if they had been retrieved from the Web.

@cindex backing up files
@item --backups=@var{backups}
Before (over)writing a file, back up an existing file by adding a
@samp{.1} suffix (@samp{_1} on VMS) to the file name. Such backup
files are rotated to @samp{.2}, @samp{.3}, and so on, up to
@var{backups} (and lost beyond that).

@cindex continue retrieval
@cindex incomplete downloads
@cindex resume download
@item -c
@itemx --continue
Continue getting a partially-downloaded file. This is useful when you
want to finish up a download started by a previous instance of Wget, or
by another program. For instance:

@example
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
@end example

If there is a file named @file{ls-lR.Z} in the current directory, Wget
will assume that it is the first portion of the remote file, and will
ask the server to continue the retrieval from an offset equal to the
length of the local file.

Note that you don't need to specify this option if you just want the
current invocation of Wget to retry downloading a file should the
connection be lost midway through. This is the default behavior.
@samp{-c} only affects resumption of downloads started @emph{prior} to
this invocation of Wget, and whose local files are still sitting around.

Without @samp{-c}, the previous example would just download the remote
file to @file{ls-lR.Z.1}, leaving the truncated @file{ls-lR.Z} file
alone.

Beginning with Wget 1.7, if you use @samp{-c} on a non-empty file, and
it turns out that the server does not support continued downloading,
Wget will refuse to start the download from scratch, which would
effectively ruin existing contents. If you really want the download to
start from scratch, remove the file.

Also beginning with Wget 1.7, if you use @samp{-c} on a file which is of
equal size as the one on the server, Wget will refuse to download the
file and print an explanatory message. The same happens when the file
is smaller on the server than locally (presumably because it was changed
on the server since your last download attempt)---because ``continuing''
is not meaningful, no download occurs.

On the other side of the coin, while using @samp{-c}, any file that's
bigger on the server than locally will be considered an incomplete
download and only @code{(length(remote) - length(local))} bytes will be
downloaded and tacked onto the end of the local file. This behavior can
be desirable in certain cases---for instance, you can use @samp{wget -c}
to download just the new portion that's been appended to a data
collection or log file.

However, if the file is bigger on the server because it's been
@emph{changed}, as opposed to just @emph{appended} to, you'll end up
with a garbled file. Wget has no way of verifying that the local file
is really a valid prefix of the remote file. You need to be especially
careful of this when using @samp{-c} in conjunction with @samp{-r},
since every file will be considered as an ``incomplete download'' candidate.

Another instance where you'll get a garbled file if you try to use
@samp{-c} is if you have a lame @sc{http} proxy that inserts a
``transfer interrupted'' string into the local file. In the future a
``rollback'' option may be added to deal with this case.

Note that @samp{-c} only works with @sc{ftp} servers and with @sc{http}
servers that support the @code{Range} header.

@cindex offset
@cindex continue retrieval
@cindex incomplete downloads
@cindex resume download
@cindex start position
@item --start-pos=@var{OFFSET}
Start downloading at zero-based position @var{OFFSET}. Offset may be expressed
in bytes, kilobytes with the @samp{k} suffix, or megabytes with the @samp{m}
suffix, etc.

@samp{--start-pos} takes precedence over @samp{--continue}. When
@samp{--start-pos} and @samp{--continue} are both specified, wget will emit a
warning then proceed as if @samp{--continue} was absent.

Server support for continued download is required, otherwise @samp{--start-pos}
cannot help. See @samp{-c} for details.
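
For example, a sketch that fetches only the part of a file beginning
at the 500 megabyte mark (the URL is a placeholder):

@example
wget --start-pos=500m http://www.example.com/big.iso
@end example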

@cindex progress indicator
@cindex dot style
@item --progress=@var{type}
Select the type of the progress indicator you wish to use. Legal
indicators are ``dot'' and ``bar''.

The ``bar'' indicator is used by default. It draws an @sc{ascii} progress
bar graphics (a.k.a. ``thermometer'' display) indicating the status of
retrieval. If the output is not a TTY, the ``dot'' indicator will be used by
default.

Use @samp{--progress=dot} to switch to the ``dot'' display. It traces
the retrieval by printing dots on the screen, each dot representing a
fixed amount of downloaded data.

When using the dotted retrieval, you may also set the @dfn{style} by
specifying the type as @samp{dot:@var{style}}. Different styles assign
different meaning to one dot. With the @code{default} style each dot
represents 1K, there are ten dots in a cluster and 50 dots in a line.
The @code{binary} style has a more ``computer''-like orientation---8K
dots, 16-dots clusters and 48 dots per line (which makes for 384K
lines). The @code{mega} style is suitable for downloading large
files---each dot represents 64K retrieved, there are eight dots in a
cluster, and 48 dots on each line (so each line contains 3M).
If @code{mega} is not enough then you can use the @code{giga}
style---each dot represents 1M retrieved, there are eight dots in a
cluster, and 32 dots on each line (so each line contains 32M).

Note that you can set the default style using the @code{progress}
command in @file{.wgetrc}. That setting may be overridden from the
command line. The exception is that, when the output is not a TTY, the
``dot'' progress will be favored over ``bar''. To force the bar output,
use @samp{--progress=bar:force}.
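
As a sketch, the @code{dot:mega} style described above could be
selected for a large download like this (the URL is a placeholder):

@example
wget --progress=dot:mega http://www.example.com/big.iso
@end example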

@item -N
@itemx --timestamping
Turn on time-stamping. @xref{Time-Stamping}, for details.

@item --no-use-server-timestamps
Don't set the local file's timestamp by the one on the server.

By default, when a file is downloaded, its timestamps are set to
match those from the remote file. This allows the use of
@samp{--timestamping} on subsequent invocations of wget. However, it
is sometimes useful to base the local file's timestamp on when it was
actually downloaded; for that purpose, the
@samp{--no-use-server-timestamps} option has been provided.

@cindex server response, print
@item -S
@itemx --server-response
Print the headers sent by @sc{http} servers and responses sent by
@sc{ftp} servers.

@cindex Wget as spider
@cindex spider
@item --spider
When invoked with this option, Wget will behave as a Web @dfn{spider},
which means that it will not download the pages, just check that they
are there. For example, you can use Wget to check your bookmarks:

@example
wget --spider --force-html -i bookmarks.html
@end example

This feature needs much more work for Wget to get close to the
functionality of real web spiders.

@cindex timeout
@item -T @var{seconds}
@itemx --timeout=@var{seconds}
Set the network timeout to @var{seconds} seconds. This is equivalent
to specifying @samp{--dns-timeout}, @samp{--connect-timeout}, and
@samp{--read-timeout}, all at the same time.

When interacting with the network, Wget can check for timeout and
abort the operation if it takes too long. This prevents anomalies
like hanging reads and infinite connects. The only timeout enabled by
default is a 900-second read timeout. Setting a timeout to 0 disables
it altogether. Unless you know what you are doing, it is best not to
change the default timeout settings.

All timeout-related options accept decimal values, as well as
subsecond values. For example, @samp{0.1} seconds is a legal (though
unwise) choice of timeout. Subsecond timeouts are useful for checking
server response times or for testing network latency.
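
For example, a sketch that sets all three timeouts to ten seconds at
once (the URL is a placeholder):

@example
wget --timeout=10 http://www.example.com/
@end example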

@cindex DNS timeout
@cindex timeout, DNS
@item --dns-timeout=@var{seconds}
Set the DNS lookup timeout to @var{seconds} seconds. DNS lookups that
don't complete within the specified time will fail. By default, there
is no timeout on DNS lookups, other than that implemented by system
libraries.

@cindex connect timeout
@cindex timeout, connect
@item --connect-timeout=@var{seconds}
Set the connect timeout to @var{seconds} seconds. TCP connections that
take longer to establish will be aborted. By default, there is no
connect timeout, other than that implemented by system libraries.

@cindex read timeout
@cindex timeout, read
@item --read-timeout=@var{seconds}
Set the read (and write) timeout to @var{seconds} seconds. The
``time'' of this timeout refers to @dfn{idle time}: if, at any point in
the download, no data is received for more than the specified number
of seconds, reading fails and the download is restarted. This option
does not directly affect the duration of the entire download.

Of course, the remote server may choose to terminate the connection
sooner than this option requires. The default read timeout is 900
seconds.

@cindex bandwidth, limit
@cindex rate, limit
@cindex limit bandwidth
@item --limit-rate=@var{amount}
Limit the download speed to @var{amount} bytes per second. Amount may
be expressed in bytes, kilobytes with the @samp{k} suffix, or megabytes
with the @samp{m} suffix. For example, @samp{--limit-rate=20k} will
limit the retrieval rate to 20KB/s. This is useful when, for whatever
reason, you don't want Wget to consume the entire available bandwidth.

This option allows the use of decimal numbers, usually in conjunction
with power suffixes; for example, @samp{--limit-rate=2.5k} is a legal
value.

Note that Wget implements the limiting by sleeping the appropriate
amount of time after a network read that took less time than specified
by the rate. Eventually this strategy causes the TCP transfer to slow
down to approximately the specified rate. However, it may take some
time for this balance to be achieved, so don't be surprised if limiting
the rate doesn't work well with very small files.
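
For example, a sketch that caps a large download at roughly 20KB/s
(the URL is a placeholder):

@example
wget --limit-rate=20k http://www.example.com/big.iso
@end example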

@cindex pause
@cindex wait
@item -w @var{seconds}
@itemx --wait=@var{seconds}
Wait the specified number of seconds between the retrievals. Use of
this option is recommended, as it lightens the server load by making the
requests less frequent. Instead of in seconds, the time can be
specified in minutes using the @code{m} suffix, in hours using @code{h}
suffix, or in days using @code{d} suffix.

Specifying a large value for this option is useful if the network or the
destination host is down, so that Wget can wait long enough to
reasonably expect the network error to be fixed before the retry. The
waiting interval specified by this function is influenced by
@code{--random-wait}, which see.

@cindex retries, waiting between
@cindex waiting between retries
@item --waitretry=@var{seconds}
If you don't want Wget to wait between @emph{every} retrieval, but only
between retries of failed downloads, you can use this option. Wget will
use @dfn{linear backoff}, waiting 1 second after the first failure on a
given file, then waiting 2 seconds after the second failure on that
file, up to the maximum number of @var{seconds} you specify.

By default, Wget will assume a value of 10 seconds.

@cindex wait, random
@cindex random wait
@item --random-wait
Some web sites may perform log analysis to identify retrieval programs
such as Wget by looking for statistically significant similarities in
the time between requests. This option causes the time between requests
to vary between 0.5 and 1.5 * @var{wait} seconds, where @var{wait} was
specified using the @samp{--wait} option, in order to mask Wget's
presence from such analysis.
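
For example, a sketch that combines a two-second base wait with
randomization during a recursive retrieval (the URL is a placeholder):

@example
wget --wait=2 --random-wait -r http://www.example.com/
@end example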

A 2001 article in a publication devoted to development on a popular
consumer platform provided code to perform this analysis on the fly.
Its author suggested blocking at the class C address level to ensure
automated retrieval programs were blocked despite changing DHCP-supplied
addresses.

The @samp{--random-wait} option was inspired by this ill-advised
recommendation to block many unrelated users from a web site due to the
actions of one.

@cindex proxy
@item --no-proxy
Don't use proxies, even if the appropriate @code{*_proxy} environment
variable is defined.

@c man end
@xref{Proxies}, for more information about the use of proxies with Wget.
@c man begin OPTIONS

@cindex quota
@item -Q @var{quota}
@itemx --quota=@var{quota}
Specify download quota for automatic retrievals. The value can be
specified in bytes (default), kilobytes (with @samp{k} suffix), or
megabytes (with @samp{m} suffix).

Note that quota will never affect downloading a single file. So if you
specify @samp{wget -Q10k ftp://wuarchive.wustl.edu/ls-lR.gz}, all of the
@file{ls-lR.gz} will be downloaded. The same goes even when several
@sc{url}s are specified on the command-line. However, quota is
respected when retrieving either recursively, or from an input file.
Thus you may safely type @samp{wget -Q2m -i sites}---download will be
aborted when the quota is exceeded.

Setting quota to 0 or to @samp{inf} unlimits the download quota.

@cindex DNS cache
@cindex caching of DNS lookups
@item --no-dns-cache
Turn off caching of DNS lookups. Normally, Wget remembers the IP
addresses it looked up from DNS so it doesn't have to repeatedly
contact the DNS server for the same (typically small) set of hosts it
retrieves from. This cache exists in memory only; a new Wget run will
contact DNS again.

However, it has been reported that in some situations it is not
desirable to cache host names, even for the duration of a
short-running application like Wget. With this option Wget issues a
new DNS lookup (more precisely, a new call to @code{gethostbyname} or
@code{getaddrinfo}) each time it makes a new connection. Please note
that this option will @emph{not} affect caching that might be
performed by the resolving library or by an external caching layer,
such as NSCD.

If you don't understand exactly what this option does, you probably
won't need it.

@cindex file names, restrict
@cindex Windows file names
@item --restrict-file-names=@var{modes}
Change which characters found in remote URLs must be escaped during
generation of local filenames. Characters that are @dfn{restricted}
by this option are escaped, i.e. replaced with @samp{%HH}, where
@samp{HH} is the hexadecimal number that corresponds to the restricted
character. This option may also be used to force all alphabetical
cases to be either lower- or uppercase.

By default, Wget escapes the characters that are not valid or safe as
part of file names on your operating system, as well as control
characters that are typically unprintable. This option is useful for
changing these defaults, perhaps because you are downloading to a
non-native partition, or because you want to disable escaping of the
control characters, or you want to further restrict characters to only
those in the @sc{ascii} range of values.

The @var{modes} are a comma-separated set of text values. The
acceptable values are @samp{unix}, @samp{windows}, @samp{nocontrol},
@samp{ascii}, @samp{lowercase}, and @samp{uppercase}. The values
@samp{unix} and @samp{windows} are mutually exclusive (one will
override the other), as are @samp{lowercase} and
@samp{uppercase}. Those last are special cases, as they do not change
the set of characters that would be escaped, but rather force local
file paths to be converted either to lower- or uppercase.

When ``unix'' is specified, Wget escapes the character @samp{/} and
the control characters in the ranges 0--31 and 128--159. This is the
default on Unix-like operating systems.

When ``windows'' is given, Wget escapes the characters @samp{\},
@samp{|}, @samp{/}, @samp{:}, @samp{?}, @samp{"}, @samp{*}, @samp{<},
@samp{>}, and the control characters in the ranges 0--31 and 128--159.
In addition to this, Wget in Windows mode uses @samp{+} instead of
@samp{:} to separate host and port in local file names, and uses
@samp{@@} instead of @samp{?} to separate the query portion of the file
name from the rest. Therefore, a URL that would be saved as
@samp{www.xemacs.org:4300/search.pl?input=blah} in Unix mode would be
saved as @samp{www.xemacs.org+4300/search.pl@@input=blah} in Windows
mode. This mode is the default on Windows.

If you specify @samp{nocontrol}, then the escaping of the control
characters is also switched off. This option may make sense
when you are downloading URLs whose names contain UTF-8 characters, on
a system which can save and display filenames in UTF-8 (some possible
byte values used in UTF-8 byte sequences fall in the range of values
designated by Wget as ``controls'').

The @samp{ascii} mode is used to specify that any bytes whose values
are outside the range of @sc{ascii} characters (that is, greater than
127) shall be escaped. This can be useful when saving filenames
whose encoding does not match the one used locally.
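
For example, a sketch that combines two compatible modes during a
recursive retrieval (the URL is a placeholder):

@example
wget --restrict-file-names=windows,lowercase -r http://www.example.com/
@end example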

@cindex IPv6
@item -4
@itemx --inet4-only
@itemx -6
@itemx --inet6-only
Force connecting to IPv4 or IPv6 addresses.  With @samp{--inet4-only}
or @samp{-4}, Wget will only connect to IPv4 hosts, ignoring AAAA
records in DNS, and refusing to connect to IPv6 addresses specified in
URLs.  Conversely, with @samp{--inet6-only} or @samp{-6}, Wget will
only connect to IPv6 hosts and ignore A records and IPv4 addresses.

Neither option should normally be needed.  By default, an IPv6-aware
Wget will use the address family specified by the host's DNS record.
If the DNS responds with both IPv4 and IPv6 addresses, Wget will try
them in sequence until it finds one it can connect to.  (Also see the
@code{--prefer-family} option described below.)

These options can be used to deliberately force the use of IPv4 or
IPv6 address families on dual family systems, usually to aid debugging
or to deal with broken network configuration.  Only one of
@samp{--inet6-only} and @samp{--inet4-only} may be specified at the
same time.  Neither option is available in Wget compiled without IPv6
support.

@item --prefer-family=none/IPv4/IPv6
When given a choice of several addresses, connect to the addresses
with the specified address family first.  The address order returned by
DNS is used without change by default.

This avoids spurious errors and connect attempts when accessing hosts
that resolve to both IPv6 and IPv4 addresses from IPv4 networks.  For
example, @samp{www.kame.net} resolves to
@samp{2001:200:0:8002:203:47ff:fea5:3085} and to
@samp{203.178.141.194}.  When the preferred family is @code{IPv4}, the
IPv4 address is used first; when the preferred family is @code{IPv6},
the IPv6 address is used first; if the specified value is @code{none},
the address order returned by DNS is used without change.

Unlike @samp{-4} and @samp{-6}, this option doesn't inhibit access to
any address family; it only changes the @emph{order} in which the
addresses are accessed.  Also note that the reordering performed by
this option is @dfn{stable}---it doesn't affect the order of addresses
within the same family.  That is, the relative order of all IPv4
addresses and of all IPv6 addresses remains intact in all cases.
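
For example, on a dual-stack machine you might prefer IPv4 connections
without disabling IPv6 entirely (the host shown is the one used above;
any URL works the same way):

@example
wget --prefer-family=IPv4 http://www.kame.net/
@end example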

@item --retry-connrefused
Consider ``connection refused'' a transient error and try again.
Normally Wget gives up on a URL when it is unable to connect to the
site because failure to connect is taken as a sign that the server is
not running at all and that retries would not help.  This option is
for mirroring unreliable sites whose servers tend to disappear for
short periods of time.

@cindex user
@cindex password
@cindex authentication
@item --user=@var{user}
@itemx --password=@var{password}
Specify the username @var{user} and password @var{password} for both
@sc{ftp} and @sc{http} file retrieval.  These parameters can be overridden
using the @samp{--ftp-user} and @samp{--ftp-password} options for
@sc{ftp} connections and the @samp{--http-user} and @samp{--http-password}
options for @sc{http} connections.

@item --ask-password
Prompt for a password for each connection established. Cannot be specified
when @samp{--password} is being used, because they are mutually exclusive.
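
A typical invocation (the username and URL are placeholders) supplies
the username on the command line and lets Wget prompt for the
password, so that the password never appears in the process list:

@example
wget --user=jsmith --ask-password https://example.com/protected/report.pdf
@end example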

@cindex iri support
@cindex idn support
@item --no-iri

Turn off internationalized URI (IRI) support. Use @samp{--iri} to
turn it on. IRI support is activated by default.

You can set the default state of IRI support using the @code{iri}
command in @file{.wgetrc}. That setting may be overridden from the
command line.

@cindex local encoding
@item --local-encoding=@var{encoding}

Force Wget to use @var{encoding} as the default system encoding. That affects
how Wget converts URLs specified as arguments from locale to @sc{utf-8} for
IRI support.

Wget uses the function @code{nl_langinfo()} and then the @code{CHARSET}
environment variable to get the locale. If it fails, @sc{ascii} is used.

You can set the default local encoding using the @code{local_encoding}
command in @file{.wgetrc}. That setting may be overridden from the
command line.

@cindex remote encoding
@item --remote-encoding=@var{encoding}

Force Wget to use @var{encoding} as the default remote server encoding.
That affects how Wget converts URIs found in files from remote encoding
to @sc{utf-8} during a recursive fetch. This option is only useful for
IRI support, for the interpretation of non-@sc{ascii} characters.

For HTTP, the remote encoding can be found in the HTTP @code{Content-Type}
header and in the HTML @code{Content-Type http-equiv} meta tag.

You can set the default encoding using the @code{remoteencoding}
command in @file{.wgetrc}. That setting may be overridden from the
command line.
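
For instance, to hand Wget a command-line URL written in ISO-8859-2
while treating remote pages as ISO-8859-1 during a recursive,
IRI-aware fetch (the host and the encodings are only illustrative):

@example
wget --iri --local-encoding=iso-8859-2 \
     --remote-encoding=iso-8859-1 -r http://example.hr/
@end example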

@cindex unlink
@item --unlink

Force Wget to unlink the file instead of clobbering the existing file.
This option is useful when downloading to a directory that contains
hardlinks.

@end table

@node Directory Options, HTTP Options, Download Options, Invoking
@section Directory Options

@table @samp
@item -nd
@itemx --no-directories
Do not create a hierarchy of directories when retrieving recursively.
With this option turned on, all files will get saved to the current
directory, without clobbering (if a name shows up more than once, the
filenames will get extensions @samp{.n}).

@item -x
@itemx --force-directories
The opposite of @samp{-nd}---create a hierarchy of directories, even if
one would not have been created otherwise.  E.g. @samp{wget -x
http://fly.srk.fer.hr/robots.txt} will save the downloaded file to
@file{fly.srk.fer.hr/robots.txt}.

@item -nH
@itemx --no-host-directories
Disable generation of host-prefixed directories.  By default, invoking
Wget with @samp{-r http://fly.srk.fer.hr/} will create a structure of
directories beginning with @file{fly.srk.fer.hr/}.  This option disables
such behavior.

@item --protocol-directories
Use the protocol name as a directory component of local file names.  For
example, with this option, @samp{wget -r http://@var{host}} will save to
@samp{http/@var{host}/...} rather than just to @samp{@var{host}/...}.

@cindex cut directories
@item --cut-dirs=@var{number}
Ignore @var{number} directory components.  This is useful for getting a
fine-grained control over the directory where recursive retrieval will
be saved.

Take, for example, the directory at
@samp{ftp://ftp.xemacs.org/pub/xemacs/}.  If you retrieve it with
@samp{-r}, it will be saved locally under
@file{ftp.xemacs.org/pub/xemacs/}.  While the @samp{-nH} option can
remove the @file{ftp.xemacs.org/} part, you are still stuck with
@file{pub/xemacs}.  This is where @samp{--cut-dirs} comes in handy; it
makes Wget not ``see'' @var{number} remote directory components.  Here
are several examples of how the @samp{--cut-dirs} option works.

@example
@group
No options        -> ftp.xemacs.org/pub/xemacs/
-nH               -> pub/xemacs/
-nH --cut-dirs=1  -> xemacs/
-nH --cut-dirs=2  -> .

--cut-dirs=1      -> ftp.xemacs.org/xemacs/
...
@end group
@end example

If you just want to get rid of the directory structure, this option is
similar to a combination of @samp{-nd} and @samp{-P}.  However, unlike
@samp{-nd}, @samp{--cut-dirs} does not flatten subdirectories---for
instance, with @samp{-nH --cut-dirs=1}, a @file{beta/} subdirectory will
be placed in @file{xemacs/beta}, as one would expect.

@cindex directory prefix
@item -P @var{prefix}
@itemx --directory-prefix=@var{prefix}
Set directory prefix to @var{prefix}.  The @dfn{directory prefix} is the
directory where all other files and subdirectories will be saved to,
i.e. the top of the retrieval tree.  The default is @samp{.} (the
current directory).
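
For example, the following (with a placeholder URL) saves the whole
retrieval tree under @file{/tmp/mirror} instead of the current
directory:

@example
wget -r -P /tmp/mirror http://example.com/docs/
@end example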
@end table

@node HTTP Options, HTTPS (SSL/TLS) Options, Directory Options, Invoking
@section HTTP Options

@table @samp
@cindex default page name
@cindex index.html
@item --default-page=@var{name}
Use @var{name} as the default file name when it isn't known (i.e., for
URLs that end in a slash), instead of @file{index.html}.

@cindex .html extension
@cindex .css extension
@item -E
@itemx --adjust-extension
If a file of type @samp{application/xhtml+xml} or @samp{text/html} is
downloaded and the URL does not end with the regexp
@samp{\.[Hh][Tt][Mm][Ll]?}, this option will cause the suffix @samp{.html}
to be appended to the local filename.  This is useful, for instance, when
you're mirroring a remote site that uses @samp{.asp} pages, but you want
the mirrored pages to be viewable on your stock Apache server.  Another
good use for this is when you're downloading CGI-generated materials.  A URL
like @samp{http://site.com/article.cgi?25} will be saved as
@file{article.cgi?25.html}.

Note that filenames changed in this way will be re-downloaded every time
you re-mirror a site, because Wget can't tell that the local
@file{@var{X}.html} file corresponds to remote URL @samp{@var{X}} (since
it doesn't yet know that the URL produces output of type
@samp{text/html} or @samp{application/xhtml+xml}).

As of version 1.12, Wget will also ensure that any downloaded files of
type @samp{text/css} end in the suffix @samp{.css}, and the option was
renamed from @samp{--html-extension}, to better reflect its new
behavior. The old option name is still acceptable, but should now be
considered deprecated.

At some point in the future, this option may well be expanded to
include suffixes for other types of content, including content types
that are not parsed by Wget.
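
As a sketch (the site is hypothetical), the following mirrors a set of
@samp{.asp} pages and stores them with an added @samp{.html} suffix so
that a local web server will serve them with the expected type:

@example
wget -r -E http://example.com/catalog/index.asp
@end example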

@cindex http user
@cindex http password
@cindex authentication
@item --http-user=@var{user}
@itemx --http-password=@var{password}
Specify the username @var{user} and password @var{password} on an
@sc{http} server.  According to the type of the challenge, Wget will
encode them using either the @code{basic} (insecure),
the @code{digest}, or the Windows @code{NTLM} authentication scheme.

Another way to specify username and password is in the @sc{url} itself
(@pxref{URL Format}).  Either method reveals your password to anyone who
bothers to run @code{ps}.  To prevent the passwords from being seen,
store them in @file{.wgetrc} or @file{.netrc}, and make sure to protect
those files from other users with @code{chmod}.  If the passwords are
really important, do not leave them lying in those files either---edit
the files and delete them after Wget has started the download.

@iftex
For more information about security issues with Wget, @xref{Security
Considerations}.
@end iftex

@cindex Keep-Alive, turning off
@cindex Persistent Connections, disabling
@item --no-http-keep-alive
Turn off the ``keep-alive'' feature for HTTP downloads.  Normally, Wget
asks the server to keep the connection open so that, when you download
more than one document from the same server, they get transferred over
the same TCP connection.  This saves time and at the same time reduces
the load on the server.

This option is useful when, for some reason, persistent (keep-alive)
connections don't work for you, for example due to a server bug or due
to the inability of server-side scripts to cope with the connections.

@cindex proxy
@cindex cache
@item --no-cache
Disable server-side cache.  In this case, Wget will send the remote
server an appropriate directive (@samp{Pragma: no-cache}) to get the
file from the remote service, rather than returning the cached version.
This is especially useful for retrieving and flushing out-of-date
documents on proxy servers.

Caching is allowed by default.
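
For example, to re-fetch a document through a caching proxy while
asking intermediate caches for a fresh copy (the URL is a placeholder):

@example
wget --no-cache http://example.com/news/latest.html
@end example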

@cindex cookies
@item --no-cookies
Disable the use of cookies.  Cookies are a mechanism for maintaining
server-side state.  The server sends the client a cookie using the
@code{Set-Cookie} header, and the client responds with the same cookie
upon further requests.  Since cookies allow the server owners to keep
track of visitors and for sites to exchange this information, some
consider them a breach of privacy.  The default is to use cookies;
however, @emph{storing} cookies is not on by default.

@cindex loading cookies
@cindex cookies, loading
@item --load-cookies @var{file}
Load cookies from @var{file} before the first HTTP retrieval.
@var{file} is a textual file in the format originally used by Netscape's
@file{cookies.txt} file.

You will typically use this option when mirroring sites that require
that you be logged in to access some or all of their content.  The login
process typically works by the web server issuing an @sc{http} cookie
upon receiving and verifying your credentials.  The cookie is then
resent by the browser when accessing that part of the site, and so
proves your identity.

Mirroring such a site requires Wget to send the same cookies your
browser sends when communicating with the site.  This is achieved by
@samp{--load-cookies}---simply point Wget to the location of the
@file{cookies.txt} file, and it will send the same cookies your browser
would send in the same situation.  Different browsers keep textual
cookie files in different locations:

@table @asis
@item Netscape 4.x.
The cookies are in @file{~/.netscape/cookies.txt}.

@item Mozilla and Netscape 6.x.
Mozilla's cookie file is also named @file{cookies.txt}, located
somewhere under @file{~/.mozilla}, in the directory of your profile.
The full path usually ends up looking somewhat like
@file{~/.mozilla/default/@var{some-weird-string}/cookies.txt}.

@item Internet Explorer.
You can produce a cookie file Wget can use by using the File menu,
Import and Export, Export Cookies.  This has been tested with Internet
Explorer 5; it is not guaranteed to work with earlier versions.

@item Other browsers.
If you are using a different browser to create your cookies,
@samp{--load-cookies} will only work if you can locate or produce a
cookie file in the Netscape format that Wget expects.
@end table

If you cannot use @samp{--load-cookies}, there might still be an
alternative.  If your browser supports a ``cookie manager'', you can use
it to view the cookies used when accessing the site you're mirroring.
Write down the name and value of the cookie, and manually instruct Wget
to send those cookies, bypassing the ``official'' cookie support:

@example
wget --no-cookies --header "Cookie: @var{name}=@var{value}"
@end example

@cindex saving cookies
@cindex cookies, saving
@item --save-cookies @var{file}
Save cookies to @var{file} before exiting.  This will not save cookies
that have expired or that have no expiry time (so-called ``session
cookies''), but also see @samp{--keep-session-cookies}.

@cindex cookies, session
@cindex session cookies
@item --keep-session-cookies
When specified, causes @samp{--save-cookies} to also save session
cookies.  Session cookies are normally not saved because they are
meant to be kept in memory and forgotten when you exit the browser.
Saving them is useful on sites that require you to log in or to visit
the home page before you can access some pages.  With this option,
multiple Wget runs are considered a single browser session as far as
the site is concerned.

Since the cookie file format does not normally carry session cookies,
Wget marks them with an expiry timestamp of 0.  Wget's
@samp{--load-cookies} recognizes those as session cookies, but it might
confuse other browsers.  Also note that cookies so loaded will be
treated as other session cookies, which means that if you want
@samp{--save-cookies} to preserve them again, you must use
@samp{--keep-session-cookies} again.

@cindex Content-Length, ignore
@cindex ignore length
@item --ignore-length
Unfortunately, some @sc{http} servers (@sc{cgi} programs, to be more
precise) send out bogus @code{Content-Length} headers, which makes Wget
go wild, as it thinks not all the document was retrieved.  You can spot
this syndrome if Wget retries getting the same document again and again,
each time claiming that the (otherwise normal) connection has closed on
the very same byte.

With this option, Wget will ignore the @code{Content-Length} header---as
if it never existed.

@cindex header, add
@item --header=@var{header-line}
Send @var{header-line} along with the rest of the headers in each
@sc{http} request.  The supplied header is sent as-is, which means it
must contain name and value separated by colon, and must not contain
newlines.

You may define more than one additional header by specifying
@samp{--header} more than once.

@example
@group
wget --header='Accept-Charset: iso-8859-2' \
     --header='Accept-Language: hr'        \
       http://fly.srk.fer.hr/
@end group
@end example

Specification of an empty string as the header value will clear all
previous user-defined headers.

As of Wget 1.10, this option can be used to override headers otherwise
generated automatically.  This example instructs Wget to connect to
localhost, but to specify @samp{foo.bar} in the @code{Host} header:

@example
wget --header="Host: foo.bar" http://localhost/
@end example

In versions of Wget prior to 1.10 such use of @samp{--header} caused
sending of duplicate headers.

@cindex redirect
@item --max-redirect=@var{number}
Specifies the maximum number of redirections to follow for a resource.
The default is 20, which is usually far more than necessary. However, on
those occasions where you want to allow more (or fewer), this is the
option to use.
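
For instance, to give up after a single redirection when probing a
link-shortening service (the URL is a placeholder):

@example
wget --max-redirect=1 http://example.com/short/abc123
@end example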

@cindex proxy user
@cindex proxy password
@cindex proxy authentication
@item --proxy-user=@var{user}
@itemx --proxy-password=@var{password}
Specify the username @var{user} and password @var{password} for
authentication on a proxy server.  Wget will encode them using the
@code{basic} authentication scheme.

Security considerations similar to those with @samp{--http-password}
pertain here as well.

@cindex http referer
@cindex referer, http
@item --referer=@var{url}
Include `Referer: @var{url}' header in HTTP request.  Useful for
retrieving documents with server-side processing that assume they are
always being retrieved by interactive web browsers and only come out
properly when Referer is set to one of the pages that point to them.

@cindex server response, save
@item --save-headers
Save the headers sent by the @sc{http} server to the file, preceding the
actual contents, with an empty line as the separator.

@cindex user-agent
@item -U @var{agent-string}
@itemx --user-agent=@var{agent-string}
Identify as @var{agent-string} to the @sc{http} server.

The @sc{http} protocol allows the clients to identify themselves using a
@code{User-Agent} header field.  This enables distinguishing the
@sc{www} software, usually for statistical purposes or for tracing of
protocol violations.  Wget normally identifies as
@samp{Wget/@var{version}}, @var{version} being the current version
number of Wget.

However, some sites have been known to impose the policy of tailoring
the output according to the @code{User-Agent}-supplied information.
While this is not such a bad idea in theory, it has been abused by
servers denying information to clients other than (historically)
Netscape or, more frequently, Microsoft Internet Explorer.  This
option allows you to change the @code{User-Agent} line issued by Wget.
Use of this option is discouraged, unless you really know what you are
doing.

Specifying an empty user agent with @samp{--user-agent=""} instructs Wget
not to send the @code{User-Agent} header in @sc{http} requests.
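
For example (the agent string and URL are illustrative only):

@example
wget --user-agent="Mozilla/5.0 (compatible; MyMirror/1.0)" \
     http://example.com/
@end example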

@cindex POST
@item --post-data=@var{string}
@itemx --post-file=@var{file}
Use POST as the method for all HTTP requests and send the specified
data in the request body.  @samp{--post-data} sends @var{string} as
data, whereas @samp{--post-file} sends the contents of @var{file}.
Other than that, they work in exactly the same way. In particular,
they @emph{both} expect content of the form @code{key1=value1&key2=value2},
with percent-encoding for special characters; the only difference is
that one expects its content as a command-line parameter and the other
accepts its content from a file. In particular, @samp{--post-file} is
@emph{not} for transmitting files as form attachments: those must
appear as @code{key=value} data (with appropriate percent-coding) just
like everything else. Wget does not currently support
@code{multipart/form-data} for transmitting POST data; only
@code{application/x-www-form-urlencoded}. Only one of
@samp{--post-data} and @samp{--post-file} should be specified.

Please note that Wget does not require the content to be of the form
@code{key1=value1&key2=value2}, and neither does it test for it. Wget will
simply transmit whatever data is provided to it. Most servers however expect
the POST data to be in the above format when processing HTML Forms.

Please be aware that Wget needs to know the size of the POST data in
advance.  Therefore the argument to @code{--post-file} must be a regular
file; specifying a FIFO or something like @file{/dev/stdin} won't work.
It's not quite clear how to work around this limitation inherent in
HTTP/1.0.  Although HTTP/1.1 introduces @dfn{chunked} transfer that
doesn't require knowing the request length in advance, a client can't
use chunked unless it knows it's talking to an HTTP/1.1 server.  And it
can't know that until it receives a response, which in turn requires the
request to have been completed -- a chicken-and-egg problem.

Note: As of version 1.15, if Wget is redirected after the POST request is
completed, its behaviour will depend on the response code returned by the
server.  In case of a 301 Moved Permanently, 302 Moved Temporarily or
307 Temporary Redirect, Wget will, in accordance with RFC2616, continue
to send a POST request.
In case a server wants the client to change the request method upon
redirection, it should send a 303 See Other response code.

This example shows how to log in to a server using POST and then proceed to
download the desired pages, presumably only accessible to authorized
users:

@example
@group
# @r{Log in to the server.  This can be done only once.}
wget --save-cookies cookies.txt \
     --post-data 'user=foo&password=bar' \
     http://server.com/auth.php

# @r{Now grab the page or pages we care about.}
wget --load-cookies cookies.txt \
     -p http://server.com/interesting/article.php
@end group
@end example

If the server is using session cookies to track user authentication,
the above will not work because @samp{--save-cookies} will not save
them (and neither will browsers) and the @file{cookies.txt} file will
be empty.  In that case use @samp{--keep-session-cookies} along with
@samp{--save-cookies} to force saving of session cookies.

@cindex Other HTTP Methods
@item --method=@var{HTTP-Method}
For the purpose of RESTful scripting, Wget allows sending of other HTTP Methods
without the need to explicitly set them using @samp{--header=Header-Line}.
Wget will use whatever string is passed to it after @samp{--method} as the HTTP
Method to the server.

@item --body-data=@var{Data-String}
@itemx --body-file=@var{Data-File}
Must be set when additional data needs to be sent to the server along with the
Method specified using @samp{--method}.  @samp{--body-data} sends @var{string} as
data, whereas @samp{--body-file} sends the contents of @var{file}.  Other than that,
they work in exactly the same way.

Currently, @samp{--body-file} is @emph{not} for transmitting files as a whole.
Wget does not currently support @code{multipart/form-data} for transmitting data;
only @code{application/x-www-form-urlencoded}. In the future, this may be changed
so that Wget sends the @samp{--body-file} as a complete file instead of sending its
contents to the server. Please be aware that Wget needs to know the contents of
the body data in advance, and hence the argument to @samp{--body-file} should be a
regular file. See @samp{--post-file} for a more detailed explanation.
Only one of @samp{--body-data} and @samp{--body-file} should be specified.

If Wget is redirected after the request is completed, Wget will
suspend the current method and send a GET request until the redirection
is completed.  This is true for all redirection response codes except
307 Temporary Redirect which is used to explicitly specify that the
request method should @emph{not} change.  Another exception is when
the method is set to @code{POST}, in which case the redirection rules
specified under @samp{--post-data} are followed.
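
As a sketch against a hypothetical REST endpoint, the following sends
a PUT request with a small urlencoded body, and then a DELETE request
with no body at all:

@example
wget --method=PUT --body-data='name=foo&value=bar' \
     http://example.com/api/items/1
wget --method=DELETE http://example.com/api/items/1
@end example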

@cindex Content-Disposition
@item --content-disposition

If this is set to on, experimental (not fully-functional) support for
@code{Content-Disposition} headers is enabled. This can currently result in
extra round-trips to the server for a @code{HEAD} request, and is known
to suffer from a few bugs, which is why it is not currently enabled by default.

This option is useful for some file-downloading CGI programs that use
@code{Content-Disposition} headers to describe what the name of a
downloaded file should be.

@cindex Content On Error
@item --content-on-error

If this is set to on, wget will not skip the content when the server responds
with an HTTP status code that indicates an error.

@cindex Trust server names
@item --trust-server-names

If this is set to on, on a redirect the last component of the
redirection URL will be used as the local file name.  By default the
last component of the original URL is used.

@cindex authentication
@item --auth-no-challenge

If this option is given, Wget will send Basic HTTP authentication
information (plaintext username and password) for all requests, just
like Wget 1.10.2 and prior did by default.

Use of this option is not recommended, and is intended only to support
some few obscure servers, which never send HTTP authentication
challenges, but accept unsolicited auth info, say, in addition to
form-based authentication.

@end table

@node HTTPS (SSL/TLS) Options, FTP Options, HTTP Options, Invoking
@section HTTPS (SSL/TLS) Options

@cindex SSL
To support encrypted HTTP (HTTPS) downloads, Wget must be compiled
with an external SSL library, currently OpenSSL.  If Wget is compiled
without SSL support, none of these options are available.

@table @samp
@cindex SSL protocol, choose
@item --secure-protocol=@var{protocol}
Choose the secure protocol to be used.  Legal values are @samp{auto},
@samp{SSLv2}, @samp{SSLv3}, @samp{TLSv1} and @samp{PFS}.  If @samp{auto}
is used, the SSL library is given the liberty of choosing the appropriate
protocol automatically, which is achieved by sending an SSLv2 greeting
and announcing support for SSLv3 and TLSv1.  This is the default.

Specifying @samp{SSLv2}, @samp{SSLv3}, or @samp{TLSv1} forces the use
of the corresponding protocol.  This is useful when talking to old and
buggy SSL server implementations that make it hard for the underlying
SSL library to choose the correct protocol version.  Fortunately, such
servers are quite rare.

Specifying @samp{PFS} enforces the use of the so-called Perfect Forward
Secrecy cipher suites. In short, PFS adds security by creating a one-time
key for each SSL connection. It has a bit more CPU impact on client and server.
We use ciphers that are known to be secure (e.g. no MD4) and the TLS protocol.
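
For example, to insist on TLSv1 when a broken server mishandles the
automatic negotiation, or to request the PFS cipher suites (the host
names are placeholders):

@example
wget --secure-protocol=TLSv1 https://legacy.example.com/file.tar.gz
wget --secure-protocol=PFS https://www.example.com/file.tar.gz
@end example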

@item --https-only
When in recursive mode, only HTTPS links are followed.

@cindex SSL certificate, check
@item --no-check-certificate
Don't check the server certificate against the available certificate
authorities.  Also don't require the URL host name to match the common
name presented by the certificate.

As of Wget 1.10, the default is to verify the server's certificate
against the recognized certificate authorities, breaking the SSL
handshake and aborting the download if the verification fails.
Although this provides more secure downloads, it does break
interoperability with some sites that worked with previous Wget
versions, particularly those using self-signed, expired, or otherwise
invalid certificates.  This option forces an ``insecure'' mode of
operation that turns the certificate verification errors into warnings
and allows you to proceed.

If you encounter ``certificate verification'' errors or ones saying
that ``common name doesn't match requested host name'', you can use
this option to bypass the verification and proceed with the download.
@emph{Only use this option if you are otherwise convinced of the
site's authenticity, or if you really don't care about the validity of
its certificate.}  It is almost always a bad idea not to check the
certificates when transmitting confidential or important data.

@cindex SSL certificate
@item --certificate=@var{file}
Use the client certificate stored in @var{file}.  This is needed for
servers that are configured to require certificates from the clients
that connect to them.  Normally a certificate is not required and this
switch is optional.

@cindex SSL certificate type, specify
@item --certificate-type=@var{type}
Specify the type of the client certificate.  Legal values are
@samp{PEM} (assumed by default) and @samp{DER}, also known as
@samp{ASN1}.

@item --private-key=@var{file}
Read the private key from @var{file}.  This allows you to provide the
private key in a file separate from the certificate.
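
A typical client-authenticated download therefore looks like this (the
file names and URL are illustrative):

@example
wget --certificate=client.pem --private-key=client.key \
     https://secure.example.com/members/data.zip
@end example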

@item --private-key-type=@var{type}
Specify the type of the private key.  Accepted values are @samp{PEM}
(the default) and @samp{DER}.

@item --ca-certificate=@var{file}
Use @var{file} as the file with the bundle of certificate authorities
(``CA'') to verify the peers.  The certificates must be in PEM format.

Without this option Wget looks for CA certificates at the
system-specified locations, chosen at OpenSSL installation time.

@cindex SSL certificate authority
@item --ca-directory=@var{directory}
Specifies the directory containing CA certificates in PEM format.  Each
file contains one CA certificate, and the file name is based on a hash
value derived from the certificate.  This is achieved by processing a
certificate directory with the @code{c_rehash} utility supplied with
OpenSSL.  Using @samp{--ca-directory} is more efficient than
@samp{--ca-certificate} when many certificates are installed because
it allows Wget to fetch certificates on demand.

Without this option Wget looks for CA certificates at the
system-specified locations, chosen at OpenSSL installation time.

@cindex entropy, specifying source of
@cindex randomness, specifying source of
@item --random-file=@var{file}
Use @var{file} as the source of random data for seeding the
pseudo-random number generator on systems without @file{/dev/random}.

On such systems the SSL library needs an external source of randomness
to initialize.  Randomness may be provided by EGD (see
@samp{--egd-file} below) or read from an external source specified by
the user.  If this option is not specified, Wget looks for random data
in @code{$RANDFILE} or, if that is unset, in @file{$HOME/.rnd}.  If
none of those are available, it is likely that SSL encryption will not
be usable.

If you're getting the ``Could not seed OpenSSL PRNG; disabling SSL.''
error, you should provide random data using some of the methods
described above.

@cindex EGD
@item --egd-file=@var{file}
Use @var{file} as the EGD socket.  EGD stands for @dfn{Entropy
Gathering Daemon}, a user-space program that collects data from
various unpredictable system sources and makes it available to other
programs that might need it.  Encryption software, such as the SSL
library, needs sources of non-repeating randomness to seed the random
number generator used to produce cryptographically strong keys.

OpenSSL allows the user to specify his own source of entropy using the
@code{RAND_FILE} environment variable.  If this variable is unset, or
if the specified file does not produce enough randomness, OpenSSL will
read random data from the EGD socket specified using this option.

If this option is not specified (and the equivalent startup command is
not used), EGD is never contacted.  EGD is not needed on modern Unix
systems that support @file{/dev/random}.
@end table

@cindex WARC
@table @samp
@item --warc-file=@var{file}
Use @var{file} as the destination WARC file.

@item --warc-header=@var{string}
Use @var{string} as the warcinfo record.

@item --warc-max-size=@var{size}
Set the maximum size of the WARC files to @var{size}.

@item --warc-cdx
Write CDX index files.

@item --warc-dedup=@var{file}
Do not store records listed in this CDX file.

@item --no-warc-compression
Do not compress WARC files with GZIP.

@item --no-warc-digests
Do not calculate SHA1 digests.

@item --no-warc-keep-log
Do not store the log file in a WARC record.

@item --warc-tempdir=@var{dir}
Specify the location for temporary files created by the WARC writer.
@end table
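
For example, a small archiving crawl that writes compressed WARC
records, splits them at roughly 1 GB, and produces CDX indexes might
look like this (the URL and the file name prefix are placeholders):

@example
wget -r -l 1 --warc-file=example --warc-cdx \
     --warc-max-size=1G http://example.com/
@end example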

@node FTP Options, Recursive Retrieval Options, HTTPS (SSL/TLS) Options, Invoking
@section FTP Options

@table @samp
@cindex ftp user
@cindex ftp password
@cindex ftp authentication
@item --ftp-user=@var{user}
@itemx --ftp-password=@var{password}
Specify the username @var{user} and password @var{password} on an
@sc{ftp} server.  Without this, or the corresponding startup option,
the password defaults to @samp{-wget@@}, normally used for anonymous
FTP.

Another way to specify username and password is in the @sc{url} itself
(@pxref{URL Format}).  Either method reveals your password to anyone who
bothers to run @code{ps}.  To prevent the passwords from being seen,
store them in @file{.wgetrc} or @file{.netrc}, and make sure to protect
those files from other users with @code{chmod}.  If the passwords are
really important, do not leave them lying in those files either---edit
the files and delete them after Wget has started the download.

@iftex
For more information about security issues with Wget, @xref{Security
Considerations}.
@end iftex
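
For example (the account details and host are placeholders):

@example
wget --ftp-user=anonymous --ftp-password=someone@@example.com \
     ftp://ftp.example.com/pub/README
@end example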

@cindex .listing files, removing
@item --no-remove-listing
Don't remove the temporary @file{.listing} files generated by @sc{ftp}
retrievals.  Normally, these files contain the raw directory listings
received from @sc{ftp} servers.  Not removing them can be useful for
debugging purposes, or when you want to be able to easily check on the
contents of remote server directories (e.g. to verify that a mirror
you're running is complete).

Note that even though Wget writes to a known filename for this file,
this is not a security hole in the scenario of a user making
@file{.listing} a symbolic link to @file{/etc/passwd} or something and
asking @code{root} to run Wget in his or her directory.  Depending on
the options used, either Wget will refuse to write to @file{.listing},
making the globbing/recursion/time-stamping operation fail, or the
symbolic link will be deleted and replaced with the actual
@file{.listing} file, or the listing will be written to a
@file{.listing.@var{number}} file.

Even though this situation isn't a problem, @code{root} should
never run Wget in a non-trusted user's directory.  A user could do
something as simple as linking @file{index.html} to @file{/etc/passwd}
and asking @code{root} to run Wget with @samp{-N} or @samp{-r} so the file
will be overwritten.

@cindex globbing, toggle
@item --no-glob
Turn off @sc{ftp} globbing.  Globbing refers to the use of shell-like
special characters (@dfn{wildcards}), like @samp{*}, @samp{?}, @samp{[}
and @samp{]} to retrieve more than one file from the same directory at
once, like:

@example
wget ftp://gnjilux.srk.fer.hr/*.msg
@end example

By default, globbing will be turned on if the @sc{url} contains a
globbing character.  This option may be used to turn globbing on or off
permanently.

You may have to quote the @sc{url} to protect it from being expanded by
your shell.  Globbing makes Wget look for a directory listing, which is
system-specific.  This is why it currently works only with Unix @sc{ftp}
servers (and the ones emulating Unix @code{ls} output).

@cindex passive ftp
@item --no-passive-ftp
Disable the use of the @dfn{passive} FTP transfer mode.  Passive FTP
mandates that the client connect to the server to establish the data
connection rather than the other way around.

If the machine is connected to the Internet directly, both passive and
active FTP should work equally well.  Behind most firewall and NAT
configurations passive FTP has a better chance of working.  However,
in some rare firewall configurations, active FTP actually works when
passive FTP doesn't.  If you suspect this to be the case, use this
option, or set @code{passive_ftp=off} in your init file.

@cindex file permissions
@item --preserve-permissions
Preserve remote file permissions instead of permissions set by umask.

@cindex symbolic links, retrieving
@item --retr-symlinks
Usually, when retrieving @sc{ftp} directories recursively and a symbolic
link is encountered, the linked-to file is not downloaded.  Instead, a
matching symbolic link is created on the local filesystem.  The
pointed-to file will not be downloaded unless this recursive retrieval
would have encountered it separately and downloaded it anyway.

When @samp{--retr-symlinks} is specified, however, symbolic links are
traversed and the pointed-to files are retrieved.  At this time, this
option does not cause Wget to traverse symlinks to directories and
recurse through them, but in the future it should be enhanced to do
this.

Note that when retrieving a file (not a directory) because it was
specified on the command-line, rather than because it was recursed to,
this option has no effect.  Symbolic links are always traversed in this
case.
@end table

@node Recursive Retrieval Options, Recursive Accept/Reject Options, FTP Options, Invoking
@section Recursive Retrieval Options

@table @samp
@item -r
@itemx --recursive
Turn on recursive retrieving.  @xref{Recursive Download}, for more
details.  The default maximum depth is 5.

@item -l @var{depth}
@itemx --level=@var{depth}
Specify recursion maximum depth level @var{depth} (@pxref{Recursive
Download}).

@cindex proxy filling
@cindex delete after retrieval
@cindex filling proxy cache
@item --delete-after
This option tells Wget to delete every single file it downloads,
@emph{after} having done so.  It is useful for pre-fetching popular
pages through a proxy, e.g.:

@example
wget -r -nd --delete-after http://whatever.com/~popular/page/
@end example

The @samp{-r} option is to retrieve recursively, and @samp{-nd} to not
create directories.

Note that @samp{--delete-after} deletes files on the local machine.  It
does not issue the @samp{DELE} command to remote FTP sites, for
instance.  Also note that when @samp{--delete-after} is specified,
@samp{--convert-links} is ignored, so @samp{.orig} files are simply not
created in the first place.

@cindex conversion of links
@cindex link conversion
@item -k
@itemx --convert-links
After the download is complete, convert the links in the document to
make them suitable for local viewing.  This affects not only the visible
hyperlinks, but any part of the document that links to external content,
such as embedded images, links to style sheets, hyperlinks to non-@sc{html}
content, etc.

Each link will be changed in one of two ways:

@itemize @bullet
@item
The links to files that have been downloaded by Wget will be changed to
refer to the file they point to as a relative link.

Example: if the downloaded file @file{/foo/doc.html} links to
@file{/bar/img.gif}, also downloaded, then the link in @file{doc.html}
will be modified to point to @samp{../bar/img.gif}.  This kind of
transformation works reliably for arbitrary combinations of directories.

@item
The links to files that have not been downloaded by Wget will be changed
to include host name and absolute path of the location they point to.

Example: if the downloaded file @file{/foo/doc.html} links to
@file{/bar/img.gif} (or to @file{../bar/img.gif}), then the link in
@file{doc.html} will be modified to point to
@file{http://@var{hostname}/bar/img.gif}.
@end itemize

Because of this, local browsing works reliably: if a linked file was
downloaded, the link will refer to its local name; if it was not
downloaded, the link will refer to its full Internet address rather than
presenting a broken link.  The fact that the former links are converted
to relative links ensures that you can move the downloaded hierarchy to
another directory.

Note that only at the end of the download can Wget know which links have
been downloaded.  Because of that, the work done by @samp{-k} will be
performed at the end of all the downloads.

@cindex backing up converted files
@item -K
@itemx --backup-converted
When converting a file, back up the original version with a @samp{.orig}
suffix.  Affects the behavior of @samp{-N} (@pxref{HTTP Time-Stamping
Internals}).

@item -m
@itemx --mirror
Turn on options suitable for mirroring.  This option turns on recursion
and time-stamping, sets infinite recursion depth and keeps @sc{ftp}
directory listings.  It is currently equivalent to
@samp{-r -N -l inf --no-remove-listing}.
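
For instance, to maintain a local mirror of a site while keeping the
@file{.listing} files around for later inspection (the host is a
placeholder):

@example
wget -m http://example.com/
@end example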
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2000-08-30 07:26:21 -04:00
|
|
|
@cindex page requisites
|
|
|
|
@cindex required images, downloading
|
|
|
|
@item -p
|
|
|
|
@itemx --page-requisites
|
2000-11-16 07:35:27 -05:00
|
|
|
This option causes Wget to download all the files that are necessary to
|
2003-09-30 17:09:06 -04:00
|
|
|
properly display a given @sc{html} page. This includes such things as
|
2000-08-30 07:26:21 -04:00
|
|
|
inlined images, sounds, and referenced stylesheets.
|
|
|
|
|
2003-09-30 17:09:06 -04:00
|
|
|
Ordinarily, when downloading a single @sc{html} page, any requisite documents
|
2000-08-30 07:26:21 -04:00
|
|
|
that may be needed to display it properly are not downloaded. Using
|
2000-11-16 07:35:27 -05:00
|
|
|
@samp{-r} together with @samp{-l} can help, but since Wget does not
|
2000-08-30 07:26:21 -04:00
|
|
|
ordinarily distinguish between external and inlined documents, one is
|
2000-11-16 07:35:27 -05:00
|
|
|
generally left with ``leaf documents'' that are missing their
|
|
|
|
requisites.
|
2000-08-30 07:26:21 -04:00
|
|
|
|
|
|
|
For instance, say document @file{1.html} contains an @code{<IMG>} tag
|
|
|
|
referencing @file{1.gif} and an @code{<A>} tag pointing to external
|
2001-03-26 22:22:17 -05:00
|
|
|
document @file{2.html}. Say that @file{2.html} is similar but that its
|
2000-08-30 07:26:21 -04:00
|
|
|
image is @file{2.gif} and it links to @file{3.html}. Say this
|
|
|
|
continues up to some arbitrarily high number.
|
|
|
|
|
|
|
|
If one executes the command:
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -r -l 2 http://@var{site}/1.html
|
|
|
|
@end example
|
|
|
|
|
|
|
|
then @file{1.html}, @file{1.gif}, @file{2.html}, @file{2.gif}, and
|
|
|
|
@file{3.html} will be downloaded. As you can see, @file{3.html} is
|
2000-11-16 07:35:27 -05:00
|
|
|
without its requisite @file{3.gif} because Wget is simply counting the
|
2000-08-30 07:26:21 -04:00
|
|
|
number of hops (up to 2) away from @file{1.html} in order to determine
|
|
|
|
where to stop the recursion. However, with this command:
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -r -l 2 -p http://@var{site}/1.html
|
|
|
|
@end example
|
|
|
|
|
|
|
|
all the above files @emph{and} @file{3.html}'s requisite @file{3.gif}
|
|
|
|
will be downloaded. Similarly,
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -r -l 1 -p http://@var{site}/1.html
|
|
|
|
@end example
|
|
|
|
|
|
|
|
will cause @file{1.html}, @file{1.gif}, @file{2.html}, and @file{2.gif}
|
|
|
|
to be downloaded. One might think that:
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -r -l 0 -p http://@var{site}/1.html
|
|
|
|
@end example
|
|
|
|
|
|
|
|
would download just @file{1.html} and @file{1.gif}, but unfortunately
|
2000-11-16 07:35:27 -05:00
|
|
|
this is not the case, because @samp{-l 0} is equivalent to
|
2003-09-30 17:09:06 -04:00
|
|
|
@samp{-l inf}---that is, infinite recursion. To download a single @sc{html}
|
|
|
|
page (or a handful of them, all specified on the command-line or in a
|
2001-03-26 22:22:17 -05:00
|
|
|
@samp{-i} @sc{url} input file) and its (or their) requisites, simply leave off
|
|
|
|
@samp{-r} and @samp{-l}:
|
2000-08-30 07:26:21 -04:00
|
|
|
|
|
|
|
@example
|
|
|
|
wget -p http://@var{site}/1.html
|
|
|
|
@end example
|
|
|
|
|
2000-11-16 07:35:27 -05:00
|
|
|
Note that Wget will behave as if @samp{-r} had been specified, but only
|
2000-08-30 07:26:21 -04:00
|
|
|
that single page and its requisites will be downloaded. Links from that
|
|
|
|
page to external documents will not be followed. Actually, to download
|
|
|
|
a single page and all its requisites (even if they exist on separate
|
|
|
|
websites), and make sure the lot displays properly locally, this author
|
|
|
|
likes to use a few options in addition to @samp{-p}:
|
|
|
|
|
|
|
|
@example
|
2001-11-30 21:36:21 -05:00
|
|
|
wget -E -H -k -K -p http://@var{site}/@var{document}
|
2000-08-30 07:26:21 -04:00
|
|
|
@end example
|
|
|
|
|
2000-11-16 11:29:46 -05:00
|
|
|
To finish off this topic, it's worth knowing that Wget's idea of an
|
2000-08-30 07:26:21 -04:00
|
|
|
external document link is any URL specified in an @code{<A>} tag, an
|
|
|
|
@code{<AREA>} tag, or a @code{<LINK>} tag other than @code{<LINK
|
|
|
|
REL="stylesheet">}.
|
2003-09-18 20:33:22 -04:00
|
|
|
|
2003-09-30 17:09:06 -04:00
|
|
|
@cindex @sc{html} comments
|
|
|
|
@cindex comments, @sc{html}
|
2003-09-18 20:33:22 -04:00
|
|
|
@item --strict-comments
|
2003-09-30 17:09:06 -04:00
|
|
|
Turn on strict parsing of @sc{html} comments. The default is to terminate
|
2003-09-18 20:33:22 -04:00
|
|
|
comments at the first occurrence of @samp{-->}.
|
|
|
|
|
2003-09-30 17:09:06 -04:00
|
|
|
According to specifications, @sc{html} comments are expressed as @sc{sgml}
|
2003-09-18 20:33:22 -04:00
|
|
|
@dfn{declarations}. Declaration is special markup that begins with
|
|
|
|
@samp{<!} and ends with @samp{>}, such as @samp{<!DOCTYPE ...>}, that
|
2003-09-30 17:09:06 -04:00
|
|
|
may contain comments between a pair of @samp{--} delimiters. @sc{html}
|
|
|
|
comments are ``empty declarations'', @sc{sgml} declarations without any
|
2003-09-18 20:33:22 -04:00
|
|
|
non-comment text. Therefore, @samp{<!--foo-->} is a valid comment, and
|
|
|
|
so is @samp{<!--one-- --two-->}, but @samp{<!--1--2-->} is not.
|
|
|
|
|
2003-09-30 17:09:06 -04:00
|
|
|
On the other hand, most @sc{html} writers don't perceive comments as anything
|
2003-09-18 20:33:22 -04:00
|
|
|
other than text delimited with @samp{<!--} and @samp{-->}, which is not
|
|
|
|
quite the same. For example, something like @samp{<!------------>}
|
|
|
|
works as a valid comment as long as the number of dashes is a multiple
|
|
|
|
of four (!). If not, the comment technically lasts until the next
|
|
|
|
@samp{--}, which may be at the other end of the document. Because of
|
|
|
|
this, many popular browsers completely ignore the specification and
|
|
|
|
implement what users have come to expect: comments delimited with
|
|
|
|
@samp{<!--} and @samp{-->}.
|
|
|
|
|
|
|
|
Until version 1.9, Wget interpreted comments strictly, which resulted in
|
|
|
|
missing links in many web pages that displayed fine in browsers, but had
|
|
|
|
the misfortune of containing non-compliant comments. Beginning with
|
|
|
|
version 1.9, Wget has joined the ranks of clients that implements
|
|
|
|
``naive'' comments, terminating each comment at the first occurrence of
|
|
|
|
@samp{-->}.
|
|
|
|
|
|
|
|
If, for whatever reason, you want strict comment parsing, use this
|
|
|
|
option to turn it on.
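
For instance (with the site name a placeholder, as elsewhere in this
manual), a recursive run that parses comments strictly might look like:

@example
wget --strict-comments -r http://@var{site}/
@end example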
|
1999-12-02 02:42:23 -05:00
|
|
|
@end table
|
|
|
|
|
2009-08-28 02:57:09 -04:00
|
|
|
@node Recursive Accept/Reject Options, Exit Status, Recursive Retrieval Options, Invoking
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Recursive Accept/Reject Options
|
|
|
|
|
|
|
|
@table @samp
|
|
|
|
@item -A @var{acclist} --accept @var{acclist}
|
|
|
|
@itemx -R @var{rejlist} --reject @var{rejlist}
|
|
|
|
Specify comma-separated lists of file name suffixes or patterns to
|
2007-09-28 00:57:11 -04:00
|
|
|
accept or reject (@pxref{Types of Files}). Note that if
|
2007-09-12 23:21:30 -04:00
|
|
|
any of the wildcard characters, @samp{*}, @samp{?}, @samp{[} or
|
|
|
|
@samp{]}, appear in an element of @var{acclist} or @var{rejlist},
|
|
|
|
it will be treated as a pattern, rather than a suffix.
|
2013-10-06 15:45:31 -04:00
|
|
|
In this case, you have to enclose the pattern in quotes to prevent
your shell from expanding it, as in @samp{-A "*.mp3"} or @samp{-A '*.mp3'}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2013-04-28 16:41:24 -04:00
|
|
|
@item --accept-regex @var{urlregex}
|
|
|
|
@itemx --reject-regex @var{urlregex}
|
|
|
|
Specify a regular expression to accept or reject the complete URL.
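
For example, to skip every URL that carries a (purely illustrative)
@samp{sessionid=} query parameter during a recursive download, one
might use something like:

@example
wget -r --reject-regex "sessionid=" http://@var{site}/
@end example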
|
|
|
|
|
2013-07-11 11:52:28 -04:00
|
|
|
@item --regex-type @var{regextype}
|
|
|
|
Specify the regular expression type. Possible types are @samp{posix} or
|
|
|
|
@samp{pcre}. Note that to be able to use the @samp{pcre} type, Wget has to be
|
|
|
|
compiled with libpcre support.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item -D @var{domain-list}
|
|
|
|
@itemx --domains=@var{domain-list}
|
2001-11-30 21:36:21 -05:00
|
|
|
Set domains to be followed. @var{domain-list} is a comma-separated list
|
|
|
|
of domains. Note that it does @emph{not} turn on @samp{-H}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item --exclude-domains @var{domain-list}
|
2010-09-13 17:44:51 -04:00
|
|
|
Specify the domains that are @emph{not} to be followed
|
2001-11-30 21:36:21 -05:00
|
|
|
(@pxref{Spanning Hosts}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@cindex follow FTP links
|
|
|
|
@item --follow-ftp
|
|
|
|
Follow @sc{ftp} links from @sc{html} documents. Without this option,
|
|
|
|
Wget will ignore all the @sc{ftp} links.
|
|
|
|
|
2000-03-11 01:48:06 -05:00
|
|
|
@cindex tag-based recursive pruning
|
|
|
|
@item --follow-tags=@var{list}
|
2003-09-30 17:09:06 -04:00
|
|
|
Wget has an internal table of @sc{html} tag / attribute pairs that it
|
2000-03-11 01:48:06 -05:00
|
|
|
considers when looking for linked documents during a recursive
|
|
|
|
retrieval. If a user wants only a subset of those tags to be
|
|
|
|
considered, however, he or she should specify such tags in a
|
|
|
|
comma-separated @var{list} with this option.
|
|
|
|
|
2003-11-08 18:48:36 -05:00
|
|
|
@item --ignore-tags=@var{list}
|
2000-03-11 01:48:06 -05:00
|
|
|
This is the opposite of the @samp{--follow-tags} option. To skip
|
2003-09-30 17:09:06 -04:00
|
|
|
certain @sc{html} tags when recursively looking for documents to download,
|
2000-08-30 07:26:21 -04:00
|
|
|
specify them in a comma-separated @var{list}.
|
|
|
|
|
2003-11-08 18:48:36 -05:00
|
|
|
In the past, this option was the best bet for downloading a single page
|
|
|
|
and its requisites, using a command-line like:
|
2000-03-11 01:48:06 -05:00
|
|
|
|
|
|
|
@example
|
2003-11-08 18:48:36 -05:00
|
|
|
wget --ignore-tags=a,area -H -k -K -r http://@var{site}/@var{document}
|
2000-03-11 01:48:06 -05:00
|
|
|
@end example
|
|
|
|
|
2000-08-30 07:26:21 -04:00
|
|
|
However, the author of this option came across a page with tags like
|
|
|
|
@code{<LINK REL="home" HREF="/">} and came to the realization that
|
2003-11-08 18:48:36 -05:00
|
|
|
specifying tags to ignore was not enough. One can't just tell Wget to
|
|
|
|
ignore @code{<LINK>}, because then stylesheets will not be downloaded.
|
|
|
|
Now the best bet for downloading a single page and its requisites is the
|
2000-08-30 07:26:21 -04:00
|
|
|
dedicated @samp{--page-requisites} option.
|
|
|
|
|
2006-06-26 14:37:52 -04:00
|
|
|
@cindex case fold
|
|
|
|
@cindex ignore case
|
|
|
|
@item --ignore-case
|
|
|
|
Ignore case when matching files and directories. This influences the
|
|
|
|
behavior of the @samp{-R}, @samp{-A}, @samp{-I}, and @samp{-X} options, as well as globbing
|
|
|
|
implemented when downloading from FTP sites. For example, with this
|
2013-10-06 15:45:31 -04:00
|
|
|
option, @samp{-A "*.txt"} will match @samp{file1.txt}, but also
|
2006-06-26 14:37:52 -04:00
|
|
|
@samp{file2.TXT}, @samp{file3.TxT}, and so on.
|
2013-10-06 15:45:31 -04:00
|
|
|
The quotes in the example are to prevent the shell from expanding the
|
|
|
|
pattern.
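
As a complete command line (host and path are placeholders), a
case-insensitive accept rule might be used like this:

@example
wget -r --ignore-case -A "*.jpg" http://@var{site}/gallery/
@end example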
|
2006-06-26 14:37:52 -04:00
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item -H
|
|
|
|
@itemx --span-hosts
|
2001-11-30 21:36:21 -05:00
|
|
|
Enable spanning across hosts when doing recursive retrieving
|
|
|
|
(@pxref{Spanning Hosts}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2000-03-11 01:48:06 -05:00
|
|
|
@item -L
|
|
|
|
@itemx --relative
|
|
|
|
Follow relative links only. Useful for retrieving a specific home page
|
|
|
|
without any distractions, not even those from the same hosts
|
2000-11-14 17:49:07 -05:00
|
|
|
(@pxref{Relative Links}).
|
2000-03-11 01:48:06 -05:00
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item -I @var{list}
|
|
|
|
@itemx --include-directories=@var{list}
|
|
|
|
Specify a comma-separated list of directories you wish to follow when
|
2007-09-28 00:57:11 -04:00
|
|
|
downloading (@pxref{Directory-Based Limits}). Elements
|
1999-12-02 02:42:23 -05:00
|
|
|
of @var{list} may contain wildcards.
|
|
|
|
|
|
|
|
@item -X @var{list}
|
|
|
|
@itemx --exclude-directories=@var{list}
|
|
|
|
Specify a comma-separated list of directories you wish to exclude from
|
2007-09-28 00:57:11 -04:00
|
|
|
download (@pxref{Directory-Based Limits}). Elements of
|
1999-12-02 02:42:23 -05:00
|
|
|
@var{list} may contain wildcards.
|
|
|
|
|
|
|
|
@item -np
|
|
|
|
@itemx --no-parent
|
|
|
|
Do not ever ascend to the parent directory when retrieving recursively.
|
|
|
|
This is a useful option, since it guarantees that only the files
|
|
|
|
@emph{below} a certain hierarchy will be downloaded.
|
2000-11-14 17:49:07 -05:00
|
|
|
@xref{Directory-Based Limits}, for more details.
|
1999-12-02 02:42:23 -05:00
|
|
|
@end table
|
|
|
|
|
2001-02-10 19:22:42 -05:00
|
|
|
@c man end
|
|
|
|
|
2009-08-28 02:57:09 -04:00
|
|
|
@node Exit Status, , Recursive Accept/Reject Options, Invoking
|
|
|
|
@section Exit Status
|
|
|
|
|
|
|
|
@c man begin EXITSTATUS
|
|
|
|
|
|
|
|
Wget may return one of several error codes if it encounters problems.
|
|
|
|
|
|
|
|
|
|
|
|
@table @asis
|
|
|
|
@item 0
|
|
|
|
No problems occurred.
|
|
|
|
|
|
|
|
@item 1
|
|
|
|
Generic error code.
|
|
|
|
|
|
|
|
@item 2
|
|
|
|
Parse error---for instance, when parsing command-line options, the
|
|
|
|
@samp{.wgetrc} or @samp{.netrc}...
|
|
|
|
|
|
|
|
@item 3
|
|
|
|
File I/O error.
|
|
|
|
|
|
|
|
@item 4
|
|
|
|
Network failure.
|
|
|
|
|
|
|
|
@item 5
|
|
|
|
SSL verification failure.
|
|
|
|
|
|
|
|
@item 6
|
|
|
|
Username/password authentication failure.
|
|
|
|
|
|
|
|
@item 7
|
|
|
|
Protocol errors.
|
|
|
|
|
|
|
|
@item 8
|
|
|
|
Server issued an error response.
|
|
|
|
@end table
|
|
|
|
|
|
|
|
|
|
|
|
With the exceptions of 0 and 1, the lower-numbered exit codes take
|
|
|
|
precedence over higher-numbered ones, when multiple types of errors
|
|
|
|
are encountered.
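
As a sketch of how these codes might be used from a shell script (the
URL is a placeholder), a caller can branch on the exit status:

@example
wget -q http://@var{site}/file.tar.gz
status=$?
case $status in
  0) echo "download succeeded" ;;
  4) echo "network failure" ;;
  8) echo "server issued an error response" ;;
  *) echo "wget exited with status $status" ;;
esac
@end example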
|
|
|
|
|
|
|
|
In versions of Wget prior to 1.12, Wget's exit status tended to be
|
|
|
|
unhelpful and inconsistent. Recursive downloads would virtually always
|
|
|
|
return 0 (success), regardless of any issues encountered, and
|
|
|
|
non-recursive fetches only returned the status corresponding to the
|
|
|
|
most recently-attempted download.
|
|
|
|
|
|
|
|
@c man end
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Recursive Download, Following Links, Invoking, Top
|
2003-11-08 19:09:26 -05:00
|
|
|
@chapter Recursive Download
|
1999-12-02 02:42:23 -05:00
|
|
|
@cindex recursion
|
|
|
|
@cindex retrieving
|
2003-11-08 19:09:26 -05:00
|
|
|
@cindex recursive download
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
GNU Wget is capable of traversing parts of the Web (or a single
|
2001-11-30 21:36:21 -05:00
|
|
|
@sc{http} or @sc{ftp} server), following links and directory structure.
|
2003-09-30 17:09:06 -04:00
|
|
|
We refer to this as @dfn{recursive retrieval}, or @dfn{recursion}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2008-04-24 19:48:46 -04:00
|
|
|
With @sc{http} @sc{url}s, Wget retrieves and parses the @sc{html} or
|
|
|
|
@sc{css} from the given @sc{url}, retrieving the files the document
|
|
|
|
refers to, through markup like @code{href} or @code{src}, or @sc{css}
|
|
|
|
@sc{uri} values specified using the @samp{url()} functional notation.
|
|
|
|
If the freshly downloaded file is also of type @code{text/html},
|
|
|
|
@code{application/xhtml+xml}, or @code{text/css}, it will be parsed
|
|
|
|
and followed further.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2008-04-24 19:48:46 -04:00
|
|
|
Recursive retrieval of @sc{http} and @sc{html}/@sc{css} content is
|
2001-11-30 21:36:21 -05:00
|
|
|
@dfn{breadth-first}. This means that Wget first downloads the requested
|
2008-04-24 19:48:46 -04:00
|
|
|
document, then the documents linked from that document, then the
|
2001-11-30 21:36:21 -05:00
|
|
|
documents linked by them, and so on. In other words, Wget first
|
|
|
|
downloads the documents at depth 1, then those at depth 2, and so on
|
|
|
|
until the specified maximum depth.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
The maximum @dfn{depth} to which the retrieval may descend is specified
|
2001-11-30 21:36:21 -05:00
|
|
|
with the @samp{-l} option. The default maximum depth is five layers.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
When retrieving an @sc{ftp} @sc{url} recursively, Wget will retrieve all
|
|
|
|
the data from the given directory tree (including the subdirectories up
|
|
|
|
to the specified depth) on the remote server, creating its mirror image
|
|
|
|
locally. @sc{ftp} retrieval is also limited by the @code{depth}
|
2001-11-30 21:36:21 -05:00
|
|
|
parameter. Unlike @sc{http} recursion, @sc{ftp} recursion is performed
|
|
|
|
depth-first.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
By default, Wget will create a local directory tree, corresponding to
|
|
|
|
the one found on the remote server.
|
|
|
|
|
|
|
|
Recursive retrieval has a number of applications, the most
important of which is mirroring. It is also useful for @sc{www}
presentations, and for any other situation where a slow network
connection can be bypassed by storing the files locally.
|
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
You should be warned that recursive downloads can overload the remote
|
|
|
|
servers. Because of that, many administrators frown upon them and may
|
|
|
|
ban access from your site if they detect very fast downloads of big
|
|
|
|
amounts of content. When downloading from Internet servers, consider
|
|
|
|
using the @samp{-w} option to introduce a delay between accesses to the
|
|
|
|
server. The download will take a while longer, but the server
|
|
|
|
administrator will not be alarmed by your rudeness.
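
For example, a comparatively polite recursive download might combine a
delay between requests with a bandwidth cap (both values are arbitrary
illustrations):

@example
wget -r -w 2 --limit-rate=50k http://@var{site}/
@end example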
|
|
|
|
|
|
|
|
Of course, recursive download may cause problems on your machine. If
|
|
|
|
left to run unchecked, it can easily fill up the disk. If downloading
|
|
|
|
from the local network, it can also take up bandwidth, as well as
|
|
|
|
consume memory and CPU.
|
|
|
|
|
|
|
|
Try to specify the criteria that match the kind of download you are
|
|
|
|
trying to achieve. If you want to download only one page, use
|
|
|
|
@samp{--page-requisites} without any additional recursion. If you want
|
|
|
|
to download things under one directory, use @samp{-np} to avoid
|
|
|
|
downloading things from other directories. If you want to download all
|
|
|
|
the files from one directory, use @samp{-l 1} to make sure the recursion
|
|
|
|
depth never exceeds one. @xref{Following Links}, for more information
|
|
|
|
about this.
|
|
|
|
|
|
|
|
Recursive retrieval should be used with care. Don't say you were not
|
|
|
|
warned.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Following Links, Time-Stamping, Recursive Download, Top
|
1999-12-02 02:42:23 -05:00
|
|
|
@chapter Following Links
|
|
|
|
@cindex links
|
|
|
|
@cindex following links
|
|
|
|
|
2000-03-11 01:48:06 -05:00
|
|
|
When retrieving recursively, one does not wish to retrieve loads of
|
1999-12-02 02:42:23 -05:00
|
|
|
unnecessary data. Most of the time the users bear in mind exactly what
|
|
|
|
they want to download, and want Wget to follow only specific links.
|
|
|
|
|
|
|
|
For example, if you wish to download the music archive from
|
2000-11-10 09:47:30 -05:00
|
|
|
@samp{fly.srk.fer.hr}, you will not want to download all the home pages
|
1999-12-02 02:42:23 -05:00
|
|
|
that happen to be referenced by an obscure part of the archive.
|
|
|
|
|
|
|
|
Wget possesses several mechanisms that allow you to fine-tune which
|
|
|
|
links it will follow.
|
|
|
|
|
|
|
|
@menu
|
2009-08-28 02:57:09 -04:00
|
|
|
* Spanning Hosts:: (Un)limiting retrieval based on host name.
|
|
|
|
* Types of Files:: Getting only certain files.
|
|
|
|
* Directory-Based Limits:: Getting only certain directories.
|
|
|
|
* Relative Links:: Follow relative links only.
|
|
|
|
* FTP Links:: Following FTP links.
|
1999-12-02 02:42:23 -05:00
|
|
|
@end menu
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Spanning Hosts, Types of Files, Following Links, Following Links
|
2001-11-30 21:36:21 -05:00
|
|
|
@section Spanning Hosts
|
|
|
|
@cindex spanning hosts
|
|
|
|
@cindex hosts, spanning
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
Wget's recursive retrieval normally refuses to visit hosts different
|
|
|
|
than the one you specified on the command line. This is a reasonable
|
|
|
|
default; without it, every retrieval would have the potential to turn
|
|
|
|
your Wget into a small version of Google.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
However, visiting different hosts, or @dfn{host spanning}, is sometimes
|
|
|
|
a useful option. Maybe the images are served from a different server.
|
|
|
|
Maybe you're mirroring a site that consists of pages interlinked between
|
2003-09-30 17:09:06 -04:00
|
|
|
three servers. Maybe the server has two equivalent names, and the @sc{html}
|
2001-11-30 21:36:21 -05:00
|
|
|
pages refer to both interchangeably.
|
|
|
|
|
|
|
|
@table @asis
|
|
|
|
@item Span to any host---@samp{-H}
|
|
|
|
|
|
|
|
The @samp{-H} option turns on host spanning, thus allowing Wget's
|
|
|
|
recursive run to visit any host referenced by a link. Unless sufficient
|
|
|
|
recursion-limiting criteria are applied, these foreign hosts will
|
|
|
|
typically link to yet more hosts, and so on until Wget ends up sucking
|
|
|
|
up much more data than you have intended.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
@item Limit spanning to certain domains---@samp{-D}
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
The @samp{-D} option allows you to specify the domains that will be
|
|
|
|
followed, thus limiting the recursion only to the hosts that belong to
|
|
|
|
these domains. Obviously, this makes sense only in conjunction with
|
|
|
|
@samp{-H}. A typical example would be downloading the contents of
|
|
|
|
@samp{www.server.com}, but allowing downloads from
|
|
|
|
@samp{images.server.com}, etc.:
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@example
|
2001-11-30 21:36:21 -05:00
|
|
|
wget -rH -Dserver.com http://www.server.com/
|
1999-12-02 02:42:23 -05:00
|
|
|
@end example
|
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
You can specify more than one address by separating them with a comma,
|
|
|
|
e.g. @samp{-Ddomain1.com,domain2.com}.
|
|
|
|
|
|
|
|
@item Keep download off certain domains---@samp{--exclude-domains}
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
If there are domains you want to exclude specifically, you can do it
|
|
|
|
with @samp{--exclude-domains}, which accepts the same type of arguments
|
|
|
|
as @samp{-D}, but will @emph{exclude} all the listed domains. For
|
|
|
|
example, if you want to download all the hosts from @samp{foo.edu}
|
|
|
|
domain, with the exception of @samp{sunsite.foo.edu}, you can do it like
|
|
|
|
this:
|
|
|
|
|
|
|
|
@example
|
2001-11-30 21:36:21 -05:00
|
|
|
wget -rH -Dfoo.edu --exclude-domains sunsite.foo.edu \
|
|
|
|
http://www.foo.edu/
|
1999-12-02 02:42:23 -05:00
|
|
|
@end example
|
|
|
|
|
2001-11-30 21:36:21 -05:00
|
|
|
@end table
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Types of Files, Directory-Based Limits, Spanning Hosts, Following Links
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Types of Files
|
|
|
|
@cindex types of files
|
|
|
|
|
|
|
|
When downloading material from the web, you will often want to restrict
|
|
|
|
the retrieval to only certain file types. For example, if you are
|
2000-03-02 08:44:56 -05:00
|
|
|
interested in downloading @sc{gif}s, you will not be overjoyed to get
|
|
|
|
loads of PostScript documents, and vice versa.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
Wget offers two options to deal with this problem. Each option
|
|
|
|
description lists a short name, a long name, and the equivalent command
|
|
|
|
in @file{.wgetrc}.
|
|
|
|
|
|
|
|
@cindex accept wildcards
|
|
|
|
@cindex accept suffixes
|
|
|
|
@cindex wildcards, accept
|
|
|
|
@cindex suffixes, accept
|
|
|
|
@table @samp
|
|
|
|
@item -A @var{acclist}
|
|
|
|
@itemx --accept @var{acclist}
|
|
|
|
@itemx accept = @var{acclist}
|
2012-05-13 11:38:00 -04:00
|
|
|
@itemx --accept-regex @var{urlregex}
|
|
|
|
@itemx accept-regex = @var{urlregex}
|
1999-12-02 02:42:23 -05:00
|
|
|
The argument to the @samp{--accept} option is a list of file suffixes or
|
|
|
|
patterns that Wget will download during recursive retrieval. A suffix
|
|
|
|
is the ending part of a file, and consists of ``normal'' letters,
|
|
|
|
e.g. @samp{gif} or @samp{.jpg}. A matching pattern contains shell-like
|
|
|
|
wildcards, e.g. @samp{books*} or @samp{zelazny*196[0-9]*}.
|
|
|
|
|
|
|
|
So, specifying @samp{wget -A gif,jpg} will make Wget download only the
|
|
|
|
files ending with @samp{gif} or @samp{jpg}, i.e. @sc{gif}s and
|
|
|
|
@sc{jpeg}s. On the other hand, @samp{wget -A "zelazny*196[0-9]*"} will
|
|
|
|
download only files beginning with @samp{zelazny} and containing numbers
|
|
|
|
from 1960 to 1969 anywhere within. Look up the manual of your shell for
|
|
|
|
a description of how pattern matching works.
|
|
|
|
|
|
|
|
Of course, any number of suffixes and patterns can be combined into a
|
|
|
|
comma-separated list, and given as an argument to @samp{-A}.
|
|
|
|
|
2012-05-13 11:38:00 -04:00
|
|
|
The argument to the @samp{--accept-regex} option is a regular expression which
|
|
|
|
is matched against the complete URL.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@cindex reject wildcards
|
|
|
|
@cindex reject suffixes
|
|
|
|
@cindex wildcards, reject
|
|
|
|
@cindex suffixes, reject
|
|
|
|
@item -R @var{rejlist}
|
|
|
|
@itemx --reject @var{rejlist}
|
|
|
|
@itemx reject = @var{rejlist}
|
2012-05-13 11:38:00 -04:00
|
|
|
@itemx --reject-regex @var{urlregex}
|
|
|
|
@itemx reject-regex = @var{urlregex}
|
1999-12-02 02:42:23 -05:00
|
|
|
The @samp{--reject} option works the same way as @samp{--accept}, only
|
|
|
|
its logic is the reverse; Wget will download all files @emph{except} the
|
|
|
|
ones matching the suffixes (or patterns) in the list.
|
|
|
|
|
|
|
|
So, if you want to download a whole page except for the cumbersome
|
|
|
|
@sc{mpeg}s and @sc{.au} files, you can use @samp{wget -R mpg,mpeg,au}.
|
|
|
|
Analogously, to download all files except the ones beginning with
|
|
|
|
@samp{bjork}, use @samp{wget -R "bjork*"}. The quotes are to prevent
|
|
|
|
expansion by the shell.
|
|
|
|
@end table
|
|
|
|
|
2008-03-24 17:56:06 -04:00
|
|
|
@noindent
|
1999-12-02 02:42:23 -05:00
|
|
|
The @samp{-A} and @samp{-R} options may be combined to achieve even
|
|
|
|
better fine-tuning of which files to retrieve. E.g. @samp{wget -A
|
|
|
|
"*zelazny*" -R .ps} will download all the files having @samp{zelazny} as
|
2000-03-02 08:44:56 -05:00
|
|
|
a part of their name, but @emph{not} the PostScript files.
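
The same effect can be made permanent by using the equivalent
@file{.wgetrc} commands listed above, along these lines:

@example
accept = *zelazny*
reject = .ps
@end example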
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
Note that these two options do not affect the downloading of @sc{html}
|
2008-03-24 15:26:37 -04:00
|
|
|
files (as determined by a @samp{.htm} or @samp{.html} filename
|
|
|
|
suffix). This behavior may not be desirable for all users, and may be
|
|
|
|
changed for future versions of Wget.
|
|
|
|
|
|
|
|
Note, too, that query strings (strings at the end of a URL beginning
|
|
|
|
with a question mark, @samp{?}) are not included as part of the
|
|
|
|
filename for accept/reject rules, even though these will actually
|
|
|
|
contribute to the name chosen for the local file. It is expected that
|
|
|
|
a future version of Wget will provide an option to allow matching
|
|
|
|
against query strings.
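
In the meantime, since @samp{--accept-regex} and @samp{--reject-regex}
are matched against the complete URL, they can serve as a stopgap; for
example, to skip URLs carrying a (purely illustrative)
@samp{action=edit} query:

@example
wget -r --reject-regex "action=edit" http://@var{site}/
@end example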
|
|
|
|
|
|
|
|
Finally, it's worth noting that the accept/reject lists are matched
|
|
|
|
@emph{twice} against downloaded files: once against the URL's filename
|
|
|
|
portion, to determine if the file should be downloaded in the first
|
|
|
|
place; then, after it has been accepted and successfully downloaded,
|
|
|
|
the local file's name is also checked against the accept/reject lists
|
|
|
|
to see if it should be removed. The rationale was that, since
|
|
|
|
@samp{.htm} and @samp{.html} files are always downloaded regardless of
|
|
|
|
accept/reject rules, they should be removed @emph{after} being
|
|
|
|
downloaded and scanned for links, if they did match the accept/reject
|
|
|
|
lists. However, this can lead to unexpected results, since the local
|
|
|
|
filenames can differ from the original URL filenames in the following
|
|
|
|
ways, all of which can change whether an accept/reject rule matches:
|
|
|
|
|
|
|
|
@itemize @bullet
|
|
|
|
@item
|
|
|
|
If the local file already exists and @samp{--no-directories} was
|
|
|
|
specified, a numeric suffix will be appended to the original name.
|
|
|
|
@item
|
2009-07-28 20:37:58 -04:00
|
|
|
If @samp{--adjust-extension} was specified, the local filename might have
|
2008-03-24 15:26:37 -04:00
|
|
|
@samp{.html} appended to it. If Wget is invoked with @samp{-E -A.php},
|
|
|
|
a filename such as @samp{index.php} will be accepted, but upon
|
|
|
|
download will be named @samp{index.php.html}, which no longer matches,
|
|
|
|
and so the file will be deleted.
|
|
|
|
@item
|
|
|
|
Query strings do not contribute to URL matching, but are included in
|
|
|
|
local filenames, and so @emph{do} contribute to filename matching.
|
|
|
|
@end itemize
|
|
|
|
|
2008-03-24 17:56:06 -04:00
|
|
|
@noindent
|
2008-03-24 15:26:37 -04:00
|
|
|
This behavior, too, is considered less-than-desirable, and may change
|
|
|
|
in a future version of Wget.
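
Until then, one possible workaround (a sketch, not an official
recommendation) is to also accept the @samp{.html} suffix, so that
files renamed by @samp{-E} still match the accept list:

@example
wget -r -E -A "*.php,*.html" http://@var{site}/
@end example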
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Directory-Based Limits, Relative Links, Types of Files, Following Links
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Directory-Based Limits
|
|
|
|
@cindex directories
|
|
|
|
@cindex directory limits
|
|
|
|
|
|
|
|
Regardless of other link-following facilities, it is often useful to
|
|
|
|
restrict which files are retrieved based on the directories
|
|
|
|
those files are placed in. There can be many reasons for this---the
|
|
|
|
home pages may be organized in a reasonable directory structure; or some
|
|
|
|
directories may contain useless information, e.g. @file{/cgi-bin} or
|
|
|
|
@file{/dev} directories.
|
|
|
|
|
|
|
|
Wget offers three different options to deal with this requirement. Each
|
|
|
|
option description lists a short name, a long name, and the equivalent
|
|
|
|
command in @file{.wgetrc}.
|
|
|
|
|
|
|
|
@cindex directories, include
|
|
|
|
@cindex include directories
|
|
|
|
@cindex accept directories
|
|
|
|
@table @samp
|
|
|
|
@item -I @var{list}
|
|
|
|
@itemx --include @var{list}
|
|
|
|
@itemx include_directories = @var{list}
|
|
|
|
The @samp{-I} option accepts a comma-separated list of directories included
|
|
|
|
in the retrieval. Any other directories will simply be ignored. The
|
|
|
|
directories are absolute paths.
|
|
|
|
|
|
|
|
So, if you wish to download from @samp{http://host/people/bozo/}
|
|
|
|
following only links to bozo's colleagues in the @file{/people}
|
|
|
|
directory and the bogus scripts in @file{/cgi-bin}, you can specify:
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -I /people,/cgi-bin http://host/people/bozo/
|
|
|
|
@end example
|
|
|
|
|
|
|
|
@cindex directories, exclude
|
|
|
|
@cindex exclude directories
|
|
|
|
@cindex reject directories
|
|
|
|
@item -X @var{list}
|
|
|
|
@itemx --exclude @var{list}
|
|
|
|
@itemx exclude_directories = @var{list}
|
|
|
|
The @samp{-X} option is exactly the reverse of @samp{-I}---this is a list of
|
|
|
|
directories @emph{excluded} from the download. E.g. if you do not want
|
|
|
|
Wget to download things from @file{/cgi-bin} directory, specify @samp{-X
|
|
|
|
/cgi-bin} on the command line.
|
|
|
|
|
|
|
|
The same as with @samp{-A}/@samp{-R}, these two options can be combined
|
|
|
|
to get a better fine-tuning of downloading subdirectories. E.g. if you
|
|
|
|
want to load all the files from @file{/pub} hierarchy except for
|
|
|
|
@file{/pub/worthless}, specify @samp{-I/pub -X/pub/worthless}.
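
Spelled out as a complete command (host and paths are placeholders),
that combination might look like:

@example
wget -r -I/pub -X/pub/worthless ftp://@var{site}/pub/
@end example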
|
|
|
|
|
|
|
|
@cindex no parent
|
|
|
|
@item -np
|
|
|
|
@itemx --no-parent
|
|
|
|
@itemx no_parent = on
|
|
|
|
The simplest, and often very useful way of limiting directories is
|
|
|
|
disallowing retrieval of the links that refer to the hierarchy
|
2000-03-02 08:44:56 -05:00
|
|
|
@dfn{above} the beginning directory, i.e. disallowing ascent to the
|
1999-12-02 02:42:23 -05:00
|
|
|
parent directory/directories.
|
|
|
|
|
|
|
|
The @samp{--no-parent} option (short @samp{-np}) is useful in this case.
|
|
|
|
Using it guarantees that you will never leave the existing hierarchy.
|
|
|
|
Supposing you issue Wget with:
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -r --no-parent http://somehost/~luzer/my-archive/
|
|
|
|
@end example
|
|
|
|
|
|
|
|
You may rest assured that none of the references to
|
|
|
|
@file{/~his-girls-homepage/} or @file{/~luzer/all-my-mpegs/} will be
|
|
|
|
followed. Only the archive you are interested in will be downloaded.
|
|
|
|
Essentially, @samp{--no-parent} is similar to
|
|
|
|
@samp{-I/~luzer/my-archive}, only it handles redirections in a more
|
|
|
|
intelligent fashion.
|
2008-03-17 04:53:49 -04:00
|
|
|
|
|
|
|
@strong{Note} that, for HTTP (and HTTPS), the trailing slash is very
|
|
|
|
important to @samp{--no-parent}. HTTP has no concept of a ``directory''---Wget
|
|
|
|
relies on you to indicate what's a directory and what isn't. In
|
|
|
|
@samp{http://foo/bar/}, Wget will consider @samp{bar} to be a
|
|
|
|
directory, while in @samp{http://foo/bar} (no trailing slash),
|
|
|
|
@samp{bar} will be considered a filename (so @samp{--no-parent} would be
|
|
|
|
meaningless, as its parent is @samp{/}).
|
1999-12-02 02:42:23 -05:00
|
|
|
@end table
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Relative Links, FTP Links, Directory-Based Limits, Following Links
|
2001-11-30 21:36:21 -05:00
|
|
|
@section Relative Links
|
|
|
|
@cindex relative links
|
|
|
|
|
|
|
|
When @samp{-L} is turned on, only the relative links are ever followed.
|
|
|
|
Relative links are here defined as those that do not refer to the web
|
|
|
|
server root. For example, these links are relative:
|
|
|
|
|
|
|
|
@example
|
|
|
|
<a href="foo.gif">
|
|
|
|
<a href="foo/bar.gif">
|
|
|
|
<a href="../foo/bar.gif">
|
|
|
|
@end example
|
|
|
|
|
|
|
|
These links are not relative:
|
|
|
|
|
|
|
|
@example
|
|
|
|
<a href="/foo.gif">
|
|
|
|
<a href="/foo/bar.gif">
|
|
|
|
<a href="http://www.server.com/foo/bar.gif">
|
|
|
|
@end example
|
|
|
|
|
|
|
|
Using this option guarantees that recursive retrieval will not span
|
|
|
|
hosts, even without @samp{-H}. In simple cases it also allows downloads
|
|
|
|
to ``just work'' without having to convert links.
|
|
|
|
|
|
|
|
This option is probably not very useful and might be removed in a future
|
|
|
|
release.
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node FTP Links, , Relative Links, Following Links
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Following FTP Links
|
|
|
|
@cindex following ftp links
|
|
|
|
|
|
|
|
The rules for @sc{ftp} are somewhat specific, as it is necessary for
|
|
|
|
them to be. @sc{ftp} links in @sc{html} documents are often included
|
|
|
|
for purposes of reference, and it is often inconvenient to download them
|
|
|
|
by default.
|
|
|
|
|
|
|
|
To have @sc{ftp} links followed from @sc{html} documents, you need to
|
|
|
|
specify the @samp{--follow-ftp} option. Having done that, @sc{ftp}
|
|
|
|
links will span hosts regardless of @samp{-H} setting. This is logical,
|
|
|
|
as @sc{ftp} links rarely point to the same host where the @sc{http}
|
|
|
|
server resides. For similar reasons, the @samp{-L} option has no
|
|
|
|
effect on such downloads. On the other hand, domain acceptance
|
|
|
|
(@samp{-D}) and suffix rules (@samp{-A} and @samp{-R}) apply normally.
|
|
|
|
|
|
|
|
Also note that followed links to @sc{ftp} directories will not be
|
|
|
|
retrieved recursively further.
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Time-Stamping, Startup File, Following Links, Top
|
1999-12-02 02:42:23 -05:00
|
|
|
@chapter Time-Stamping
|
|
|
|
@cindex time-stamping
|
|
|
|
@cindex timestamping
|
|
|
|
@cindex updating the archives
|
|
|
|
@cindex incremental updating
|
|
|
|
|
|
|
|
One of the most important aspects of mirroring information from the
|
|
|
|
Internet is updating your archives.
|
|
|
|
|
|
|
|
Downloading the whole archive again and again, just to replace a few
|
|
|
|
changed files is expensive, both in terms of wasted bandwidth and money,
|
|
|
|
and the time to do the update. This is why all the mirroring tools
|
|
|
|
offer the option of incremental updating.
|
|
|
|
|
|
|
|
Such an updating mechanism means that the remote server is scanned in
|
|
|
|
search of @dfn{new} files. Only those new files will be downloaded in
|
|
|
|
the place of the old ones.
|
|
|
|
|
|
|
|
A file is considered new if one of these two conditions is met:
|
|
|
|
|
|
|
|
@enumerate
|
|
|
|
@item
|
|
|
|
A file of that name does not already exist locally.
|
|
|
|
|
|
|
|
@item
|
|
|
|
A file of that name does exist, but the remote file was modified more
|
|
|
|
recently than the local file.
|
|
|
|
@end enumerate
|
|
|
|
|
|
|
|
To implement this, the program needs to be aware of the time of last
|
2001-02-23 15:16:07 -05:00
|
|
|
modification of both local and remote files. We call this information the
|
|
|
|
@dfn{time-stamp} of a file.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
The time-stamping in GNU Wget is turned on using the @samp{--timestamping}
(@samp{-N}) option, or through the @code{timestamping = on} directive in
|
|
|
|
@file{.wgetrc}. With this option, for each file it intends to download,
|
|
|
|
Wget will check whether a local file of the same name exists. If it
|
2009-09-04 17:29:03 -04:00
|
|
|
does, and the remote file is not newer, Wget will not download it.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
If the local file does not exist, or the sizes of the files do not
|
|
|
|
match, Wget will download the remote file no matter what the time-stamps
|
|
|
|
say.
|
|
|
|
|
|
|
|
@menu
|
2009-08-28 02:57:09 -04:00
|
|
|
* Time-Stamping Usage::
|
|
|
|
* HTTP Time-Stamping Internals::
|
|
|
|
* FTP Time-Stamping Internals::
|
1999-12-02 02:42:23 -05:00
|
|
|
@end menu
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Time-Stamping Usage, HTTP Time-Stamping Internals, Time-Stamping, Time-Stamping
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Time-Stamping Usage
|
|
|
|
@cindex time-stamping usage
|
|
|
|
@cindex usage, time-stamping
|
|
|
|
|
|
|
|
The usage of time-stamping is simple. Say you would like to download a
|
|
|
|
file so that it keeps its date of modification.
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -S http://www.gnu.ai.mit.edu/
|
|
|
|
@end example
|
|
|
|
|
|
|
|
A simple @code{ls -l} shows that the time stamp on the local file matches
the @code{Last-Modified} header returned by the server.
|
|
|
|
As you can see, the time-stamping info is preserved locally, even
|
2001-02-23 15:16:07 -05:00
|
|
|
without @samp{-N} (at least for @sc{http}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
Several days later, you would like Wget to check if the remote file has
|
|
|
|
changed, and download it if it has.
|
|
|
|
|
|
|
|
@example
|
|
|
|
wget -N http://www.gnu.ai.mit.edu/
|
|
|
|
@end example
|
|
|
|
|
|
|
|
Wget will ask the server for the last-modified date. If the local file
|
2001-02-23 15:16:07 -05:00
|
|
|
has the same timestamp as the server, or a newer one, the remote file
|
|
|
|
will not be re-fetched. However, if the remote file is more recent,
|
|
|
|
Wget will proceed to fetch it.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
The same goes for @sc{ftp}. For example:
|
|
|
|
|
|
|
|
@example
|
2001-02-23 15:16:07 -05:00
|
|
|
wget "ftp://ftp.ifi.uio.no/pub/emacs/gnus/*"
|
1999-12-02 02:42:23 -05:00
|
|
|
@end example
|
|
|
|
|
2001-02-23 15:16:07 -05:00
|
|
|
(The quotes around that URL are to prevent the shell from trying to
|
|
|
|
interpret the @samp{*}.)
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-02-23 15:16:07 -05:00
|
|
|
After download, a local directory listing will show that the timestamps
|
|
|
|
match those on the remote server. Reissuing the command with @samp{-N}
|
|
|
|
will make Wget re-fetch @emph{only} the files that have been modified
|
|
|
|
since the last download.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-02-23 15:16:07 -05:00
|
|
|
If you wished to mirror the GNU archive every week, you would use a
|
|
|
|
command like the following, weekly:
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@example
|
2001-02-23 15:16:07 -05:00
|
|
|
wget --timestamping -r ftp://ftp.gnu.org/pub/gnu/
|
1999-12-02 02:42:23 -05:00
|
|
|
@end example
|
|
|
|
|
2001-02-23 15:16:07 -05:00
|
|
|
Note that time-stamping will only work for files for which the server
|
|
|
|
gives a timestamp. For @sc{http}, this depends on getting a
|
|
|
|
@code{Last-Modified} header. For @sc{ftp}, this depends on getting a
|
|
|
|
directory listing with dates in a format that Wget can parse
|
|
|
|
(@pxref{FTP Time-Stamping Internals}).
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node HTTP Time-Stamping Internals, FTP Time-Stamping Internals, Time-Stamping Usage, Time-Stamping
|
1999-12-02 02:42:23 -05:00
|
|
|
@section HTTP Time-Stamping Internals
|
|
|
|
@cindex http time-stamping
|
|
|
|
|
|
|
|
Time-stamping in @sc{http} is implemented by checking the
|
|
|
|
@code{Last-Modified} header. If you wish to retrieve the file
|
|
|
|
@file{foo.html} through @sc{http}, Wget will check whether
|
|
|
|
@file{foo.html} exists locally. If it doesn't, @file{foo.html} will be
|
|
|
|
retrieved unconditionally.
|
|
|
|
|
|
|
|
If the file does exist locally, Wget will first check its local
|
|
|
|
time-stamp (similar to the way @code{ls -l} checks it), and then send a
|
|
|
|
@code{HEAD} request to the remote server, demanding the information on
|
|
|
|
the remote file.
|
|
|
|
|
|
|
|
The @code{Last-Modified} header is examined to find which file was
|
|
|
|
modified more recently (which makes it ``newer''). If the remote file
|
|
|
|
is newer, it will be downloaded; if it is older, Wget will give
|
|
|
|
up.@footnote{As an additional check, Wget will look at the
|
|
|
|
@code{Content-Length} header, and compare the sizes; if they are not the
|
|
|
|
same, the remote file will be downloaded no matter what the time-stamp
|
|
|
|
says.}
|
|
|
|
|
2000-03-02 02:06:10 -05:00
|
|
|
When @samp{--backup-converted} (@samp{-K}) is specified in conjunction
|
|
|
|
with @samp{-N}, server file @samp{@var{X}} is compared to local file
|
|
|
|
@samp{@var{X}.orig}, if extant, rather than being compared to local file
|
|
|
|
@samp{@var{X}}, which will always differ if it's been converted by
|
|
|
|
@samp{--convert-links} (@samp{-k}).
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
Arguably, @sc{http} time-stamping should be implemented using the
|
|
|
|
@code{If-Modified-Since} request.
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node FTP Time-Stamping Internals, , HTTP Time-Stamping Internals, Time-Stamping
|
1999-12-02 02:42:23 -05:00
|
|
|
@section FTP Time-Stamping Internals
|
|
|
|
@cindex ftp time-stamping
|
|
|
|
|
|
|
|
In theory, @sc{ftp} time-stamping works much the same as @sc{http}, only
|
2001-02-23 16:31:54 -05:00
|
|
|
@sc{ftp} has no headers---time-stamps must be ferreted out of directory
|
|
|
|
listings.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2001-02-23 15:16:07 -05:00
|
|
|
If an @sc{ftp} download is recursive or uses globbing, Wget will use the
|
|
|
|
@sc{ftp} @code{LIST} command to get a file listing for the directory
|
|
|
|
containing the desired file(s). It will try to analyze the listing,
|
|
|
|
treating it like Unix @code{ls -l} output, extracting the time-stamps.
|
|
|
|
The rest is exactly the same as for @sc{http}. Note that when
|
|
|
|
retrieving individual files from an @sc{ftp} server without using
|
|
|
|
globbing or recursion, listing files will not be downloaded (and thus
|
|
|
|
files will not be time-stamped) unless @samp{-N} is specified.
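
For instance, to time-stamp a single file retrieved over @sc{ftp}
without globbing or recursion (placeholders as usual):

@example
wget -N ftp://@var{site}/pub/@var{file}
@end example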
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
The assumption that every directory listing is a Unix-style listing may
|
|
|
|
sound extremely constraining, but in practice it is not, as many
|
|
|
|
non-Unix @sc{ftp} servers use the Unixoid listing format because most
|
|
|
|
(all?) of the clients understand it. Bear in mind that @sc{rfc959}
|
|
|
|
defines no standard way to get a file list, let alone the time-stamps.
|
|
|
|
We can only hope that a future standard will define this.
|
|
|
|
|
|
|
|
Another non-standard solution includes the use of the @code{MDTM} command
|
|
|
|
that is supported by some @sc{ftp} servers (including the popular
|
|
|
|
@code{wu-ftpd}), which returns the exact time of the specified file.
|
|
|
|
Wget may support this command in the future.
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Startup File, Examples, Time-Stamping, Top
|
1999-12-02 02:42:23 -05:00
|
|
|
@chapter Startup File
|
|
|
|
@cindex startup file
|
|
|
|
@cindex wgetrc
|
|
|
|
@cindex .wgetrc
|
|
|
|
@cindex startup
|
|
|
|
@cindex .netrc
|
|
|
|
|
|
|
|
Once you know how to change default settings of Wget through command
|
|
|
|
line arguments, you may wish to make some of those settings permanent.
|
|
|
|
You can do that in a convenient way by creating the Wget startup
|
|
|
|
file---@file{.wgetrc}.
|
|
|
|
|
|
|
|
Besides @file{.wgetrc} being the ``main'' initialization file, it is
|
|
|
|
convenient to have a special facility for storing passwords. Thus Wget
|
|
|
|
reads and interprets the contents of @file{$HOME/.netrc}, if it finds
|
|
|
|
it. You can find the @file{.netrc} format described in your system manuals.
|
|
|
|
|
|
|
|
Wget reads @file{.wgetrc} upon startup, recognizing a limited set of
|
|
|
|
commands.
|
|
|
|
|
|
|
|
@menu
|
2009-08-28 02:57:09 -04:00
|
|
|
* Wgetrc Location:: Location of various wgetrc files.
|
|
|
|
* Wgetrc Syntax:: Syntax of wgetrc.
|
|
|
|
* Wgetrc Commands:: List of available commands.
|
|
|
|
* Sample Wgetrc:: A wgetrc example.
|
1999-12-02 02:42:23 -05:00
|
|
|
@end menu
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Wgetrc Location, Wgetrc Syntax, Startup File, Startup File
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Wgetrc Location
|
|
|
|
@cindex wgetrc location
|
|
|
|
@cindex location of wgetrc
|
|
|
|
|
|
|
|
When initializing, Wget will look for a @dfn{global} startup file,
|
|
|
|
@file{/usr/local/etc/wgetrc} by default (or some prefix other than
|
|
|
|
@file{/usr/local}, if Wget was not installed there) and read commands
|
|
|
|
from there, if it exists.
|
|
|
|
|
|
|
|
Then it will look for the user's file. If the environment variable
|
|
|
|
@code{WGETRC} is set, Wget will try to load that file. Failing that, no
|
|
|
|
further attempts will be made.
|
|
|
|
|
|
|
|
If @code{WGETRC} is not set, Wget will try to load @file{$HOME/.wgetrc}.
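
For example (the file name is purely illustrative), an alternate
configuration can be selected for a single invocation by setting the
variable in the environment:

@example
WGETRC=$HOME/.wgetrc-mirror wget -m http://@var{site}/
@end example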
|
|
|
|
|
|
|
|
The fact that the user's settings are loaded after the system-wide ones
means that, in case of collision, the user's wgetrc @emph{overrides} the
|
|
|
|
system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default).
|
|
|
|
Fascist admins, away!
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Wgetrc Syntax
|
|
|
|
@cindex wgetrc syntax
|
|
|
|
@cindex syntax of wgetrc
|
|
|
|
|
|
|
|
The syntax of a wgetrc command is simple:
|
|
|
|
|
|
|
|
@example
|
|
|
|
variable = value
|
|
|
|
@end example
|
|
|
|
|
|
|
|
The @dfn{variable} will also be called @dfn{command}. Valid
|
|
|
|
@dfn{values} are different for different commands.
|
|
|
|
|
|
|
|
The commands are case-insensitive and underscore-insensitive. Thus
|
|
|
|
@samp{DIr__PrefiX} is the same as @samp{dirprefix}. Empty lines, lines
|
|
|
|
beginning with @samp{#} and lines containing white-space only are
|
|
|
|
discarded.
|
|
|
|
|
|
|
|
Commands that expect a comma-separated list will clear the list on an
|
|
|
|
empty command. So, if you wish to reset the rejection list specified in
|
|
|
|
global @file{wgetrc}, you can do it with:
|
|
|
|
|
|
|
|
@example
|
|
|
|
reject =
|
|
|
|
@end example
|
|
|
|
|
2008-11-10 14:24:04 -05:00
|
|
|
@node Wgetrc Commands, Sample Wgetrc, Wgetrc Syntax, Startup File
|
1999-12-02 02:42:23 -05:00
|
|
|
@section Wgetrc Commands
|
|
|
|
@cindex wgetrc commands
|
|
|
|
|
2000-10-20 02:59:30 -04:00
|
|
|
The complete set of commands is listed below. Legal values are listed
|
|
|
|
after the @samp{=}. Simple Boolean values can be set or unset using
|
2005-06-21 21:56:02 -04:00
|
|
|
@samp{on} and @samp{off} or @samp{1} and @samp{0}.
|
2000-10-20 02:59:30 -04:00
|
|
|
|
2000-10-24 02:19:17 -04:00
|
|
|
Some commands take pseudo-arbitrary values. @var{address} values can be
|
|
|
|
hostnames or dotted-quad IP addresses. @var{n} can be any positive
|
|
|
|
integer, or @samp{inf} for infinity, where appropriate. @var{string}
|
|
|
|
values can be any non-empty string.
|
2000-10-20 02:59:30 -04:00
|
|
|
|
2004-02-12 17:16:48 -05:00
|
|
|
Most of these commands have direct command-line equivalents. Also, any
|
|
|
|
wgetrc command can be specified on the command line using the
|
|
|
|
@samp{--execute} switch (@pxref{Basic Startup Options}.)
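
For instance, the @code{timestamping} command could be switched on for
a single run from the command line:

@example
wget -e "timestamping = on" -r ftp://ftp.gnu.org/pub/gnu/
@end example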
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@table @asis
|
|
|
|
@item accept/reject = @var{string}
|
2000-11-14 17:49:07 -05:00
|
|
|
Same as @samp{-A}/@samp{-R} (@pxref{Types of Files}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item add_hostdir = on/off
|
|
|
|
Enable/disable host-prefixed file names. @samp{-nH} disables it.
|
|
|
|
|
2009-07-27 01:14:07 -04:00
|
|
|
@item ask_password = on/off
|
|
|
|
Prompt for a password for each connection established. Cannot be specified
|
|
|
|
when @samp{--password} is being used, because they are mutually
|
|
|
|
exclusive. Equivalent to @samp{--ask-password}.
|
|
|
|
|
|
|
|
@item auth_no_challenge = on/off
|
|
|
|
If this option is given, Wget will send Basic HTTP authentication
|
|
|
|
information (plaintext username and password) for all requests. See
|
|
|
|
@samp{--auth-no-challenge}.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item background = on/off
|
2000-11-16 07:35:27 -05:00
|
|
|
Enable/disable going to background---the same as @samp{-b} (which
|
|
|
|
enables it).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2000-02-29 19:17:23 -05:00
|
|
|
@item backup_converted = on/off
|
2000-11-16 07:35:27 -05:00
|
|
|
Enable/disable saving pre-converted files with the suffix
|
|
|
|
@samp{.orig}---the same as @samp{-K} (which enables it).
|
2000-02-29 19:17:23 -05:00
|
|
|
|
2013-07-08 18:50:30 -04:00
|
|
|
@item backups = @var{number}
|
|
|
|
Use up to @var{number} backups for a file. Backups are rotated by
|
|
|
|
adding an incremental counter that starts at @samp{1}. The default is
|
|
|
|
@samp{0}.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item base = @var{string}
|
2009-07-07 01:53:01 -04:00
|
|
|
Consider relative @sc{url}s in input files (specified via the
|
|
|
|
@samp{input} command or the @samp{--input-file}/@samp{-i} option,
|
|
|
|
together with @samp{force_html} or @samp{--force-html})
|
|
|
|
as being relative to @var{string}---the same as @samp{--base=@var{string}}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2000-10-24 02:19:17 -04:00
|
|
|
@item bind_address = @var{address}
|
2005-04-27 14:23:41 -04:00
|
|
|
Bind to @var{address}, like the @samp{--bind-address=@var{address}} option.
|
2000-10-24 02:19:17 -04:00
|
|
|
|
2005-04-27 14:23:41 -04:00
|
|
|
@item ca_certificate = @var{file}
|
|
|
|
Set the certificate authority bundle file to @var{file}. The same
|
|
|
|
as @samp{--ca-certificate=@var{file}}.
|
2005-04-27 13:44:02 -04:00
|
|
|
|
2005-04-27 14:23:41 -04:00
|
|
|
@item ca_directory = @var{directory}
|
2005-04-27 13:44:02 -04:00
|
|
|
Set the directory used for certificate authorities. The same as
|
2005-04-27 14:23:41 -04:00
|
|
|
@samp{--ca-directory=@var{directory}}.
|
2005-04-27 13:44:02 -04:00
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item cache = on/off
|
2003-11-08 18:48:36 -05:00
|
|
|
When set to off, disallow server-caching. See the @samp{--no-cache}
|
|
|
|
option.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2005-04-27 14:23:41 -04:00
|
|
|
@item certificate = @var{file}
|
|
|
|
Set the client certificate file name to @var{file}. The same as
|
|
|
|
@samp{--certificate=@var{file}}.
|
2005-04-27 13:44:02 -04:00
|
|
|
|
|
|
|
@item certificate_type = @var{string}
|
|
|
|
Specify the type of the client certificate, legal values being
|
|
|
|
@samp{PEM} (the default) and @samp{DER} (aka ASN1). The same as
|
2005-04-27 14:23:41 -04:00
|
|
|
@samp{--certificate-type=@var{string}}.
|
2005-04-27 13:44:02 -04:00
|
|
|
|
|
|
|
@item check_certificate = on/off
|
|
|
|
If this is set to off, the server certificate is not checked against
|
|
|
|
the specified client authorities. The default is ``on''. The same as
|
|
|
|
@samp{--check-certificate}.
|
|
|
|
|
2007-10-03 14:09:51 -04:00
|
|
|
@item connect_timeout = @var{n}
|
|
|
|
Set the connect timeout---the same as @samp{--connect-timeout}.
|
|
|
|
|
2007-10-03 14:18:21 -04:00
|
|
|
@item content_disposition = on/off
|
2008-01-23 23:19:56 -05:00
|
|
|
Turn on recognition of the (non-standard) @samp{Content-Disposition}
|
|
|
|
HTTP header---if set to @samp{on}, the same as @samp{--content-disposition}.
|
2007-10-03 14:18:21 -04:00
|
|
|
|
2010-07-28 15:22:22 -04:00
|
|
|
@item trust_server_names = on/off
|
|
|
|
If set to on, use the last component of a redirection URL for the local
|
|
|
|
file name.
|
|
|
|
|
2007-10-03 14:09:51 -04:00
|
|
|
@item continue = on/off
|
|
|
|
If set to on, force continuation of preexistent partially retrieved
|
|
|
|
files. See @samp{-c} before setting it.
|
|
|
|
|
2003-10-26 10:23:30 -05:00
|
|
|
@item convert_links = on/off
|
1999-12-02 02:42:23 -05:00
|
|
|
Convert non-relative links locally. The same as @samp{-k}.
|
|
|
|
|
2001-04-27 02:08:23 -04:00
|
|
|
@item cookies = on/off
|
|
|
|
When set to off, disallow cookies. See the @samp{--cookies} option.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item cut_dirs = @var{n}
|
2005-04-27 14:23:41 -04:00
|
|
|
Ignore @var{n} remote directory components. Equivalent to
|
|
|
|
@samp{--cut-dirs=@var{n}}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item debug = on/off
|
|
|
|
Debug mode, same as @samp{-d}.
|
|
|
|
|
2008-10-26 17:32:11 -04:00
|
|
|
@item default_page = @var{string}
|
|
|
|
Default page name---the same as @samp{--default-page=@var{string}}.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item delete_after = on/off
|
2000-11-16 07:35:27 -05:00
|
|
|
Delete after download---the same as @samp{--delete-after}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item dir_prefix = @var{string}
|
2005-04-27 14:23:41 -04:00
|
|
|
Top of directory tree---the same as @samp{-P @var{string}}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item dirstruct = on/off
|
2000-11-16 07:35:27 -05:00
|
|
|
Turning dirstruct on or off---the same as @samp{-x} or @samp{-nd},
|
1999-12-02 02:42:23 -05:00
|
|
|
respectively.
|
|
|
|
|
2003-09-10 15:41:54 -04:00
|
|
|
@item dns_cache = on/off
|
|
|
|
Turn DNS caching on/off. Since DNS caching is on by default, this
|
2005-04-27 14:23:41 -04:00
|
|
|
option is normally used to turn it off and is equivalent to
|
|
|
|
@samp{--no-dns-cache}.
|
2003-09-10 15:41:54 -04:00
|
|
|
|
2003-09-21 00:45:37 -04:00
|
|
|
@item dns_timeout = @var{n}
|
|
|
|
Set the DNS timeout---the same as @samp{--dns-timeout}.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item domains = @var{string}
|
2001-11-30 21:36:21 -05:00
|
|
|
Same as @samp{-D} (@pxref{Spanning Hosts}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item dot_bytes = @var{n}
|
|
|
|
Specify the number of bytes ``contained'' in a dot, as seen throughout
|
|
|
|
the retrieval (1024 by default). You can postfix the value with
|
|
|
|
@samp{k} or @samp{m}, representing kilobytes and megabytes,
|
|
|
|
respectively. With dot settings you can tailor the dot retrieval to
|
|
|
|
suit your needs, or you can use the predefined @dfn{styles}
|
2000-11-14 17:49:07 -05:00
|
|
|
(@pxref{Download Options}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2007-10-03 14:09:51 -04:00
|
|
|
@item dot_spacing = @var{n}
|
|
|
|
Specify the number of dots in a single cluster (10 by default).
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item dots_in_line = @var{n}
|
|
|
|
Specify the number of dots that will be printed in each line throughout
|
|
|
|
the retrieval (50 by default).
|
|
|
|
|
2005-04-27 14:23:41 -04:00
|
|
|
@item egd_file = @var{file}
|
2005-04-24 05:21:07 -04:00
|
|
|
Use @var{file} as the EGD socket file name. The same as
|
2005-04-27 14:23:41 -04:00
|
|
|
@samp{--egd-file=@var{file}}.
|
2005-04-24 05:21:07 -04:00
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item exclude_directories = @var{string}
|
|
|
|
Specify a comma-separated list of directories you wish to exclude from
|
2005-04-27 14:23:41 -04:00
|
|
|
download---the same as @samp{-X @var{string}} (@pxref{Directory-Based
|
|
|
|
Limits}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item exclude_domains = @var{string}
|
2005-04-27 14:23:41 -04:00
|
|
|
Same as @samp{--exclude-domains=@var{string}} (@pxref{Spanning
|
|
|
|
Hosts}).
|
1999-12-02 02:42:23 -05:00
|
|
|
|
|
|
|
@item follow_ftp = on/off
|
2001-11-21 19:52:19 -05:00
|
|
|
Follow @sc{ftp} links from @sc{html} documents---the same as
|
|
|
|
@samp{--follow-ftp}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2000-03-11 01:48:06 -05:00
|
|
|
@item follow_tags = @var{string}
|
2005-04-27 14:23:41 -04:00
|
|
|
Only follow certain @sc{html} tags when doing a recursive retrieval,
|
|
|
|
just like @samp{--follow-tags=@var{string}}.
|
2000-03-11 01:48:06 -05:00
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item force_html = on/off
|
|
|
|
If set to on, force the input filename to be regarded as an @sc{html}
|
2000-11-16 07:35:27 -05:00
|
|
|
document---the same as @samp{-F}.
|
1999-12-02 02:42:23 -05:00
|
|
|
|
2005-04-27 17:30:22 -04:00
|
|
|
@item ftp_password = @var{string}
|
2005-04-23 05:43:21 -04:00
|
|
|
Set your @sc{ftp} password to @var{string}. Without this setting, the
|
|
|
|
password defaults to @samp{-wget@@}, which is a useful default for
|
|
|
|
anonymous @sc{ftp} access.
|
|
|
|
|
|
|
|
This command used to be named @code{passwd} prior to Wget 1.10.
|
|
|
|
|
1999-12-02 02:42:23 -05:00
|
|
|
@item ftp_proxy = @var{string}
|
|
|
|
Use @var{string} as @sc{ftp} proxy, instead of the one specified in
|
|
|
|
environment.
|
|
|
|
|
2005-04-27 17:30:22 -04:00
|
|
|
@item ftp_user = @var{string}
|
|
|
|
Set @sc{ftp} user to @var{string}.
|
|
|
|
|
|
|
|
This command used to be named @code{login} prior to Wget 1.10.

@item glob = on/off
Turn globbing on/off---the same as @samp{--glob} and @samp{--no-glob}.

@item header = @var{string}
Define a header for HTTP downloads, like using
@samp{--header=@var{string}}.

@item adjust_extension = on/off
Add a @samp{.html} extension to @samp{text/html} or
@samp{application/xhtml+xml} files that lack one, or a @samp{.css}
extension to @samp{text/css} files that lack one, like
@samp{-E}. Previously named @samp{html_extension} (still acceptable,
but deprecated).

@item http_keep_alive = on/off
Turn the keep-alive feature on or off (defaults to on). Turning it
off is equivalent to @samp{--no-http-keep-alive}.

@item http_password = @var{string}
Set @sc{http} password, equivalent to
@samp{--http-password=@var{string}}.

@item http_proxy = @var{string}
Use @var{string} as @sc{http} proxy, instead of the one specified in
the environment.

@item http_user = @var{string}
Set @sc{http} user to @var{string}, equivalent to
@samp{--http-user=@var{string}}.

@item https_proxy = @var{string}
Use @var{string} as @sc{https} proxy, instead of the one specified in
the environment.

@item ignore_case = on/off
When set to on, match files and directories case insensitively; the
same as @samp{--ignore-case}.

@item ignore_length = on/off
When set to on, ignore the @code{Content-Length} header; the same as
@samp{--ignore-length}.

@item ignore_tags = @var{string}
Ignore certain @sc{html} tags when doing a recursive retrieval, like
@samp{--ignore-tags=@var{string}}.

@item include_directories = @var{string}
Specify a comma-separated list of directories you wish to follow when
downloading---the same as @samp{-I @var{string}}.

@item iri = on/off
When set to on, enable internationalized URI (IRI) support; the same as
@samp{--iri}.

@item inet4_only = on/off
Force connecting to IPv4 addresses, off by default. You can put this
in the global init file to disable Wget's attempts to resolve and
connect to IPv6 hosts. Available only if Wget was compiled with IPv6
support. The same as @samp{--inet4-only} or @samp{-4}.

@item inet6_only = on/off
Force connecting to IPv6 addresses, off by default. Available only if
Wget was compiled with IPv6 support. The same as @samp{--inet6-only}
or @samp{-6}.

@item input = @var{file}
Read the @sc{url}s from @var{file}, like @samp{-i @var{file}}.

@item keep_session_cookies = on/off
When specified, causes @samp{save_cookies = on} to also save session
cookies. See @samp{--keep-session-cookies}.

@item limit_rate = @var{rate}
Limit the download speed to no more than @var{rate} bytes per second.
The same as @samp{--limit-rate=@var{rate}}.

@item load_cookies = @var{file}
Load cookies from @var{file}. See @samp{--load-cookies @var{file}}.

@item local_encoding = @var{encoding}
Force Wget to use @var{encoding} as the default system encoding. See
@samp{--local-encoding}.

@item logfile = @var{file}
Set logfile to @var{file}, the same as @samp{-o @var{file}}.

@item max_redirect = @var{number}
Specifies the maximum number of redirections to follow for a resource.
See @samp{--max-redirect=@var{number}}.

@item mirror = on/off
Turn mirroring on/off. The same as @samp{-m}.

@item netrc = on/off
Turn reading netrc on or off.

@item no_clobber = on/off
Same as @samp{-nc}.

@item no_parent = on/off
Disallow retrieving outside the directory hierarchy, like
@samp{--no-parent} (@pxref{Directory-Based Limits}).

@item no_proxy = @var{string}
Use @var{string} as the comma-separated list of domains to avoid in
proxy loading, instead of the one specified in the environment.

@item output_document = @var{file}
Set the output filename---the same as @samp{-O @var{file}}.

@item page_requisites = on/off
Download all ancillary documents necessary for a single @sc{html} page to
display properly---the same as @samp{-p}.

@item passive_ftp = on/off
Change the setting of passive @sc{ftp}, equivalent to the
@samp{--passive-ftp} option.

@item password = @var{string}
Specify password @var{string} for both @sc{ftp} and @sc{http} file retrieval.
This command can be overridden using the @samp{ftp_password} and
@samp{http_password} commands for @sc{ftp} and @sc{http} respectively.

@item post_data = @var{string}
Use POST as the method for all HTTP requests and send @var{string} in
the request body. The same as @samp{--post-data=@var{string}}.

@item post_file = @var{file}
Use POST as the method for all HTTP requests and send the contents of
@var{file} in the request body. The same as
@samp{--post-file=@var{file}}.

@item prefer_family = none/IPv4/IPv6
When given a choice of several addresses, connect to the addresses
with the specified address family first. The address order returned by
DNS is used without change by default. The same as @samp{--prefer-family},
which see for a detailed discussion of why this is useful.

@item private_key = @var{file}
Set the private key file to @var{file}. The same as
@samp{--private-key=@var{file}}.

@item private_key_type = @var{string}
Specify the type of the private key, legal values being @samp{PEM}
(the default) and @samp{DER} (aka ASN1). The same as
@samp{--private-key-type=@var{string}}.

@item progress = @var{string}
Set the type of the progress indicator. Legal types are @samp{dot}
and @samp{bar}. Equivalent to @samp{--progress=@var{string}}.

@item protocol_directories = on/off
When set, use the protocol name as a directory component of local file
names. The same as @samp{--protocol-directories}.

@item proxy_password = @var{string}
Set proxy authentication password to @var{string}, like
@samp{--proxy-password=@var{string}}.

@item proxy_user = @var{string}
Set proxy authentication user name to @var{string}, like
@samp{--proxy-user=@var{string}}.

@item quiet = on/off
Quiet mode---the same as @samp{-q}.

@item quota = @var{quota}
Specify the download quota, which is useful to put in the global
@file{wgetrc}. When the download quota is specified, Wget will stop
retrieving after the download sum has become greater than the quota. The
quota can be specified in bytes (default), kbytes (@samp{k} appended) or
mbytes (@samp{m} appended). Thus @samp{quota = 5m} will set the quota
to 5 megabytes. Note that the user's startup file overrides system
settings.

@item random_file = @var{file}
Use @var{file} as a source of randomness on systems lacking
@file{/dev/random}.

@item random_wait = on/off
Turn random between-request wait times on or off. The same as
@samp{--random-wait}.

@item read_timeout = @var{n}
Set the read (and write) timeout---the same as
@samp{--read-timeout=@var{n}}.

@item reclevel = @var{n}
Recursion level (depth)---the same as @samp{-l @var{n}}.

@item recursive = on/off
Recursive on/off---the same as @samp{-r}.

@item referer = @var{string}
Set HTTP @samp{Referer:} header just like
@samp{--referer=@var{string}}. (Note that it was the folks who wrote
the @sc{http} spec who got the spelling of ``referrer'' wrong.)

@item relative_only = on/off
Follow only relative links---the same as @samp{-L} (@pxref{Relative
Links}).

@item remote_encoding = @var{encoding}
Force Wget to use @var{encoding} as the default remote server encoding.
See @samp{--remote-encoding}.

@item remove_listing = on/off
If set to on, remove @sc{ftp} listings downloaded by Wget. Setting it
to off is the same as @samp{--no-remove-listing}.

@item restrict_file_names = unix/windows
Restrict the file names generated by Wget from URLs. See
@samp{--restrict-file-names} for a more detailed description.

@item retr_symlinks = on/off
When set to on, retrieve symbolic links as if they were plain files; the
same as @samp{--retr-symlinks}.

@item retry_connrefused = on/off
When set to on, consider ``connection refused'' a transient
error---the same as @samp{--retry-connrefused}.

@item robots = on/off
Specify whether the norobots convention is respected by Wget, ``on'' by
default. This switch controls both the @file{/robots.txt} and the
@samp{nofollow} aspect of the spec. @xref{Robot Exclusion}, for more
details about this. Be sure you know what you are doing before turning
this off.

@item save_cookies = @var{file}
Save cookies to @var{file}. The same as @samp{--save-cookies
@var{file}}.

@item save_headers = on/off
Same as @samp{--save-headers}.

@item secure_protocol = @var{string}
Choose the secure protocol to be used. Legal values are @samp{auto}
(the default), @samp{SSLv2}, @samp{SSLv3}, and @samp{TLSv1}. The same
as @samp{--secure-protocol=@var{string}}.

@item server_response = on/off
Choose whether or not to print the @sc{http} and @sc{ftp} server
responses---the same as @samp{-S}.

@item show_all_dns_entries = on/off
When a DNS name is resolved, show all the IP addresses, not just the
first three.

@item span_hosts = on/off
Same as @samp{-H}.

@item spider = on/off
Same as @samp{--spider}.

@item strict_comments = on/off
Same as @samp{--strict-comments}.

@item timeout = @var{n}
Set all applicable timeout values to @var{n}, the same as @samp{-T
@var{n}}.

@item timestamping = on/off
Turn timestamping on/off. The same as @samp{-N} (@pxref{Time-Stamping}).

@item use_server_timestamps = on/off
If set to @samp{off}, Wget won't set the local file's timestamp to the
one on the server (same as @samp{--no-use-server-timestamps}).

@item tries = @var{n}
Set the number of retries per @sc{url}---the same as @samp{-t @var{n}}.

@item use_proxy = on/off
When set to off, don't use a proxy even when proxy-related environment
variables are set. In that case it is the same as using
@samp{--no-proxy}.

@item user = @var{string}
Specify username @var{string} for both @sc{ftp} and @sc{http} file retrieval.
This command can be overridden using the @samp{ftp_user} and
@samp{http_user} commands for @sc{ftp} and @sc{http} respectively.

@item user_agent = @var{string}
User agent identification sent to the HTTP server---the same as
@samp{--user-agent=@var{string}}.

@item verbose = on/off
Turn verbose on/off---the same as @samp{-v}/@samp{-nv}.

@item wait = @var{n}
Wait @var{n} seconds between retrievals---the same as @samp{-w
@var{n}}.

@item wait_retry = @var{n}
Wait up to @var{n} seconds between retries of failed retrievals
only---the same as @samp{--waitretry=@var{n}}. Note that this is
turned on by default in the global @file{wgetrc}.
@end table
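
For illustration only (the values below are arbitrary, not recommended
defaults), a few of the commands described above could be combined in a
@file{.wgetrc} like this:

@example
# retry a few times, pause between requests, and throttle bandwidth
tries = 3
wait = 1
limit_rate = 100k
# stop once 50 megabytes have been downloaded
quota = 50m
@end example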

@node Sample Wgetrc, , Wgetrc Commands, Startup File
@section Sample Wgetrc
@cindex sample wgetrc

This is the sample initialization file, as given in the distribution.
It is divided into two sections---one for global usage (suitable for the
global startup file), and one for local usage (suitable for
@file{$HOME/.wgetrc}). Be careful about the things you change.

Note that almost all the lines are commented out. For a command to have
any effect, you must remove the @samp{#} character at the beginning of
its line.

@example
@include sample.wgetrc.munged_for_texi_inclusion
@end example

@node Examples, Various, Startup File, Top
@chapter Examples
@cindex examples

@c man begin EXAMPLES
The examples are divided into three sections loosely based on their
complexity.

@menu
* Simple Usage::         Simple, basic usage of the program.
* Advanced Usage::       Advanced tips.
* Very Advanced Usage::  The hairy stuff.
@end menu

@node Simple Usage, Advanced Usage, Examples, Examples
@section Simple Usage

@itemize @bullet
@item
Say you want to download a @sc{url}. Just type:

@example
wget http://fly.srk.fer.hr/
@end example

@item
But what will happen if the connection is slow, and the file is lengthy?
The connection will probably fail before the whole file is retrieved,
more than once. In this case, Wget will try getting the file until it
either gets the whole of it, or exceeds the default number of retries
(this being 20). It is easy to change the number of tries to 45, to
ensure that the whole file will arrive safely:

@example
wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
@end example

@item
Now let's leave Wget to work in the background, and write its progress
to the log file @file{log}. It is tiring to type @samp{--tries}, so we
shall use @samp{-t}.

@example
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
@end example

The ampersand at the end of the line makes sure that Wget works in the
background. To remove the limit on the number of retries, use
@samp{-t inf}.

@item
Using @sc{ftp} is just as simple. Wget will take care of the login and
password.

@example
wget ftp://gnjilux.srk.fer.hr/welcome.msg
@end example

@item
If you specify a directory, Wget will retrieve the directory listing,
parse it and convert it to @sc{html}. Try:

@example
wget ftp://ftp.gnu.org/pub/gnu/
links index.html
@end example
@end itemize

@node Advanced Usage, Very Advanced Usage, Simple Usage, Examples
@section Advanced Usage

@itemize @bullet
@item
You have a file that contains the URLs you want to download? Use the
@samp{-i} switch:

@example
wget -i @var{file}
@end example

If you specify @samp{-} as the file name, the @sc{url}s will be read
from standard input.

@item
Create a five levels deep mirror image of the GNU web site, with the
same directory structure the original has, with only one try per
document, saving the log of the activities to @file{gnulog}:

@example
wget -r http://www.gnu.org/ -o gnulog
@end example

@item
The same as the above, but convert the links in the downloaded files to
point to local files, so you can view the documents off-line:

@example
wget --convert-links -r http://www.gnu.org/ -o gnulog
@end example

@item
Retrieve only one @sc{html} page, but make sure that all the elements needed
for the page to be displayed, such as inline images and external style
sheets, are also downloaded. Also make sure the downloaded page
references the downloaded links.

@example
wget -p --convert-links http://www.server.com/dir/page.html
@end example

The @sc{html} page will be saved to @file{www.server.com/dir/page.html}, and
the images, stylesheets, etc., somewhere under @file{www.server.com/},
depending on where they were on the remote server.

@item
The same as the above, but without the @file{www.server.com/} directory.
In fact, I don't want to have all those random server directories
anyway---just save @emph{all} those files under a @file{download/}
subdirectory of the current directory.

@example
wget -p --convert-links -nH -nd -Pdownload \
     http://www.server.com/dir/page.html
@end example

@item
Retrieve the index.html of @samp{www.lycos.com}, showing the original
server headers:

@example
wget -S http://www.lycos.com/
@end example

@item
Save the server headers with the file, perhaps for post-processing.

@example
wget --save-headers http://www.lycos.com/
more index.html
@end example

@item
Retrieve the first two levels of @samp{wuarchive.wustl.edu}, saving them
to @file{/tmp}.

@example
wget -r -l2 -P/tmp ftp://wuarchive.wustl.edu/
@end example

@item
You want to download all the @sc{gif}s from a directory on an @sc{http}
server. You tried @samp{wget http://www.server.com/dir/*.gif}, but that
didn't work because @sc{http} retrieval does not support globbing. In
that case, use:

@example
wget -r -l1 --no-parent -A.gif http://www.server.com/dir/
@end example

More verbose, but the effect is the same. @samp{-r -l1} means to
retrieve recursively (@pxref{Recursive Download}), with a maximum depth
of 1. @samp{--no-parent} means that references to the parent directory
are ignored (@pxref{Directory-Based Limits}), and @samp{-A.gif} means to
download only the @sc{gif} files. @samp{-A "*.gif"} would have worked
too.

@item
Suppose you were in the middle of downloading when Wget was
interrupted. Now you do not want to clobber the files already present.
In that case, use:

@example
wget -nc -r http://www.gnu.org/
@end example

@item
If you want to encode your own username and password to @sc{http} or
@sc{ftp}, use the appropriate @sc{url} syntax (@pxref{URL Format}).

@example
wget ftp://hniksic:mypassword@@unix.server.com/.emacs
@end example

Note, however, that this usage is not advisable on multi-user systems
because it reveals your password to anyone who looks at the output of
@code{ps}.

@cindex redirecting output
@item
You would like the output documents to go to standard output instead of
to files?

@example
wget -O - http://jagor.srce.hr/ http://www.srce.hr/
@end example

You can also combine the two options and make pipelines to retrieve the
documents from remote hotlists:

@example
wget -O - http://cool.list.com/ | wget --force-html -i -
@end example
@end itemize

@node Very Advanced Usage, , Advanced Usage, Examples
@section Very Advanced Usage

@cindex mirroring
@itemize @bullet
@item
If you wish Wget to keep a mirror of a page (or @sc{ftp}
subdirectories), use @samp{--mirror} (@samp{-m}), which is the shorthand
for @samp{-r -l inf -N}. You can put Wget in the crontab file asking it
to recheck a site each Sunday:

@example
crontab
0 0 * * 0 wget --mirror http://www.gnu.org/ -o /home/me/weeklog
@end example

@item
In addition to the above, you want the links to be converted for local
viewing. But, after having read this manual, you know that link
conversion doesn't play well with timestamping, so you also want Wget to
back up the original @sc{html} files before the conversion. The Wget
invocation would look like this:

@example
wget --mirror --convert-links --backup-converted \
     http://www.gnu.org/ -o /home/me/weeklog
@end example

@item
But you've also noticed that local viewing doesn't work all that well
when @sc{html} files are saved under extensions other than @samp{.html},
perhaps because they were served as @file{index.cgi}. So you'd like
Wget to rename all the files served with content-type @samp{text/html}
or @samp{application/xhtml+xml} to @file{@var{name}.html}.

@example
wget --mirror --convert-links --backup-converted \
     --html-extension -o /home/me/weeklog \
     http://www.gnu.org/
@end example

Or, with less typing:

@example
wget -m -k -K -E http://www.gnu.org/ -o /home/me/weeklog
@end example
@end itemize
@c man end

@node Various, Appendices, Examples, Top
@chapter Various
@cindex various

This chapter contains all the stuff that could not fit anywhere else.

@menu
* Proxies::             Support for proxy servers.
* Distribution::        Getting the latest version.
* Web Site::            GNU Wget's presence on the World Wide Web.
* Mailing Lists::       Wget mailing list for announcements and discussion.
* Internet Relay Chat:: Wget's presence on IRC.
* Reporting Bugs::      How and where to report bugs.
* Portability::         The systems Wget works on.
* Signals::             Signal-handling performed by Wget.
@end menu

@node Proxies, Distribution, Various, Various
@section Proxies
@cindex proxies

@dfn{Proxies} are special-purpose @sc{http} servers designed to transfer
data from remote servers to local clients. One typical use of proxies
is lightening the network load for users behind a slow connection. This
is achieved by channeling all @sc{http} and @sc{ftp} requests through
the proxy, which caches the transferred data. When a cached resource is
requested again, the proxy will return the data from its cache. Another
use for proxies is for companies that separate (for security reasons)
their internal networks from the rest of the Internet. In order to
obtain information from the Web, their users connect and retrieve remote
data using an authorized proxy.

@c man begin ENVIRONMENT
Wget supports proxies for both @sc{http} and @sc{ftp} retrievals. The
standard way to specify the proxy location, which Wget recognizes, is
using the following environment variables:

@table @env
@item http_proxy
@itemx https_proxy
If set, the @env{http_proxy} and @env{https_proxy} variables should
contain the @sc{url}s of the proxies for @sc{http} and @sc{https}
connections respectively.

@item ftp_proxy
This variable should contain the @sc{url} of the proxy for @sc{ftp}
connections. It is quite common that @env{http_proxy} and
@env{ftp_proxy} are set to the same @sc{url}.

@item no_proxy
This variable should contain a comma-separated list of domain extensions
the proxy should @emph{not} be used for. For instance, if the value of
@env{no_proxy} is @samp{.mit.edu}, the proxy will not be used to retrieve
documents from MIT.
@end table
@c man end
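
For example, in a Bourne-style shell you might point Wget at the
hypothetical proxy used elsewhere in this section like this:

@example
http_proxy=http://proxy.company.com:8001/
ftp_proxy=$http_proxy
no_proxy=.mit.edu
export http_proxy ftp_proxy no_proxy
@end example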

In addition to the environment variables, the proxy location and settings
may be specified from within Wget itself.

@table @samp
@item --no-proxy
@itemx proxy = on/off
This option and the corresponding command may be used to suppress the
use of a proxy, even if the appropriate environment variables are set.

@item http_proxy = @var{URL}
@itemx https_proxy = @var{URL}
@itemx ftp_proxy = @var{URL}
@itemx no_proxy = @var{string}
These startup file variables allow you to override the proxy settings
specified by the environment.
@end table

Some proxy servers require authorization to enable you to use them. The
authorization consists of a @dfn{username} and a @dfn{password}, which
must be sent by Wget. As with @sc{http} authorization, several
authentication schemes exist. For proxy authorization only the
@code{Basic} authentication scheme is currently implemented.

You may specify your username and password either through the proxy
@sc{url} or through the command-line options. Assuming that the
company's proxy is located at @samp{proxy.company.com} at port 8001, a
proxy @sc{url} location containing authorization data might look like
this:

@example
http://hniksic:mypassword@@proxy.company.com:8001/
@end example

Alternatively, you may use the @samp{proxy-user} and
@samp{proxy-password} options, and the equivalent @file{.wgetrc}
settings @code{proxy_user} and @code{proxy_password} to set the proxy
username and password.
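
For instance, using the same made-up credentials as above, either of
the following would send the same authorization data:

@example
wget --proxy-user=hniksic --proxy-password=mypassword \
     http://www.server.com/
@end example

or, in @file{.wgetrc}:

@example
proxy_user = hniksic
proxy_password = mypassword
@end example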

@node Distribution, Web Site, Proxies, Various
@section Distribution
@cindex latest version

Like all GNU utilities, the latest version of Wget can be found at the
master GNU archive site ftp.gnu.org, and its mirrors. For example,
Wget @value{VERSION} can be found at
@url{ftp://ftp.gnu.org/pub/gnu/wget/wget-@value{VERSION}.tar.gz}

@node Web Site, Mailing Lists, Distribution, Various
@section Web Site
@cindex web site

The official web site for GNU Wget is at
@url{http://www.gnu.org/software/wget/}. However, most of the useful
information resides at ``The Wget Wgiki'',
@url{http://wget.addictivecode.org/}.

@node Mailing Lists, Internet Relay Chat, Web Site, Various
@section Mailing Lists
@cindex mailing list
@cindex list

@unnumberedsubsec Primary List

The primary mailing list for discussion, bug reports, or questions
about GNU Wget is at @email{bug-wget@@gnu.org}. To subscribe, send an
email to @email{bug-wget-join@@gnu.org}, or visit
@url{http://lists.gnu.org/mailman/listinfo/bug-wget}.

You do not need to subscribe to send a message to the list; however,
please note that unsubscribed messages are moderated, and may take a
while before they hit the list---@strong{usually around a day}. If
you want your message to show up immediately, please subscribe to the
list before posting. Archives for the list may be found at
@url{http://lists.gnu.org/pipermail/bug-wget/}.

An NNTP/Usenettish gateway is also available via
@uref{http://gmane.org/about.php,Gmane}. You can see the Gmane
archives at
@url{http://news.gmane.org/gmane.comp.web.wget.general}. Note that the
Gmane archives conveniently include messages from both the current
list, and the previous one. Messages also show up in the Gmane
archives sooner than they do at @url{lists.gnu.org}.

@unnumberedsubsec Bug Notices List

Additionally, there is the @email{wget-notify@@addictivecode.org} mailing
list. This is a non-discussion list that receives bug report
notifications from the bug-tracker. To subscribe to this list,
send an email to @email{wget-notify-join@@addictivecode.org},
or visit @url{http://addictivecode.org/mailman/listinfo/wget-notify}.

@unnumberedsubsec Obsolete Lists

Previously, the mailing list @email{wget@@sunsite.dk} was used as the
main discussion list, and another list,
@email{wget-patches@@sunsite.dk}, was used for submitting and
discussing patches to GNU Wget.

Messages from @email{wget@@sunsite.dk} are archived at
@itemize @tie{}
@item
@url{http://www.mail-archive.com/wget%40sunsite.dk/} and at
@item
@url{http://news.gmane.org/gmane.comp.web.wget.general} (which also
continues to archive the current list, @email{bug-wget@@gnu.org}).
@end itemize

Messages from @email{wget-patches@@sunsite.dk} are archived at
@itemize @tie{}
@item
@url{http://news.gmane.org/gmane.comp.web.wget.patches}.
@end itemize

@node Internet Relay Chat, Reporting Bugs, Mailing Lists, Various
@section Internet Relay Chat
@cindex Internet Relay Chat
@cindex IRC
@cindex #wget

In addition to the mailing lists, we also have a support channel set up
via IRC at @code{irc.freenode.org}, @code{#wget}. Come check it out!

@node Reporting Bugs, Portability, Internet Relay Chat, Various
@section Reporting Bugs
@cindex bugs
@cindex reporting bugs
@cindex bug reports

@c man begin BUGS
You are welcome to submit bug reports via the GNU Wget bug tracker (see
@url{http://wget.addictivecode.org/BugTracker}).

Before actually submitting a bug report, please try to follow a few
simple guidelines.

@enumerate
@item
Please try to ascertain that the behavior you see really is a bug. If
Wget crashes, it's a bug. If Wget does not behave as documented,
it's a bug. If things work strangely, but you are not sure about the way
they are supposed to work, it might well be a bug, but you might want to
double-check the documentation and the mailing lists (@pxref{Mailing
Lists}).

@item
Try to repeat the bug in as simple circumstances as possible. E.g. if
Wget crashes while running @samp{wget -rl0 -kKE -t5 --no-proxy
http://yoyodyne.com -o /tmp/log}, you should try to see if the crash is
repeatable, and if it will occur with a simpler set of options. You might
even try to start the download at the page where the crash occurred to
see if that page somehow triggered the crash.

Also, while I will probably be interested to know the contents of your
@file{.wgetrc} file, just dumping it into the debug message is probably
a bad idea. Instead, you should first try to see if the bug repeats
with @file{.wgetrc} moved out of the way. Only if it turns out that the
@file{.wgetrc} settings affect the bug should you mail me the relevant
parts of the file.

@item
Please start Wget with the @samp{-d} option and send us the resulting
output (or relevant parts thereof). If Wget was compiled without
debug support, recompile it---it is @emph{much} easier to trace bugs
with debug support on.

Note: please make sure to remove any potentially sensitive information
from the debug log before sending it to the bug address. The
@code{-d} option won't go out of its way to collect sensitive information,
but the log @emph{will} contain a fairly complete transcript of Wget's
communication with the server, which may include passwords and pieces
of downloaded data. Since the bug address is publicly archived, you
may assume that all bug reports are visible to the public.

@item
If Wget has crashed, try to run it in a debugger, e.g. @code{gdb `which
wget` core} and type @code{where} to get the backtrace. This may not
work if the system administrator has disabled core files, but it is
safe to try.
@end enumerate
@c man end

@node Portability, Signals, Reporting Bugs, Various
@section Portability
@cindex portability
@cindex operating systems

Like all GNU software, Wget works on the GNU system. However, since it
uses GNU Autoconf for building and configuring, and mostly avoids using
``special'' features of any particular Unix, it should compile (and
work) on all common Unix flavors.

Various Wget versions have been compiled and tested under many kinds of
Unix systems, including GNU/Linux, Solaris, SunOS 4.x, Mac OS X, OSF
(aka Digital Unix or Tru64), Ultrix, *BSD, IRIX, AIX, and others. Some
of those systems are no longer in widespread use and may not be able to
support recent versions of Wget. If Wget fails to compile on your
system, we would like to know about it.

Thanks to kind contributors, this version of Wget compiles and works
on 32-bit Microsoft Windows platforms. It has been compiled
successfully using MS Visual C++ 6.0, Watcom, Borland C, and GCC
compilers. Naturally, it lacks some of the features available on
Unix, but it should work as a substitute for people stuck with
Windows. Note that Windows-specific portions of Wget are not
guaranteed to be supported in the future, although this has been the
case in practice for many years now. All questions and problems in
Windows usage should be reported to the Wget mailing list at
@email{wget@@sunsite.dk}, where the volunteers who maintain the
Windows-related features might look at them.

Support for building on MS-DOS via DJGPP has been contributed by Gisle
Vanem; a port to VMS is maintained by Steven Schweda, and is available
at @url{http://antinode.org/}.

@node Signals, , Portability, Various
@section Signals
@cindex signal handling
@cindex hangup

Since the purpose of Wget is background work, it catches the hangup
signal (@code{SIGHUP}) and ignores it. If the output was on standard
output, it will be redirected to a file named @file{wget-log}.
Otherwise, @code{SIGHUP} is ignored. This is convenient when you wish
to redirect the output of Wget after having started it.

@example
$ wget http://www.gnus.org/dist/gnus.tar.gz &
...
$ kill -HUP %%
SIGHUP received, redirecting output to `wget-log'.
@end example

Other than that, Wget will not try to interfere with signals in any way.
@kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike.

@node Appendices, Copying this manual, Various, Top
@chapter Appendices

This chapter contains some references I consider useful.

@menu
* Robot Exclusion::         Wget's support for RES.
* Security Considerations:: Security with Wget.
* Contributors::            People who helped.
@end menu

@node Robot Exclusion, Security Considerations, Appendices, Appendices
@section Robot Exclusion
@cindex robot exclusion
@cindex robots.txt
@cindex server maintenance

It is extremely easy to make Wget wander aimlessly around a web site,
sucking all the available data in the process. @samp{wget -r @var{site}},
and you're set. Great? Not for the server admin.

As long as Wget is only retrieving static pages, and doing it at a
reasonable rate (see the @samp{--wait} option), there's not much of a
problem. The trouble is that Wget can't tell the difference between the
smallest static page and the most demanding CGI. A site I know has a
section handled by a CGI Perl script that converts Info files to @sc{html} on
the fly. The script is slow, but works well enough for human users
viewing an occasional Info file. However, when someone's recursive Wget
download stumbles upon the index page that links to all the Info files
through the script, the system is brought to its knees without providing
anything useful to the user. (This task of converting Info files could be
done locally; access to Info documentation for all installed GNU
software on a system is available from the @code{info} command.)

To avoid this kind of accident, as well as to preserve privacy for
documents that need to be protected from well-behaved robots, the
concept of @dfn{robot exclusion} was invented. The idea is that
the server administrators and document authors can specify which
portions of the site they wish to protect from robots and those
to which they will permit access.

The most popular mechanism, and the @i{de facto} standard supported by
all the major robots, is the ``Robots Exclusion Standard'' (RES) written
by Martijn Koster et al. in 1994. It specifies the format of a text
file containing directives that instruct the robots which URL paths to
avoid. To be found by the robots, the specifications must be placed in
@file{/robots.txt} in the server root, which the robots are expected to
download and parse.
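
As an illustration only, a made-up @file{/robots.txt} in that format
might look like this; it asks all robots to stay out of two directory
trees:

@example
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
@end example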

Although Wget is not a web robot in the strictest sense of the word, it
can download large parts of a site without the user intervening to
request each individual page. Because of that, Wget honors RES when
downloading recursively. For instance, when you issue:

@example
wget -r http://www.server.com/
@end example

First the index of @samp{www.server.com} will be downloaded. If Wget
finds that it wants to download more documents from that server, it will
request @samp{http://www.server.com/robots.txt} and, if found, use it
for further downloads. @file{robots.txt} is loaded only once per
server.

Until version 1.8, Wget supported the first version of the standard,
written by Martijn Koster in 1994 and available at
@url{http://www.robotstxt.org/wc/norobots.html}. As of version 1.8,
Wget has supported the additional directives specified in the internet
draft @samp{<draft-koster-robots-00.txt>} titled ``A Method for Web
Robots Control''. The draft, which has, as far as I know, never made it
to an @sc{rfc}, is available at
@url{http://www.robotstxt.org/wc/norobots-rfc.txt}.

This manual no longer includes the text of the Robot Exclusion Standard.

The second, less-known mechanism enables the author of an individual
document to specify whether they want the links from the file to be
followed by a robot. This is achieved using the @code{META} tag, like
this:

@example
<meta name="robots" content="nofollow">
@end example

This is explained in some detail at
@url{http://www.robotstxt.org/wc/meta-user.html}. Wget supports this
method of robot exclusion in addition to the usual @file{/robots.txt}
exclusion.

If you know what you are doing and really really wish to turn off the
robot exclusion, set the @code{robots} variable to @samp{off} in your
@file{.wgetrc}. You can achieve the same effect from the command line
using the @code{-e} switch, e.g. @samp{wget -e robots=off @var{url}...}.
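
For instance, to repeat the earlier recursive example while ignoring
robot exclusion, you could run:

@example
wget -e robots=off -r http://www.server.com/
@end example

The equivalent @file{.wgetrc} line is simply @samp{robots = off}.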

@node Security Considerations, Contributors, Robot Exclusion, Appendices
@section Security Considerations
@cindex security

When using Wget, you must be aware that it sends unencrypted passwords
through the network, which may present a security problem. Here are the
main issues, and some solutions.

@enumerate
@item
The passwords on the command line are visible using @code{ps}. The best
way around it is to use @code{wget -i -} and feed the @sc{url}s to
Wget's standard input, each on a separate line, terminated by @kbd{C-d}
(see the example after this list). Another workaround is to use
@file{.netrc} to store passwords; however, storing unencrypted passwords
is also considered a security risk.

@item
Using the insecure @dfn{basic} authentication scheme, unencrypted
passwords are transmitted through the network routers and gateways.

@item
The @sc{ftp} passwords are also in no way encrypted. There is no good
solution for this at the moment.

@item
Although the ``normal'' output of Wget tries to hide the passwords,
debugging logs show them, in all forms. This problem is avoided by
being careful when you send debug logs (yes, even when you send them to
me).
@end enumerate
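
As a sketch of the first workaround, reusing the made-up credentials
from the Examples chapter, you could type the @sc{url} interactively:

@example
$ wget -i -
ftp://hniksic:mypassword@@unix.server.com/.emacs
@end example

@noindent
and finish the input with @kbd{C-d}. Because the @sc{url} is read from
standard input, it never appears in the @code{ps} listing.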

@node Contributors, , Security Considerations, Appendices
@section Contributors
@cindex contributors

@iftex
GNU Wget was written by Hrvoje Nik@v{s}i@'{c} @email{hniksic@@xemacs.org}.
@end iftex
@ifnottex
GNU Wget was written by Hrvoje Niksic @email{hniksic@@xemacs.org}.
@end ifnottex

However, the development of Wget could never have gone as far as it has, were
it not for the help of many people, either with bug reports, feature proposals,
patches, or letters saying ``Thanks!''.

Special thanks goes to the following people (no particular order):

@itemize @bullet
@item Dan Harkless---contributed a lot of code and documentation of
extremely high quality, as well as the @code{--page-requisites} and
related options. He was the principal maintainer for some time and
released Wget 1.6.

@item Ian Abbott---contributed bug fixes, Windows-related fixes, and
provided a prototype implementation of the breadth-first recursive
download. Co-maintained Wget during the 1.8 release cycle.

@item
The dotsrc.org crew, in particular Karsten Thygesen---donated system
resources such as the mailing list, web space, @sc{ftp} space, and
version control repositories, along with a lot of time to make these
actually work. Christian Reiniger was of invaluable help with setting
up Subversion.

@item
Heiko Herold---provided high-quality Windows builds and contributed
bug and build reports for many years.

@item
Shawn McHorse---bug reports and patches.

@item
Kaveh R. Ghazi---on-the-fly @code{ansi2knr}-ization. Lots of
portability fixes.

@item
Gordon Matzigkeit---@file{.netrc} support.

@item
@iftex
Zlatko @v{C}alu@v{s}i@'{c}, Tomislav Vujec and Dra@v{z}en
Ka@v{c}ar---feature suggestions and ``philosophical'' discussions.
@end iftex
@ifnottex
Zlatko Calusic, Tomislav Vujec and Drazen Kacar---feature suggestions
and ``philosophical'' discussions.
@end ifnottex

@item
Darko Budor---initial port to Windows.

@item
Antonio Rosella---help and suggestions, plus the initial Italian
translation.

@item
@iftex
Tomislav Petrovi@'{c}, Mario Miko@v{c}evi@'{c}---many bug reports and
suggestions.
@end iftex
@ifnottex
Tomislav Petrovic, Mario Mikocevic---many bug reports and suggestions.
@end ifnottex

@item
@iftex
Fran@,{c}ois Pinard---many thorough bug reports and discussions.
@end iftex
@ifnottex
Francois Pinard---many thorough bug reports and discussions.
@end ifnottex

@item
Karl Eichwalder---lots of help with internationalization, Makefile
layout and many other things.

@item
Junio Hamano---donated support for Opie and @sc{http} @code{Digest}
authentication.

@item
Mauro Tortonesi---improved IPv6 support, adding support for dual
family systems. Refactored and enhanced FTP IPv6 code. Maintained GNU
Wget from 2004--2007.

@item
Christopher G.@: Lewis---maintenance of the Windows version of GNU Wget.

@item
Gisle Vanem---many helpful patches and improvements, especially for
Windows and MS-DOS support.

@item
Ralf Wildenhues---contributed patches to convert Wget to use Automake as
part of its build process, and various bugfixes.

@item
Steven Schubiger---many helpful patches, bugfixes and improvements.
Notably, conversion of Wget to use the Gnulib quotes and quoteargs
modules, and the addition of password prompts at the console, via the
Gnulib getpasswd-gnu module.

@item
Ted Mielczarek---donated support for CSS.

@item
Saint Xavier---support for IRIs (RFC 3987).

@item
People who provided donations for development---including Brian Gough.
@end itemize
|
|
|
|
|
|
|
|
The following people have provided patches, bug/build reports, useful
|
|
|
|
suggestions, beta testing services, fan mail and all the other things
|
|
|
|
that make maintenance so much fun:
|
|
|
|
|
|
|
|
Tim Adam,
Adrian Aichner,
Martin Baehr,
Dieter Baron,
Roger Beeman,
Dan Berger,
T.@: Bharath,
Christian Biere,
Paul Bludov,
Daniel Bodea,
Mark Boyns,
John Burden,
Julien Buty,
Wanderlei Cavassin,
Gilles Cedoc,
Tim Charron,
Noel Cragg,
@iftex
Kristijan @v{C}onka@v{s},
@end iftex
@ifnottex
Kristijan Conkas,
@end ifnottex
John Daily,
Andreas Damm,
Ahmon Dancy,
Andrew Davison,
Bertrand Demiddelaer,
Alexander Dergachev,
Andrew Deryabin,
Ulrich Drepper,
Marc Duponcheel,
@iftex
Damir D@v{z}eko,
@end iftex
@ifnottex
Damir Dzeko,
@end ifnottex
Alan Eldridge,
Hans-Andreas Engel,
@iftex
Aleksandar Erkalovi@'{c},
@end iftex
@ifnottex
Aleksandar Erkalovic,
@end ifnottex
Andy Eskilsson,
@iftex
Jo@~{a}o Ferreira,
@end iftex
@ifnottex
Joao Ferreira,
@end ifnottex
Christian Fraenkel,
David Fritz,
Mike Frysinger,
Charles C.@: Fu,
FUJISHIMA Satsuki,
Masashi Fujita,
Howard Gayle,
Marcel Gerrits,
Lemble Gregory,
Hans Grobler,
Alain Guibert,
Mathieu Guillaume,
Aaron Hawley,
Jochen Hein,
Karl Heuer,
Madhusudan Hosaagrahara,
HIROSE Masaaki,
Ulf Harnhammar,
Gregor Hoffleit,
Erik Magnus Hulthen,
Richard Huveneers,
Jonas Jensen,
Larry Jones,
Simon Josefsson,
@iftex
Mario Juri@'{c},
@end iftex
@ifnottex
Mario Juric,
@end ifnottex
@iftex
Hack Kampbj@o{}rn,
@end iftex
@ifnottex
Hack Kampbjorn,
@end ifnottex
Const Kaplinsky,
@iftex
Goran Kezunovi@'{c},
@end iftex
@ifnottex
Goran Kezunovic,
@end ifnottex
Igor Khristophorov,
Robert Kleine,
KOJIMA Haime,
Fila Kolodny,
Alexander Kourakos,
Martin Kraemer,
Sami Krank,
Jay Krell,
@tex
$\Sigma\acute{\iota}\mu o\varsigma\;
\Xi\varepsilon\nu\iota\tau\acute{\epsilon}\lambda\lambda\eta\varsigma$
(Simos KSenitellis),
@end tex
@ifnottex
Simos KSenitellis,
@end ifnottex
Christian Lackas,
Hrvoje Lacko,
Daniel S.@: Lewart,
@iftex
Nicol@'{a}s Lichtmeier,
@end iftex
@ifnottex
Nicolas Lichtmeier,
@end ifnottex
Dave Love,
Alexander V.@: Lukyanov,
@iftex
Thomas Lu@ss{}nig,
@end iftex
@ifnottex
Thomas Lussnig,
@end ifnottex
Andre Majorel,
Aurelien Marchand,
Matthew J.@: Mellon,
Jordan Mendelson,
Ted Mielczarek,
Robert Millan,
Lin Zhe Min,
Jan Minar,
Tim Mooney,
Keith Moore,
Adam D.@: Moss,
Simon Munton,
Charlie Negyesi,
R.@: K.@: Owen,
Jim Paris,
Kenny Parnell,
Leonid Petrov,
Simone Piunno,
Andrew Pollock,
Steve Pothier,
@iftex
Jan P@v{r}ikryl,
@end iftex
@ifnottex
Jan Prikryl,
@end ifnottex
Marin Purgar,
@iftex
Csaba R@'{a}duly,
@end iftex
@ifnottex
Csaba Raduly,
@end ifnottex
Keith Refson,
Bill Richardson,
Tyler Riddle,
Tobias Ringstrom,
Jochen Roderburg,
@c Texinfo doesn't grok @'{@i}, so we have to use TeX itself.
@tex
Juan Jos\'{e} Rodr\'{\i}guez,
@end tex
@ifnottex
Juan Jose Rodriguez,
@end ifnottex
Maciej W.@: Rozycki,
Edward J.@: Sabol,
Heinz Salzmann,
Robert Schmidt,
Nicolas Schodet,
Benno Schulenberg,
Andreas Schwab,
Steven M.@: Schweda,
Chris Seawood,
Pranab Shenoy,
Dennis Smit,
Toomas Soome,
Tage Stabell-Kulo,
Philip Stadermann,
Daniel Stenberg,
Sven Sternberger,
Markus Strasser,
John Summerfield,
Szakacsits Szabolcs,
Mike Thomas,
Philipp Thomas,
Mauro Tortonesi,
Dave Turner,
Gisle Vanem,
Rabin Vincent,
Russell Vincent,
@iftex
@v{Z}eljko Vrba,
@end iftex
@ifnottex
Zeljko Vrba,
@end ifnottex
Charles G Waldman,
Douglas E.@: Wegscheid,
Ralf Wildenhues,
Joshua David Williams,
Benjamin Wolsey,
Saint Xavier,
YAMAZAKI Makoto,
Jasmin Zainul,
@iftex
Bojan @v{Z}drnja,
@end iftex
@ifnottex
Bojan Zdrnja,
@end ifnottex
Kristijan Zimmer,
Xin Zou.

Apologies to all whom I accidentally left out, and many thanks to all the
subscribers of the Wget mailing list.

@node Copying this manual, Concept Index, Appendices, Top
@appendix Copying this manual

@menu
* GNU Free Documentation License:: License for copying this manual.
@end menu

@node GNU Free Documentation License, , Copying this manual, Copying this manual
@appendixsec GNU Free Documentation License
@cindex FDL, GNU Free Documentation License

@include fdl.texi

@node Concept Index, , Copying this manual, Top
@unnumbered Concept Index
@printindex cp

@contents

@bye