\input texinfo @c -*-texinfo-*-
@c %**start of header
@setfilename wget.info
@settitle GNU Wget Manual
@c Disable the monstrous rectangles beside overfull hbox-es.
@finalout
@c Use `odd' to print double-sided.
@setchapternewpage on
@c %**end of header

@iftex
@c Remove this if you don't use A4 paper.
@afourpaper
@end iftex

@c This should really be auto-generated!
@set VERSION 1.7-dev
@set UPDATED Jan 2001

@dircategory Net Utilities
@dircategory World Wide Web
@direntry
* Wget: (wget).         The non-interactive network downloader.
@end direntry

@ifinfo
This file documents the GNU Wget utility for downloading network
data.

@c man begin COPYRIGHT
Copyright @copyright{} 1996, 1997, 1998, 2000, 2001 Free Software
Foundation, Inc.

Permission is granted to make and distribute verbatim copies of
this manual provided the copyright notice and this permission notice
are preserved on all copies.

@ignore
Permission is granted to process this file through TeX and print the
results, provided the printed document carries a copying permission
notice identical to this one except for the removal of this paragraph
(this paragraph not being relevant to the printed manual).
@end ignore
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
any later version published by the Free Software Foundation; with the
Invariant Sections being ``GNU General Public License'' and ``GNU Free
Documentation License'', with no Front-Cover Texts, and with no
Back-Cover Texts.  A copy of the license is included in the section
entitled ``GNU Free Documentation License''.
@c man end
@end ifinfo

@titlepage
@title GNU Wget
@subtitle The noninteractive downloading utility
@subtitle Updated for Wget @value{VERSION}, @value{UPDATED}
@author by Hrvoje Nik@v{s}i@'{c} and the developers

@ignore
@c man begin AUTHOR
Originally written by Hrvoje Niksic <hniksic@@arsdigita.com>.
@c man end
@c man begin SEEALSO
GNU Info entry for @file{wget}.
@c man end
@end ignore

@page
@vskip 0pt plus 1filll
Copyright @copyright{} 1996, 1997, 1998, 2000, 2001 Free Software
Foundation, Inc.

Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
any later version published by the Free Software Foundation; with the
Invariant Sections being ``GNU General Public License'' and ``GNU Free
Documentation License'', with no Front-Cover Texts, and with no
Back-Cover Texts.  A copy of the license is included in the section
entitled ``GNU Free Documentation License''.
@end titlepage

@ifinfo
@node Top, Overview, (dir), (dir)
@top Wget @value{VERSION}

This manual documents version @value{VERSION} of GNU Wget, the freely
available utility for network download.

Copyright @copyright{} 1996, 1997, 1998, 2000 Free Software Foundation, Inc.

@menu
* Overview::            Features of Wget.
* Invoking::            Wget command-line arguments.
* Recursive Retrieval:: Description of recursive retrieval.
* Following Links::     The available methods of chasing links.
* Time-Stamping::       Mirroring according to time-stamps.
* Startup File::        Wget's initialization file.
* Examples::            Examples of usage.
* Various::             The stuff that doesn't fit anywhere else.
* Appendices::          Some useful references.
* Copying::             You may give out copies of Wget and of this manual.
* Concept Index::       Topics covered by this manual.
@end menu
@end ifinfo

@node Overview, Invoking, Top, Top
@chapter Overview
@cindex overview
@cindex features

@c man begin DESCRIPTION
GNU Wget is a freely available network utility to retrieve files from
the World Wide Web, using @sc{http} (Hyper Text Transfer Protocol) and
@sc{ftp} (File Transfer Protocol), the two most widely used Internet
protocols.  It has many useful features to make downloading easier, some
of them being:

@itemize @bullet
@item
Wget is non-interactive, meaning that it can work in the background,
while the user is not logged on.  This allows you to start a retrieval
and disconnect from the system, letting Wget finish the work.  By
contrast, most Web browsers require the user's constant presence, which
can be a great hindrance when transferring a lot of data.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
Wget is capable of descending recursively through the structure of
@sc{html} documents and @sc{ftp} directory trees, making a local copy of
the directory hierarchy similar to the one on the remote server.  This
feature can be used to mirror archives and home pages, or traverse the
web in search of data, like a @sc{www} robot (@pxref{Robots}).  In that
spirit, Wget understands the @code{norobots} convention.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
File name wildcard matching and recursive mirroring of directories are
available when retrieving via @sc{ftp}.  Wget can read the time-stamp
information given by both @sc{http} and @sc{ftp} servers, and store it
locally.  Thus Wget can see if the remote file has changed since the
last retrieval, and automatically retrieve the new version if it has.
This makes Wget suitable for mirroring of @sc{ftp} sites, as well as
home pages.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
Wget works exceedingly well on slow or unstable connections,
retrying the document until it is fully retrieved, or until a
user-specified retry count is surpassed.  It will try to resume the
download from the point of interruption, using @code{REST} with @sc{ftp}
and @code{Range} with @sc{http} servers that support them.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
By default, Wget supports proxy servers, which can lighten the network
load, speed up retrieval and provide access behind firewalls.  However,
if you are behind a firewall that requires that you use a socks style
gateway, you can get the socks library and build Wget with support for
socks.  Wget also supports passive @sc{ftp} downloading as an
option.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
Built-in features offer mechanisms to tune which links you wish to follow
(@pxref{Following Links}).
@c man end

@sp 1
@c man begin DESCRIPTION
@item
The retrieval is conveniently traced by printing dots, each dot
representing a fixed amount of data received (1KB by default).  These
representations can be customized to your preferences.
@c man end

@sp 1
@c man begin DESCRIPTION
@item
Most of the features are fully configurable, either through command-line
options, or via the initialization file @file{.wgetrc} (@pxref{Startup
File}).  Wget allows you to define @dfn{global} startup files
(@file{/usr/local/etc/wgetrc} by default) for site settings.
@c man end

@ignore
@c man begin FILES
@table @samp
@item /usr/local/etc/wgetrc
Default location of the @dfn{global} startup file.

@item .wgetrc
User startup file.
@end table
@c man end
@end ignore

@sp 1
@c man begin DESCRIPTION
@item
Finally, GNU Wget is free software.  This means that everyone may use
it, redistribute it and/or modify it under the terms of the GNU General
Public License, as published by the Free Software Foundation
(@pxref{Copying}).
@end itemize
@c man end

@node Invoking, Recursive Retrieval, Overview, Top
@chapter Invoking
@cindex invoking
@cindex command line
@cindex arguments
@cindex nohup

By default, Wget is very simple to invoke.  The basic syntax is:

@example
@c man begin SYNOPSIS
wget [@var{option}]@dots{} [@var{URL}]@dots{}
@c man end
@end example

Wget will simply download all the @sc{url}s specified on the command
line.  @var{URL} is a @dfn{Uniform Resource Locator}, as defined below.

However, you may wish to change some of the default parameters of
Wget.  You can do it in two ways: permanently, adding the appropriate
command to @file{.wgetrc} (@pxref{Startup File}), or specifying it on
the command line.

@menu
* URL Format::
* Option Syntax::
* Basic Startup Options::
* Logging and Input File Options::
* Download Options::
* Directory Options::
* HTTP Options::
* FTP Options::
* Recursive Retrieval Options::
* Recursive Accept/Reject Options::
@end menu

@node URL Format, Option Syntax, Invoking, Invoking
@section URL Format
@cindex URL
@cindex URL syntax

@dfn{URL} is an acronym for Uniform Resource Locator.  A uniform
resource locator is a compact string representation for a resource
available via the Internet.  Wget recognizes the @sc{url} syntax as per
@sc{rfc1738}.  This is the most widely used form (square brackets denote
optional parts):

@example
http://host[:port]/directory/file
ftp://host[:port]/directory/file
@end example

You can also encode your username and password within a @sc{url}:

@example
ftp://user:password@@host/path
http://user:password@@host/path
@end example

Either @var{user} or @var{password}, or both, may be left out.  If you
leave out either the @sc{http} username or password, no authentication
will be sent.  If you leave out the @sc{ftp} username, @samp{anonymous}
will be used.  If you leave out the @sc{ftp} password, your email
address will be supplied as a default password.@footnote{If you have a
@file{.netrc} file in your home directory, the password will also be
searched for there.}

You can encode unsafe characters in a @sc{url} as @samp{%xy}, @code{xy}
being the hexadecimal representation of the character's @sc{ascii}
value.  Some common unsafe characters include @samp{%} (quoted as
@samp{%25}), @samp{:} (quoted as @samp{%3A}), and @samp{@@} (quoted as
@samp{%40}).  Refer to @sc{rfc1738} for a comprehensive list of unsafe
characters.
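
For instance, a password that itself contains @samp{@@} must be encoded
before it can appear in a @sc{url}; the following sketch uses a
hypothetical user @samp{jack} whose password is @samp{pass@@word}:

@example
ftp://jack:pass%40word@@host/path
@end example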

Wget also supports the @code{type} feature for @sc{ftp} @sc{url}s.  By
default, @sc{ftp} documents are retrieved in the binary mode (type
@samp{i}), which means that they are downloaded unchanged.  Another
useful mode is the @samp{a} (@dfn{ASCII}) mode, which converts the line
delimiters between the different operating systems, and is thus useful
for text files.  Here is an example:

@example
ftp://host/directory/file;type=a
@end example

Two alternative variants of @sc{url} specification are also supported,
because of historical (hysterical?) reasons and their widespread use.

@sc{ftp}-only syntax (supported by @code{NcFTP}):
@example
host:/dir/file
@end example

@sc{http}-only syntax (introduced by @code{Netscape}):
@example
host[:port]/dir/file
@end example

These two alternative forms are deprecated, and may cease being
supported in the future.

If you do not understand the difference between these notations, or do
not know which one to use, just use the plain ordinary format you use
with your favorite browser, like @code{Lynx} or @code{Netscape}.

@node Option Syntax, Basic Startup Options, URL Format, Invoking
@section Option Syntax
@cindex option syntax
@cindex syntax of options

Since Wget uses GNU getopt to process its arguments, every option has a
short form and a long form.  Long options are more convenient to
remember, but take time to type.  You may freely mix different option
styles, or specify options after the command-line arguments.  Thus you
may write:

@example
wget -r --tries=10 http://fly.srk.fer.hr/ -o log
@end example

The space between the option accepting an argument and the argument may
be omitted.  Instead of @samp{-o log} you can write @samp{-olog}.

You may put several options that do not require arguments together,
like:

@example
wget -drc @var{URL}
@end example

This is completely equivalent to:

@example
wget -d -r -c @var{URL}
@end example

Since the options can be specified after the arguments, you may
terminate them with @samp{--}.  So the following will try to download
@sc{url} @samp{-x}, reporting failure to @file{log}:

@example
wget -o log -- -x
@end example

The options that accept comma-separated lists all respect the convention
that specifying an empty list clears its value.  This can be useful to
clear the @file{.wgetrc} settings.  For instance, if your @file{.wgetrc}
sets @code{exclude_directories} to @file{/cgi-bin}, the following
example will first reset it, and then set it to exclude @file{/~nobody}
and @file{/~somebody}.  You can also clear the lists in @file{.wgetrc}
(@pxref{Wgetrc Syntax}).

@example
wget -X '' -X /~nobody,/~somebody
@end example

@c man begin OPTIONS

@node Basic Startup Options, Logging and Input File Options, Option Syntax, Invoking
@section Basic Startup Options

@table @samp
@item -V
@itemx --version
Display the version of Wget.

@item -h
@itemx --help
Print a help message describing all of Wget's command-line options.

@item -b
@itemx --background
Go to background immediately after startup.  If no output file is
specified via @samp{-o}, output is redirected to @file{wget-log}.

@cindex execute wgetrc command
@item -e @var{command}
@itemx --execute @var{command}
Execute @var{command} as if it were a part of @file{.wgetrc}
(@pxref{Startup File}).  A command thus invoked will be executed
@emph{after} the commands in @file{.wgetrc}, thus taking precedence over
them.
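
For example, the following sketch uses @samp{-e} to exclude a directory
for a single recursive run, without editing @file{.wgetrc} (the host is
hypothetical):

@example
wget -e 'exclude_directories = /cgi-bin' -r http://@var{host}/
@end example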

@end table

@node Logging and Input File Options, Download Options, Basic Startup Options, Invoking
@section Logging and Input File Options

@table @samp
@cindex output file
@cindex log file
@item -o @var{logfile}
@itemx --output-file=@var{logfile}
Log all messages to @var{logfile}.  The messages are normally reported
to standard error.

@cindex append to log
@item -a @var{logfile}
@itemx --append-output=@var{logfile}
Append to @var{logfile}.  This is the same as @samp{-o}, only it appends
to @var{logfile} instead of overwriting the old log file.  If
@var{logfile} does not exist, a new file is created.

@cindex debug
@item -d
@itemx --debug
Turn on debug output, meaning various information important to the
developers of Wget if it does not work properly.  Your system
administrator may have chosen to compile Wget without debug support, in
which case @samp{-d} will not work.  Please note that compiling with
debug support is always safe---Wget compiled with debug support will
@emph{not} print any debug info unless requested with @samp{-d}.
@xref{Reporting Bugs}, for more information on how to use @samp{-d} for
sending bug reports.

@cindex quiet
@item -q
@itemx --quiet
Turn off Wget's output.

@cindex verbose
@item -v
@itemx --verbose
Turn on verbose output, with all the available data.  The default output
is verbose.

@item -nv
@itemx --non-verbose
Non-verbose output---turn off verbose without being completely quiet
(use @samp{-q} for that), which means that error messages and basic
information still get printed.

@cindex input-file
@item -i @var{file}
@itemx --input-file=@var{file}
Read @sc{url}s from @var{file}, in which case no @sc{url}s need to be on
the command line.  If there are @sc{url}s both on the command line and
in an input file, those on the command line will be the first ones to
be retrieved.  The @var{file} need not be an @sc{html} document (but no
harm if it is)---it is enough if the @sc{url}s are just listed
sequentially.

However, if you specify @samp{--force-html}, the document will be
regarded as @samp{html}.  In that case you may have problems with
relative links, which you can solve either by adding @code{<base
href="@var{url}">} to the documents or by specifying
@samp{--base=@var{url}} on the command line.

@cindex force html
@item -F
@itemx --force-html
When input is read from a file, force it to be treated as an @sc{html}
file.  This enables you to retrieve relative links from existing
@sc{html} files on your local disk, by adding @code{<base
href="@var{url}">} to @sc{html}, or using the @samp{--base} command-line
option.

@cindex base for relative links in input file
@item -B @var{URL}
@itemx --base=@var{URL}
When used in conjunction with @samp{-F}, prepends @var{URL} to relative
links in the file specified by @samp{-i}.
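
As a sketch, assuming @file{links.html} is a hypothetical local file
whose relative links should be resolved against a remote directory:

@example
wget -F -B http://@var{host}/@var{dir}/ -i links.html
@end example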

@end table

@node Download Options, Directory Options, Logging and Input File Options, Invoking
@section Download Options

@table @samp
@cindex bind() address
@cindex client IP address
@cindex IP address, client
@item --bind-address=@var{ADDRESS}
When making client TCP/IP connections, @code{bind()} to @var{ADDRESS} on
the local machine.  @var{ADDRESS} may be specified as a hostname or IP
address.  This option can be useful if your machine is bound to multiple
IPs.

@cindex retries
@cindex tries
@cindex number of retries
@item -t @var{number}
@itemx --tries=@var{number}
Set number of retries to @var{number}.  Specify 0 or @samp{inf} for
infinite retrying.

@item -O @var{file}
@itemx --output-document=@var{file}
The documents will not be written to the appropriate files, but all will
be concatenated together and written to @var{file}.  If @var{file}
already exists, it will be overwritten.  If the @var{file} is @samp{-},
the documents will be written to standard output.  Including this option
automatically sets the number of tries to 1.
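
For instance, @samp{-O -} can be combined with @samp{-q} to feed a
document straight into another program without saving it (the @sc{url}
is hypothetical):

@example
wget -q -O - http://@var{host}/@var{file} | less
@end example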

@cindex clobbering, file
@cindex downloading multiple times
@cindex no-clobber
@item -nc
@itemx --no-clobber
If a file is downloaded more than once in the same directory, Wget's
behavior depends on a few options, including @samp{-nc}.  In certain
cases, the local file will be @dfn{clobbered}, or overwritten, upon
repeated download.  In other cases it will be preserved.

When running Wget without @samp{-N}, @samp{-nc}, or @samp{-r},
downloading the same file in the same directory will result in the
original copy of @var{file} being preserved and the second copy being
named @samp{@var{file}.1}.  If that file is downloaded yet again, the
third copy will be named @samp{@var{file}.2}, and so on.  When
@samp{-nc} is specified, this behavior is suppressed, and Wget will
refuse to download newer copies of @samp{@var{file}}.  Therefore,
``@code{no-clobber}'' is actually a misnomer in this mode---it's not
clobbering that's prevented (as the numeric suffixes were already
preventing clobbering), but rather the multiple version saving that's
prevented.

When running Wget with @samp{-r}, but without @samp{-N} or @samp{-nc},
re-downloading a file will result in the new copy simply overwriting the
old.  Adding @samp{-nc} will prevent this behavior, instead causing the
original version to be preserved and any newer copies on the server to
be ignored.

When running Wget with @samp{-N}, with or without @samp{-r}, the
decision as to whether or not to download a newer copy of a file depends
on the local and remote timestamp and size of the file
(@pxref{Time-Stamping}).  @samp{-nc} may not be specified at the same
time as @samp{-N}.

Note that when @samp{-nc} is specified, files with the suffixes
@samp{.html} or (yuck) @samp{.htm} will be loaded from the local disk
and parsed as if they had been retrieved from the Web.

@cindex continue retrieval
@cindex incomplete downloads
@cindex resume download
@item -c
@itemx --continue
Continue getting a partially-downloaded file.  This is useful when you
want to finish up a download started by a previous instance of Wget, or
by another program.  For instance:

@example
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
@end example

If there is a file named @file{ls-lR.Z} in the current directory, Wget
will assume that it is the first portion of the remote file, and will
ask the server to continue the retrieval from an offset equal to the
length of the local file.

Note that you don't need to specify this option if you just want the
current invocation of Wget to retry downloading a file should the
connection be lost midway through.  This is the default behavior.
@samp{-c} only affects resumption of downloads started @emph{prior} to
this invocation of Wget, and whose local files are still sitting around.

Without @samp{-c}, the previous example would just download the remote
file to @file{ls-lR.Z.1}, leaving the truncated @file{ls-lR.Z} file
alone.

Beginning with Wget 1.7, if you use @samp{-c} on a non-empty file, and
it turns out that the server does not support continued downloading,
Wget will refuse to start the download from scratch, which would
effectively ruin existing contents.  If you really want the download to
start from scratch, remove the file.

Also beginning with Wget 1.7, if you use @samp{-c} on a file which is
the same size as the one on the server, Wget will refuse to download the
file and print an explanatory message.  The same happens when the file
is smaller on the server than locally (presumably because it was changed
on the server since your last download attempt)---because ``continuing''
is not meaningful, no download occurs.

On the other side of the coin, while using @samp{-c}, any file that's
bigger on the server than locally will be considered an incomplete
download and only @code{(length(remote) - length(local))} bytes will be
downloaded and tacked onto the end of the local file.  This behavior can
be desirable in certain cases---for instance, you can use @samp{wget -c}
to download just the new portion that's been appended to a data
collection or log file.

However, if the file is bigger on the server because it's been
@emph{changed}, as opposed to just @emph{appended} to, you'll end up
with a garbled file.  Wget has no way of verifying that the local file
is really a valid prefix of the remote file.  You need to be especially
careful of this when using @samp{-c} in conjunction with @samp{-r},
since every file will be considered as an ``incomplete download''
candidate.

Another instance where you'll get a garbled file if you try to use
@samp{-c} is if you have a lame @sc{http} proxy that inserts a
``transfer interrupted'' string into the local file.  In the future a
``rollback'' option may be added to deal with this case.

Note that @samp{-c} only works with @sc{ftp} servers and with @sc{http}
servers that support the @code{Range} header.

@cindex dot style
@cindex retrieval tracing style
@item --dot-style=@var{style}
Set the retrieval style to @var{style}.  Wget traces the retrieval of
each document by printing dots on the screen, each dot representing a
fixed amount of retrieved data.  Any number of dots may be grouped into
a @dfn{cluster}, to make counting easier.  This option allows you to
choose one of the pre-defined styles, determining the number of bytes
represented by a dot, the number of dots in a cluster, and the number of
dots on the line.

With the @code{default} style each dot represents 1K, there are ten dots
in a cluster and 50 dots in a line.  The @code{binary} style has a more
``computer''-like orientation---8K dots, 16-dot clusters and 48 dots
per line (which makes for 384K lines).  The @code{mega} style is
suitable for downloading very large files---each dot represents 64K
retrieved, there are eight dots in a cluster, and 48 dots on each line
(so each line contains 3M).  The @code{micro} style is exactly the
reverse; it is suitable for downloading small files, with 128-byte dots,
8 dots per cluster, and 48 dots (6K) per line.
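
For example, a sketch that selects the @code{mega} style for a large
download (the host and file are hypothetical):

@example
wget --dot-style=mega ftp://@var{host}/@var{big-file}
@end example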

@item -N
@itemx --timestamping
Turn on time-stamping.  @xref{Time-Stamping}, for details.

@cindex server response, print
@item -S
@itemx --server-response
Print the headers sent by @sc{http} servers and responses sent by
@sc{ftp} servers.

@cindex Wget as spider
@cindex spider
@item --spider
When invoked with this option, Wget will behave as a Web @dfn{spider},
which means that it will not download the pages, just check that they
are there.  You can use it to check your bookmarks, e.g. with:

@example
wget --spider --force-html -i bookmarks.html
@end example

This feature needs much more work for Wget to get close to the
functionality of real @sc{www} spiders.

@cindex timeout
@item -T @var{seconds}
@itemx --timeout=@var{seconds}
Set the read timeout to @var{seconds} seconds.  Whenever a network read
is issued, the file descriptor is checked for a timeout, which could
otherwise leave a pending connection (uninterrupted read).  The default
timeout is 900 seconds (fifteen minutes).  Setting the timeout to 0 will
disable checking for timeouts.

Please do not lower the default timeout value with this option unless
you know what you are doing.

@cindex pause
@cindex wait
@item -w @var{seconds}
@itemx --wait=@var{seconds}
Wait the specified number of seconds between the retrievals.  Use of
this option is recommended, as it lightens the server load by making the
requests less frequent.  Instead of in seconds, the time can be
specified in minutes using the @code{m} suffix, in hours using the
@code{h} suffix, or in days using the @code{d} suffix.

Specifying a large value for this option is useful if the network or the
destination host is down, so that Wget can wait long enough to
reasonably expect the network error to be fixed before the retry.
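
A sketch of a polite recursive run that pauses two minutes between
requests (the site is hypothetical):

@example
wget -w 2m -r http://@var{site}/
@end example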

@cindex retries, waiting between
@cindex waiting between retries
@item --waitretry=@var{seconds}
If you don't want Wget to wait between @emph{every} retrieval, but only
between retries of failed downloads, you can use this option.  Wget will
use @dfn{linear backoff}, waiting 1 second after the first failure on a
given file, then waiting 2 seconds after the second failure on that
file, up to the maximum number of @var{seconds} you specify.  Therefore,
a value of 10 will actually make Wget wait up to (1 + 2 + @dots{} + 10) =
55 seconds per file.

Note that this option is turned on by default in the global
@file{wgetrc} file.

@cindex proxy
@item -Y on/off
@itemx --proxy=on/off
Turn proxy support on or off.  The proxy is on by default if the
appropriate environment variable is defined.

@cindex quota
@item -Q @var{quota}
@itemx --quota=@var{quota}
Specify download quota for automatic retrievals.  The value can be
specified in bytes (default), kilobytes (with @samp{k} suffix), or
megabytes (with @samp{m} suffix).

Note that quota will never affect downloading a single file.  So if you
specify @samp{wget -Q10k ftp://wuarchive.wustl.edu/ls-lR.gz}, all of the
@file{ls-lR.gz} will be downloaded.  The same goes even when several
@sc{url}s are specified on the command line.  However, quota is
respected when retrieving either recursively, or from an input file.
Thus you may safely type @samp{wget -Q2m -i sites}---download will be
aborted when the quota is exceeded.

Setting quota to 0 or to @samp{inf} unlimits the download quota.
@end table

@node Directory Options, HTTP Options, Download Options, Invoking
@section Directory Options

@table @samp
@item -nd
@itemx --no-directories
Do not create a hierarchy of directories when retrieving recursively.
With this option turned on, all files will get saved to the current
directory, without clobbering (if a name shows up more than once, the
filenames will get extensions @samp{.n}).

@item -x
@itemx --force-directories
The opposite of @samp{-nd}---create a hierarchy of directories, even if
one would not have been created otherwise.  E.g. @samp{wget -x
http://fly.srk.fer.hr/robots.txt} will save the downloaded file to
@file{fly.srk.fer.hr/robots.txt}.

@item -nH
@itemx --no-host-directories
Disable generation of host-prefixed directories.  By default, invoking
Wget with @samp{-r http://fly.srk.fer.hr/} will create a structure of
directories beginning with @file{fly.srk.fer.hr/}.  This option disables
such behavior.

@cindex cut directories
@item --cut-dirs=@var{number}
Ignore @var{number} directory components.  This is useful for getting
fine-grained control over the directory where recursive retrieval will
be saved.

Take, for example, the directory at
@samp{ftp://ftp.xemacs.org/pub/xemacs/}.  If you retrieve it with
@samp{-r}, it will be saved locally under
@file{ftp.xemacs.org/pub/xemacs/}.  While the @samp{-nH} option can
remove the @file{ftp.xemacs.org/} part, you are still stuck with
@file{pub/xemacs}.  This is where @samp{--cut-dirs} comes in handy; it
makes Wget not ``see'' @var{number} remote directory components.  Here
are several examples of how the @samp{--cut-dirs} option works.

@example
@group
No options        -> ftp.xemacs.org/pub/xemacs/
-nH               -> pub/xemacs/
-nH --cut-dirs=1  -> xemacs/
-nH --cut-dirs=2  -> .

--cut-dirs=1      -> ftp.xemacs.org/xemacs/
...
@end group
@end example

If you just want to get rid of the directory structure, this option is
similar to a combination of @samp{-nd} and @samp{-P}.  However, unlike
@samp{-nd}, @samp{--cut-dirs} does not lose subdirectories---for
instance, with @samp{-nH --cut-dirs=1}, a @file{beta/} subdirectory will
be placed to @file{xemacs/beta}, as one would expect.

@cindex directory prefix
@item -P @var{prefix}
@itemx --directory-prefix=@var{prefix}
Set directory prefix to @var{prefix}.  The @dfn{directory prefix} is the
directory where all other files and subdirectories will be saved to,
i.e. the top of the retrieval tree.  The default is @samp{.} (the
current directory).
@end table

@node HTTP Options, FTP Options, Directory Options, Invoking
@section HTTP Options

@table @samp
@cindex .html extension
@item -E
@itemx --html-extension
If a file of type @samp{text/html} is downloaded and the URL does not
end with the regexp @samp{\.[Hh][Tt][Mm][Ll]?}, this option will cause
the suffix @samp{.html} to be appended to the local filename.  This is
useful, for instance, when you're mirroring a remote site that uses
@samp{.asp} pages, but you want the mirrored pages to be viewable on
your stock Apache server.  Another good use for this is when you're
downloading the output of CGIs.  A URL like
@samp{http://site.com/article.cgi?25} will be saved as
@file{article.cgi?25.html}.

Note that filenames changed in this way will be re-downloaded every time
you re-mirror a site, because Wget can't tell that the local
@file{@var{X}.html} file corresponds to remote URL @samp{@var{X}} (since
it doesn't yet know that the URL produces output of type
@samp{text/html}).  To prevent this re-downloading, you must use
@samp{-k} and @samp{-K} so that the original version of the file will be
saved as @file{@var{X}.orig} (@pxref{Recursive Retrieval Options}).
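
A sketch of such a mirroring run, combining the three options just
mentioned (the site is hypothetical):

@example
wget -r -E -k -K http://@var{site}/
@end example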

@cindex http user
@cindex http password
@cindex authentication
@item --http-user=@var{user}
@itemx --http-passwd=@var{password}
Specify the username @var{user} and password @var{password} on an
@sc{http} server.  According to the type of the challenge, Wget will
encode them using either the @code{basic} (insecure) or the
@code{digest} authentication scheme.

Another way to specify username and password is in the @sc{url} itself
(@pxref{URL Format}).  @xref{Security Considerations}, for more
information about security issues with Wget.
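
For instance (the credentials and host are hypothetical):

@example
wget --http-user=jack --http-passwd=secret http://@var{host}/@var{file}
@end example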

@cindex proxy
@cindex cache
@item -C on/off
@itemx --cache=on/off
When set to off, disable server-side cache.  In this case, Wget will
send the remote server an appropriate directive (@samp{Pragma:
no-cache}) to get the file from the remote service, rather than
returning the cached version.  This is especially useful for retrieving
and flushing out-of-date documents on proxy servers.

Caching is allowed by default.

@cindex cookies
@item --cookies=on/off
When set to off, disable the use of cookies.  Cookies are a mechanism
for maintaining server-side state.  The server sends the client a cookie
using the @code{Set-Cookie} header, and the client responds with the
same cookie upon further requests.  Since cookies allow the server
owners to keep track of visitors and for sites to exchange this
information, some consider them a breach of privacy.  The default is to
use cookies; however, @emph{storing} cookies is not on by default.

@cindex loading cookies
@cindex cookies, loading
@item --load-cookies @var{file}
Load cookies from @var{file} before the first HTTP retrieval.  The
format of @var{file} is the one used by Netscape and Mozilla, at least
their Unix versions.

@cindex saving cookies
@cindex cookies, saving
@item --save-cookies @var{file}
Save cookies to @var{file} at the end of the session.  Cookies whose
expiry time is not specified, or those that have already expired, are
not saved.
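
A sketch that carries a cookie jar across two Wget sessions
(@file{cookies.txt} and the @sc{url}s are hypothetical):

@example
wget --save-cookies cookies.txt http://@var{host}/login
wget --load-cookies cookies.txt http://@var{host}/@var{page}
@end example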

@cindex Content-Length, ignore
@cindex ignore length
@item --ignore-length
Unfortunately, some @sc{http} servers (@sc{cgi} programs, to be more
precise) send out bogus @code{Content-Length} headers, which makes Wget
go wild, as it thinks not all of the document was retrieved.  You can
spot this syndrome if Wget retries getting the same document again and
again, each time claiming that the (otherwise normal) connection has
closed on the very same byte.

With this option, Wget will ignore the @code{Content-Length} header---as
if it never existed.

@cindex header, add
@item --header=@var{additional-header}
Define an @var{additional-header} to be passed to the @sc{http} servers.
Headers must contain a @samp{:} preceded by one or more non-blank
characters, and must not contain newlines.

You may define more than one additional header by specifying
@samp{--header} more than once.

@example
@group
wget --header='Accept-Charset: iso-8859-2' \
     --header='Accept-Language: hr' \
     http://fly.srk.fer.hr/
@end group
@end example

Specification of an empty string as the header value will clear all
previous user-defined headers.

@cindex proxy user
@cindex proxy password
@cindex proxy authentication
@item --proxy-user=@var{user}
@itemx --proxy-passwd=@var{password}
Specify the username @var{user} and password @var{password} for
authentication on a proxy server.  Wget will encode them using the
@code{basic} authentication scheme.

@cindex http referer
@cindex referer, http
@item --referer=@var{url}
Include a @samp{Referer: @var{url}} header in the HTTP request.  Useful
for retrieving documents with server-side processing that assumes they
are always being retrieved by interactive web browsers and only come out
properly when Referer is set to one of the pages that point to them.
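
For instance, a sketch for fetching a hypothetical image that the
server only serves to visitors coming from its own gallery page:

@example
wget --referer=http://@var{host}/gallery.html \
     http://@var{host}/images/photo.jpg
@end example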

@cindex server response, save
@item -s
@itemx --save-headers
Save the headers sent by the @sc{http} server to the file, preceding the
actual contents, with an empty line as the separator.

@cindex user-agent
@item -U @var{agent-string}
@itemx --user-agent=@var{agent-string}
Identify as @var{agent-string} to the @sc{http} server.

The @sc{http} protocol allows the clients to identify themselves using a
@code{User-Agent} header field.  This enables distinguishing the
@sc{www} software, usually for statistical purposes or for tracing of
protocol violations.  Wget normally identifies as
@samp{Wget/@var{version}}, @var{version} being the current version
number of Wget.

However, some sites have been known to impose the policy of tailoring
the output according to the @code{User-Agent}-supplied information.
While conceptually this is not such a bad idea, it has been abused by
servers denying information to clients other than @code{Mozilla} or
Microsoft @code{Internet Explorer}.  This option allows you to change
the @code{User-Agent} line issued by Wget.  Use of this option is
discouraged, unless you really know what you are doing.
@end table

@node FTP Options, Recursive Retrieval Options, HTTP Options, Invoking
@section FTP Options

@table @samp
@cindex .listing files, removing
@item -nr
@itemx --dont-remove-listing
Don't remove the temporary @file{.listing} files generated by @sc{ftp}
retrievals.  Normally, these files contain the raw directory listings
received from @sc{ftp} servers.  Not removing them can be useful for
debugging purposes, or when you want to be able to easily check on the
contents of remote server directories (e.g. to verify that a mirror
you're running is complete).

Note that even though Wget writes to a known filename for this file,
this is not a security hole in the scenario of a user making
@file{.listing} a symbolic link to @file{/etc/passwd} or something and
asking @code{root} to run Wget in his or her directory.  Depending on
the options used, either Wget will refuse to write to @file{.listing},
making the globbing/recursion/time-stamping operation fail, or the
symbolic link will be deleted and replaced with the actual
@file{.listing} file, or the listing will be written to a
@file{.listing.@var{number}} file.

Even though this situation isn't a problem, though, @code{root} should
never run Wget in a non-trusted user's directory.  A user could do
something as simple as linking @file{index.html} to @file{/etc/passwd}
and asking @code{root} to run Wget with @samp{-N} or @samp{-r} so the file
will be overwritten.

@cindex globbing, toggle
@item -g on/off
@itemx --glob=on/off
Turn @sc{ftp} globbing on or off.  Globbing means you may use the
shell-like special characters (@dfn{wildcards}), like @samp{*},
@samp{?}, @samp{[} and @samp{]} to retrieve more than one file from the
same directory at once, like:

@example
wget ftp://gnjilux.srk.fer.hr/*.msg
@end example

By default, globbing will be turned on if the @sc{url} contains a
globbing character.  This option may be used to turn globbing on or off
permanently.

You may have to quote the @sc{url} to protect it from being expanded by
your shell.  Globbing makes Wget look for a directory listing, which is
system-specific.  This is why it currently works only with Unix @sc{ftp}
servers (and the ones emulating Unix @code{ls} output).

@cindex passive ftp
@item --passive-ftp
Use the @dfn{passive} @sc{ftp} retrieval scheme, in which the client
initiates the data connection.  This is sometimes required for @sc{ftp}
to work behind firewalls.

@cindex symbolic links, retrieving
@item --retr-symlinks
Usually, when retrieving @sc{ftp} directories recursively and a symbolic
link is encountered, the linked-to file is not downloaded.  Instead, a
matching symbolic link is created on the local filesystem.  The
pointed-to file will not be downloaded unless this recursive retrieval
would have encountered it separately and downloaded it anyway.

When @samp{--retr-symlinks} is specified, however, symbolic links are
traversed and the pointed-to files are retrieved.  At this time, this
option does not cause Wget to traverse symlinks to directories and
recurse through them, but in the future it should be enhanced to do
this.

Note that when retrieving a file (not a directory) because it was
specified on the command line, rather than because it was recursed to,
this option has no effect.  Symbolic links are always traversed in this
case.
@end table

@node Recursive Retrieval Options, Recursive Accept/Reject Options, FTP Options, Invoking
@section Recursive Retrieval Options

@table @samp
@item -r
@itemx --recursive
Turn on recursive retrieving.  @xref{Recursive Retrieval}, for more
details.

@item -l @var{depth}
@itemx --level=@var{depth}
Specify recursion maximum depth level @var{depth} (@pxref{Recursive
Retrieval}).  The default maximum depth is 5.

@cindex proxy filling
@cindex delete after retrieval
@cindex filling proxy cache
@item --delete-after
This option tells Wget to delete every single file it downloads,
@emph{after} having done so.  It is useful for pre-fetching popular
pages through a proxy, e.g.:

@example
wget -r -nd --delete-after http://whatever.com/~popular/page/
@end example

The @samp{-r} option is to retrieve recursively, and @samp{-nd} to not
create directories.

Note that @samp{--delete-after} deletes files on the local machine.  It
does not issue the @samp{DELE} command to remote FTP sites, for
instance.  Also note that when @samp{--delete-after} is specified,
@samp{--convert-links} is ignored, so @samp{.orig} files are simply not
created in the first place.

@cindex conversion of links
@cindex link conversion
@item -k
@itemx --convert-links
After the download is complete, convert the links in the document to
make them suitable for local viewing.  This affects not only the visible
hyperlinks, but any part of the document that links to external content,
such as embedded images, links to style sheets, hyperlinks to non-HTML
content, etc.

Each link will be changed in one of two ways:

@itemize @bullet
@item
The links to files that have been downloaded by Wget will be changed to
refer to the file they point to as a relative link.

Example: if the downloaded file @file{/foo/doc.html} links to
@file{/bar/img.gif}, also downloaded, then the link in @file{doc.html}
will be modified to point to @samp{../bar/img.gif}.  This kind of
transformation works reliably for arbitrary combinations of directories.

@item
The links to files that have not been downloaded by Wget will be changed
to include the host name and absolute path of the location they point
to.

Example: if the downloaded file @file{/foo/doc.html} links to
@file{/bar/img.gif} (or to @file{../bar/img.gif}), then the link in
@file{doc.html} will be modified to point to
@file{http://@var{hostname}/bar/img.gif}.
@end itemize

Because of this, local browsing works reliably: if a linked file was
downloaded, the link will refer to its local name; if it was not
downloaded, the link will refer to its full Internet address rather than
presenting a broken link.  The fact that the former links are converted
to relative links ensures that you can move the downloaded hierarchy to
another directory.

Note that only at the end of the download can Wget know which links have
been downloaded.  Because of that, the work done by @samp{-k} will be
performed at the end of all the downloads.

@cindex backing up converted files
@item -K
@itemx --backup-converted
When converting a file, back up the original version with a @samp{.orig}
suffix.  Affects the behavior of @samp{-N} (@pxref{HTTP Time-Stamping
Internals}).

@item -m
@itemx --mirror
Turn on options suitable for mirroring.  This option turns on recursion
and time-stamping, sets infinite recursion depth and keeps @sc{ftp}
directory listings.  It is currently equivalent to
@samp{-r -N -l inf -nr}.
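
A minimal mirroring sketch (the site is hypothetical):

@example
wget -m http://@var{site}/
@end example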

@cindex page requisites
@cindex required images, downloading
@item -p
@itemx --page-requisites
This option causes Wget to download all the files that are necessary to
properly display a given HTML page. This includes such things as
inlined images, sounds, and referenced stylesheets.

Ordinarily, when downloading a single HTML page, any requisite documents
that may be needed to display it properly are not downloaded. Using
@samp{-r} together with @samp{-l} can help, but since Wget does not
ordinarily distinguish between external and inlined documents, one is
generally left with ``leaf documents'' that are missing their
requisites.

For instance, say document @file{1.html} contains an @code{<IMG>} tag
referencing @file{1.gif} and an @code{<A>} tag pointing to external
document @file{2.html}. Say that @file{2.html} is similar but that its
image is @file{2.gif} and it links to @file{3.html}. Say this
continues up to some arbitrarily high number.

If one executes the command:

@example
wget -r -l 2 http://@var{site}/1.html
@end example

then @file{1.html}, @file{1.gif}, @file{2.html}, @file{2.gif}, and
@file{3.html} will be downloaded. As you can see, @file{3.html} is
without its requisite @file{3.gif} because Wget is simply counting the
number of hops (up to 2) away from @file{1.html} in order to determine
where to stop the recursion. However, with this command:

@example
wget -r -l 2 -p http://@var{site}/1.html
@end example

all the above files @emph{and} @file{3.html}'s requisite @file{3.gif}
will be downloaded. Similarly,

@example
wget -r -l 1 -p http://@var{site}/1.html
@end example

will cause @file{1.html}, @file{1.gif}, @file{2.html}, and @file{2.gif}
to be downloaded. One might think that:

@example
wget -r -l 0 -p http://@var{site}/1.html
@end example

would download just @file{1.html} and @file{1.gif}, but unfortunately
this is not the case, because @samp{-l 0} is equivalent to
@samp{-l inf}---that is, infinite recursion. To download a single HTML
page (or a handful of them, all specified on the commandline or in a
@samp{-i} @sc{url} input file) and its (or their) requisites, simply
leave off @samp{-r} and @samp{-l}:

@example
wget -p http://@var{site}/1.html
@end example

Note that Wget will behave as if @samp{-r} had been specified, but only
that single page and its requisites will be downloaded. Links from that
page to external documents will not be followed. Actually, to download
a single page and all its requisites (even if they exist on separate
websites), and make sure the lot displays properly locally, this author
likes to use a few options in addition to @samp{-p}:

@example
wget -E -H -k -K -nh -p http://@var{site}/@var{document}
@end example

In one case you'll need to add a couple more options. If @var{document}
is a @code{<FRAMESET>} page, the ``one more hop'' that @samp{-p} gives
you won't be enough---you'll get the @code{<FRAME>} pages that are
referenced, but you won't get @emph{their} requisites. Therefore, in
this case you'll need to add @samp{-r -l1} to the commandline. The
@samp{-r -l1} will recurse from the @code{<FRAMESET>} page to the
@code{<FRAME>} pages, and the @samp{-p} will get their requisites. If
you're already using a recursion level of 1 or more, you'll need to up
it by one. In the future, @samp{-p} may be made smarter so that it'll
do ``two more hops'' in the case of a @code{<FRAMESET>} page.
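
Putting the pieces together, such an invocation might look like the
following sketch (@var{frameset-page} is a placeholder for the actual
@sc{url}):

@example
wget -E -H -k -K -nh -p -r -l1 http://@var{site}/@var{frameset-page}
@end example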

To finish off this topic, it's worth knowing that Wget's idea of an
external document link is any URL specified in an @code{<A>} tag, an
@code{<AREA>} tag, or a @code{<LINK>} tag other than @code{<LINK
REL="stylesheet">}.
@end table

@node Recursive Accept/Reject Options, , Recursive Retrieval Options, Invoking
@section Recursive Accept/Reject Options

@table @samp
@item -A @var{acclist} --accept @var{acclist}
@itemx -R @var{rejlist} --reject @var{rejlist}
Specify comma-separated lists of file name suffixes or patterns to
accept or reject (@pxref{Types of Files} for more details).
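
For instance, a recursive run restricted to images might look like this
(the suffix list and @sc{url} are illustrative):

@example
wget -r -A jpg,jpeg,gif http://@var{site}/
@end example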

@item -D @var{domain-list}
@itemx --domains=@var{domain-list}
Set domains to be accepted and @sc{dns} looked-up, where
@var{domain-list} is a comma-separated list. Note that it does
@emph{not} turn on @samp{-H}. This option speeds things up, even if
only one host is spanned (@pxref{Domain Acceptance}).

@item --exclude-domains @var{domain-list}
Exclude the domains given in a comma-separated @var{domain-list} from
@sc{dns}-lookup (@pxref{Domain Acceptance}).

@cindex follow FTP links
@item --follow-ftp
Follow @sc{ftp} links from @sc{html} documents. Without this option,
Wget will ignore all the @sc{ftp} links.

@cindex tag-based recursive pruning
@item --follow-tags=@var{list}
Wget has an internal table of HTML tag / attribute pairs that it
considers when looking for linked documents during a recursive
retrieval. If a user wants only a subset of those tags to be
considered, however, he or she should specify such tags in a
comma-separated @var{list} with this option.
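
For instance, to follow only the links found in @code{<A>} and
@code{<AREA>} tags, one might run something like this (the tag list and
@sc{url} are illustrative):

@example
wget -r --follow-tags=a,area http://@var{site}/
@end example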

@item -G @var{list}
@itemx --ignore-tags=@var{list}
This is the opposite of the @samp{--follow-tags} option. To skip
certain HTML tags when recursively looking for documents to download,
specify them in a comma-separated @var{list}.

In the past, the @samp{-G} option was the best bet for downloading a
single page and its requisites, using a commandline like:

@example
wget -Ga,area -H -k -K -nh -r http://@var{site}/@var{document}
@end example

However, the author of this option came across a page with tags like
@code{<LINK REL="home" HREF="/">} and came to the realization that
@samp{-G} was not enough. One can't just tell Wget to ignore
@code{<LINK>}, because then stylesheets will not be downloaded. Now the
best bet for downloading a single page and its requisites is the
dedicated @samp{--page-requisites} option.

@item -H
@itemx --span-hosts
Enable spanning across hosts when doing recursive retrieving (@pxref{All
Hosts}).

@item -L
@itemx --relative
Follow relative links only. Useful for retrieving a specific home page
without any distractions, not even those from the same hosts
(@pxref{Relative Links}).

@item -I @var{list}
@itemx --include-directories=@var{list}
Specify a comma-separated list of directories you wish to follow when
downloading (@pxref{Directory-Based Limits} for more details). Elements
of @var{list} may contain wildcards.

@item -X @var{list}
@itemx --exclude-directories=@var{list}
Specify a comma-separated list of directories you wish to exclude from
download (@pxref{Directory-Based Limits} for more details). Elements of
@var{list} may contain wildcards.

@item -nh
@itemx --no-host-lookup
Disable the time-consuming @sc{dns} lookup of almost all hosts
(@pxref{Host Checking}).

@item -np
@itemx --no-parent
Do not ever ascend to the parent directory when retrieving recursively.
This is a useful option, since it guarantees that only the files
@emph{below} a certain hierarchy will be downloaded.
@xref{Directory-Based Limits}, for more details.

@end table

@c man end

@node Recursive Retrieval, Following Links, Invoking, Top
@chapter Recursive Retrieval
@cindex recursion
@cindex retrieving
@cindex recursive retrieval

GNU Wget is capable of traversing parts of the Web (or a single
@sc{http} or @sc{ftp} server), depth-first following links and directory
structure. This is called @dfn{recursive} retrieving, or
@dfn{recursion}.

With @sc{http} @sc{url}s, Wget retrieves and parses the @sc{html}
document at the given @sc{url}, retrieving the files that document
refers to, through markup like @code{href} or @code{src}. If a freshly
downloaded file is also of type @code{text/html}, it will be parsed and
followed further.

The maximum @dfn{depth} to which the retrieval may descend is specified
with the @samp{-l} option (the default maximum depth is five layers).
@xref{Recursive Retrieval}.

When retrieving an @sc{ftp} @sc{url} recursively, Wget will retrieve all
the data from the given directory tree (including the subdirectories up
to the specified depth) on the remote server, creating its mirror image
locally. @sc{ftp} retrieval is also limited by the @code{depth}
parameter.

By default, Wget will create a local directory tree, corresponding to
the one found on the remote server.

Recursive retrieving has a number of applications, the most important
of which is mirroring. It is also useful for @sc{www} presentations,
and any other occasion where slow network connections should be
bypassed by storing the files locally.

You should be warned that invoking recursion may cause grave overloading
on your system, because of the fast exchange of data through the
network; all of this may hamper other users' work. The same stands for
the foreign server you are mirroring---the more requests it gets in a
row, the greater is its load.

Careless retrieving can also fill your file system uncontrollably, which
can grind the machine to a halt.

The load can be minimized by lowering the maximum recursion level
(@samp{-l}) and/or by lowering the number of retries (@samp{-t}). You
may also consider using the @samp{-w} option to slow down your requests
to the remote servers, as well as the numerous options to narrow the
number of followed links (@pxref{Following Links}).

Recursive retrieval is a good thing when used properly. Please take all
precautions not to wreak havoc through carelessness.
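
As a sketch, a polite recursive run combining these safeguards might
look like this (the depth, retry count, wait interval, and @sc{url} are
all illustrative):

@example
wget -r -l 3 -t 3 -w 2 http://@var{site}/
@end example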

@node Following Links, Time-Stamping, Recursive Retrieval, Top
@chapter Following Links
@cindex links
@cindex following links

When retrieving recursively, one does not wish to retrieve loads of
unnecessary data. Most of the time users bear in mind exactly what
they want to download, and want Wget to follow only specific links.

For example, if you wish to download the music archive from
@samp{fly.srk.fer.hr}, you will not want to download all the home pages
that happen to be referenced by an obscure part of the archive.

Wget possesses several mechanisms that allow you to fine-tune which
links it will follow.

@menu
* Relative Links:: Follow relative links only.
* Host Checking:: Follow links on the same host.
* Domain Acceptance:: Check on a list of domains.
* All Hosts:: No host restrictions.
* Types of Files:: Getting only certain files.
* Directory-Based Limits:: Getting only certain directories.
* FTP Links:: Following FTP links.
@end menu

@node Relative Links, Host Checking, Following Links, Following Links
@section Relative Links
@cindex relative links

When only relative links are followed (option @samp{-L}), recursive
retrieving will never span hosts. No time-expensive @sc{dns}-lookups
will be performed, and the process will be very fast, with the minimum
strain on the network. This will suit your needs often, especially when
mirroring the output of various @code{x2html} converters, since they
generally output relative links.
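
A minimal sketch of such a run (the @sc{url} is illustrative):

@example
wget -r -L http://@var{site}/@var{dir}/
@end example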

@node Host Checking, Domain Acceptance, Relative Links, Following Links
@section Host Checking
@cindex DNS lookup
@cindex host lookup
@cindex host checking

The drawback of following only the relative links is that humans often
tend to mix them with absolute links to the very same host, and the very
same page. In this mode (which is the default mode for following links)
all @sc{url}s that refer to the same host will be retrieved.

The problem with this option is host and domain aliases. There is no
way for Wget to know that @samp{regoc.srce.hr} and @samp{www.srce.hr}
are the same host, or that @samp{fly.srk.fer.hr} is the same as
@samp{fly.cc.fer.hr}. Whenever an absolute link is encountered, the
host is @sc{dns}-looked-up with @code{gethostbyname} to check whether we
are perhaps dealing with the same host. Although the results of
@code{gethostbyname} are cached, it is still a great slowdown,
e.g. when dealing with large indices of home pages on different hosts
(because each of the hosts must be @sc{dns}-resolved to see whether it
just @emph{might} be an alias of the starting host).

To avoid the overhead you may use @samp{-nh}, which will turn off
@sc{dns}-resolving and make Wget compare hosts literally. This will
make things run much faster, but also much less reliably
(e.g. @samp{www.srce.hr} and @samp{regoc.srce.hr} will be flagged as
different hosts).

Note that modern @sc{http} servers allow one IP address to host several
@dfn{virtual servers}, each having its own directory hierarchy. Such
``servers'' are distinguished by their hostnames (all of which point to
the same IP address); for this to work, a client must send a @code{Host}
header, which is what Wget does. However, in that case Wget @emph{must
not} try to divine a host's ``real'' address, nor try to use the same
hostname for each access, i.e. @samp{-nh} must be turned on.

In other words, the @samp{-nh} option must be used to enable the
retrieval from virtual servers distinguished by their hostnames. As the
number of such server setups grows, the behavior of @samp{-nh} may
become the default in the future.

@node Domain Acceptance, All Hosts, Host Checking, Following Links
@section Domain Acceptance

With the @samp{-D} option you may specify the domains that will be
followed. Hosts whose domain is not in this list will not be
@sc{dns}-resolved. Thus you can specify @samp{-Dmit.edu} just to make
sure that @strong{nothing outside of @sc{mit} gets looked up}. This is
very important and useful. It also means that @samp{-D} does @emph{not}
imply @samp{-H} (span all hosts), which must be specified explicitly.
Feel free to use this option since it will speed things up, with almost
all the reliability of checking for all hosts. Thus you could invoke

@example
wget -r -D.hr http://fly.srk.fer.hr/
@end example

to make sure that only the hosts in @samp{.hr} domain get
@sc{dns}-looked-up for being equal to @samp{fly.srk.fer.hr}. So
@samp{fly.cc.fer.hr} will be checked (only once!) and found equal, but
@samp{www.gnu.ai.mit.edu} will not even be checked.

Of course, domain acceptance can be used to limit the retrieval to
particular domains with spanning of hosts in them, but then you must
specify @samp{-H} explicitly. E.g.:

@example
wget -r -H -Dmit.edu,stanford.edu http://www.mit.edu/
@end example

will start with @samp{http://www.mit.edu/}, following links across
@sc{mit} and Stanford.

If there are domains you want to exclude specifically, you can do it
with @samp{--exclude-domains}, which accepts the same type of arguments
as @samp{-D}, but will @emph{exclude} all the listed domains. For
example, if you want to download all the hosts from the @samp{foo.edu}
domain, with the exception of @samp{sunsite.foo.edu}, you can do it like
this:

@example
wget -rH -Dfoo.edu --exclude-domains sunsite.foo.edu http://www.foo.edu/
@end example

@node All Hosts, Types of Files, Domain Acceptance, Following Links
@section All Hosts
@cindex all hosts
@cindex span hosts

When @samp{-H} is specified without @samp{-D}, all hosts are freely
spanned. There are no restrictions whatsoever as to what part of the
net Wget will go to fetch documents, other than maximum retrieval depth.
If a page references @samp{www.yahoo.com}, so be it. Such an option is
rarely useful by itself.

@node Types of Files, Directory-Based Limits, All Hosts, Following Links
@section Types of Files
@cindex types of files

When downloading material from the web, you will often want to restrict
the retrieval to only certain file types. For example, if you are
interested in downloading @sc{gif}s, you will not be overjoyed to get
loads of PostScript documents, and vice versa.

Wget offers two options to deal with this problem. Each option
description lists a short name, a long name, and the equivalent command
in @file{.wgetrc}.

@cindex accept wildcards
@cindex accept suffixes
@cindex wildcards, accept
@cindex suffixes, accept
@table @samp
@item -A @var{acclist}
@itemx --accept @var{acclist}
@itemx accept = @var{acclist}
The argument to the @samp{--accept} option is a list of file suffixes
or patterns that Wget will download during recursive retrieval. A
suffix is the ending part of a file, and consists of ``normal''
letters, e.g. @samp{gif} or @samp{.jpg}. A matching pattern contains
shell-like wildcards, e.g. @samp{books*} or @samp{zelazny*196[0-9]*}.

So, specifying @samp{wget -A gif,jpg} will make Wget download only the
files ending with @samp{gif} or @samp{jpg}, i.e. @sc{gif}s and
@sc{jpeg}s. On the other hand, @samp{wget -A "zelazny*196[0-9]*"} will
download only files beginning with @samp{zelazny} and containing numbers
from 1960 to 1969 anywhere within. Look up the manual of your shell for
a description of how pattern matching works.

Of course, any number of suffixes and patterns can be combined into a
comma-separated list, and given as an argument to @samp{-A}.

@cindex reject wildcards
@cindex reject suffixes
@cindex wildcards, reject
@cindex suffixes, reject
@item -R @var{rejlist}
@itemx --reject @var{rejlist}
@itemx reject = @var{rejlist}
The @samp{--reject} option works the same way as @samp{--accept}, only
its logic is the reverse; Wget will download all files @emph{except} the
ones matching the suffixes (or patterns) in the list.

So, if you want to download a whole page except for the cumbersome
@sc{mpeg}s and @sc{.au} files, you can use @samp{wget -R mpg,mpeg,au}.
Analogously, to download all files except the ones beginning with
@samp{bjork}, use @samp{wget -R "bjork*"}. The quotes are to prevent
expansion by the shell.
@end table

The @samp{-A} and @samp{-R} options may be combined to achieve even
better fine-tuning of which files to retrieve. E.g. @samp{wget -A
"*zelazny*" -R .ps} will download all the files having @samp{zelazny} as
a part of their name, but @emph{not} the PostScript files.

Note that these two options do not affect the downloading of @sc{html}
files; Wget must load all the @sc{html}s to know where to go at
all---recursive retrieval would make no sense otherwise.

@node Directory-Based Limits, FTP Links, Types of Files, Following Links
@section Directory-Based Limits
@cindex directories
@cindex directory limits

Regardless of other link-following facilities, it is often useful to
restrict which files to retrieve based on the directories those files
are placed in. There can be many reasons for this---the home pages may
be organized in a reasonable directory structure; or some directories
may contain useless information, e.g. @file{/cgi-bin} or @file{/dev}
directories.

Wget offers three different options to deal with this requirement. Each
option description lists a short name, a long name, and the equivalent
command in @file{.wgetrc}.

@cindex directories, include
@cindex include directories
@cindex accept directories
@table @samp
@item -I @var{list}
@itemx --include @var{list}
@itemx include_directories = @var{list}
The @samp{-I} option accepts a comma-separated list of directories
included in the retrieval. Any other directories will simply be
ignored. The directories are absolute paths.

So, if you wish to download from @samp{http://host/people/bozo/}
following only links to bozo's colleagues in the @file{/people}
directory and the bogus scripts in @file{/cgi-bin}, you can specify:

@example
wget -I /people,/cgi-bin http://host/people/bozo/
@end example

@cindex directories, exclude
@cindex exclude directories
@cindex reject directories
@item -X @var{list}
@itemx --exclude @var{list}
@itemx exclude_directories = @var{list}
The @samp{-X} option is exactly the reverse of @samp{-I}---this is a
list of directories @emph{excluded} from the download. E.g. if you do
not want Wget to download things from the @file{/cgi-bin} directory,
specify @samp{-X /cgi-bin} on the command line.

The same as with @samp{-A}/@samp{-R}, these two options can be combined
to get a better fine-tuning of downloading subdirectories. E.g. if you
want to load all the files from the @file{/pub} hierarchy except for
@file{/pub/worthless}, specify @samp{-I/pub -X/pub/worthless}.

@cindex no parent
@item -np
@itemx --no-parent
@itemx no_parent = on
The simplest, and often very useful, way of limiting directories is
disallowing retrieval of the links that refer to the hierarchy
@dfn{above} the beginning directory, i.e. disallowing ascent to the
parent directory/directories.

The @samp{--no-parent} option (short @samp{-np}) is useful in this case.
Using it guarantees that you will never leave the existing hierarchy.
Supposing you issue Wget with:

@example
wget -r --no-parent http://somehost/~luzer/my-archive/
@end example

You may rest assured that none of the references to
@file{/~his-girls-homepage/} or @file{/~luzer/all-my-mpegs/} will be
followed. Only the archive you are interested in will be downloaded.
Essentially, @samp{--no-parent} is similar to
@samp{-I/~luzer/my-archive}, only it handles redirections in a more
intelligent fashion.
@end table

@node FTP Links, , Directory-Based Limits, Following Links
@section Following FTP Links
@cindex following ftp links

The rules for @sc{ftp} are somewhat specific, as it is necessary for
them to be. @sc{ftp} links in @sc{html} documents are often included
for purposes of reference, and it is often inconvenient to download them
by default.

To have @sc{ftp} links followed from @sc{html} documents, you need to
specify the @samp{--follow-ftp} option. Having done that, @sc{ftp}
links will span hosts regardless of the @samp{-H} setting. This is
logical, as @sc{ftp} links rarely point to the same host where the
@sc{http} server resides. For similar reasons, the @samp{-L} option has
no effect on such downloads. On the other hand, domain acceptance
(@samp{-D}) and suffix rules (@samp{-A} and @samp{-R}) apply normally.

Also note that followed links to @sc{ftp} directories will not be
retrieved recursively further.
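
For instance, a recursive @sc{http} run that also fetches the @sc{ftp}
links it encounters might be sketched like this (the @sc{url} is
illustrative):

@example
wget -r --follow-ftp http://@var{site}/
@end example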

@node Time-Stamping, Startup File, Following Links, Top
@chapter Time-Stamping
@cindex time-stamping
@cindex timestamping
@cindex updating the archives
@cindex incremental updating

One of the most important aspects of mirroring information from the
Internet is updating your archives.

Downloading the whole archive again and again, just to replace a few
changed files, is expensive, both in terms of wasted bandwidth and
money, and the time to do the update. This is why all the mirroring
tools offer the option of incremental updating.

Such an updating mechanism means that the remote server is scanned in
search of @dfn{new} files. Only those new files will be downloaded in
the place of the old ones.

A file is considered new if one of these two conditions is met:

@enumerate
@item
A file of that name does not already exist locally.

@item
A file of that name does exist, but the remote file was modified more
recently than the local file.
@end enumerate

To implement this, the program needs to be aware of the time of last
modification of both local and remote files. We call this information
the @dfn{time-stamp} of a file.

The time-stamping in GNU Wget is turned on using the
@samp{--timestamping} (@samp{-N}) option, or through the
@code{timestamping = on} directive in @file{.wgetrc}. With this option,
for each file it intends to download, Wget will check whether a local
file of the same name exists. If it does, and the remote file is older,
Wget will not download it.

If the local file does not exist, or the sizes of the files do not
match, Wget will download the remote file no matter what the time-stamps
say.

@menu
* Time-Stamping Usage::
* HTTP Time-Stamping Internals::
* FTP Time-Stamping Internals::
@end menu

@node Time-Stamping Usage, HTTP Time-Stamping Internals, Time-Stamping, Time-Stamping
@section Time-Stamping Usage
@cindex time-stamping usage
@cindex usage, time-stamping

The usage of time-stamping is simple. Say you would like to download a
file so that it keeps its date of modification.

@example
wget -S http://www.gnu.ai.mit.edu/
@end example

A simple @code{ls -l} shows that the time stamp on the local file equals
the state of the @code{Last-Modified} header, as returned by the server.
As you can see, the time-stamping info is preserved locally, even
without @samp{-N} (at least for @sc{http}).

Several days later, you would like Wget to check if the remote file has
changed, and download it if it has.

@example
wget -N http://www.gnu.ai.mit.edu/
@end example

Wget will ask the server for the last-modified date. If the local file
has the same timestamp as the server, or a newer one, the remote file
will not be re-fetched. However, if the remote file is more recent,
Wget will proceed to fetch it.

The same goes for @sc{ftp}. For example:

@example
wget "ftp://ftp.ifi.uio.no/pub/emacs/gnus/*"
@end example

(The quotes around that URL are to prevent the shell from trying to
interpret the @samp{*}.)

After download, a local directory listing will show that the timestamps
match those on the remote server. Reissuing the command with @samp{-N}
will make Wget re-fetch @emph{only} the files that have been modified
since the last download.

If you wished to mirror the GNU archive every week, you would use a
command like the following:

@example
wget --timestamping -r ftp://ftp.gnu.org/pub/gnu/
@end example

Note that time-stamping will only work for files for which the server
gives a timestamp. For @sc{http}, this depends on getting a
@code{Last-Modified} header. For @sc{ftp}, this depends on getting a
directory listing with dates in a format that Wget can parse
(@pxref{FTP Time-Stamping Internals}).

@node HTTP Time-Stamping Internals, FTP Time-Stamping Internals, Time-Stamping Usage, Time-Stamping
@section HTTP Time-Stamping Internals
@cindex http time-stamping

Time-stamping in @sc{http} is implemented by checking the
@code{Last-Modified} header. If you wish to retrieve the file
@file{foo.html} through @sc{http}, Wget will check whether
@file{foo.html} exists locally. If it doesn't, @file{foo.html} will be
retrieved unconditionally.

If the file does exist locally, Wget will first check its local
time-stamp (similar to the way @code{ls -l} checks it), and then send a
@code{HEAD} request to the remote server, demanding the information on
the remote file.

The @code{Last-Modified} header is examined to find which file was
modified more recently (which makes it ``newer''). If the remote file
is newer, it will be downloaded; if it is older, Wget will give
up.@footnote{As an additional check, Wget will look at the
@code{Content-Length} header, and compare the sizes; if they are not the
same, the remote file will be downloaded no matter what the time-stamp
says.}

When @samp{--backup-converted} (@samp{-K}) is specified in conjunction
with @samp{-N}, server file @samp{@var{X}} is compared to local file
@samp{@var{X}.orig}, if extant, rather than being compared to local file
@samp{@var{X}}, which will always differ if it's been converted by
@samp{--convert-links} (@samp{-k}).

Arguably, @sc{http} time-stamping should be implemented using the
@code{If-Modified-Since} request.
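
To observe this check in action, one can presumably combine @samp{-N}
with @samp{-S}, which prints the server's response headers, including
@code{Last-Modified} (the @sc{url} is illustrative):

@example
wget -N -S http://@var{site}/foo.html
@end example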

@node FTP Time-Stamping Internals, , HTTP Time-Stamping Internals, Time-Stamping
@section FTP Time-Stamping Internals
@cindex ftp time-stamping

In theory, @sc{ftp} time-stamping works much the same as @sc{http}, only
@sc{ftp} has no headers---time-stamps must be ferreted out of directory
listings.

If an @sc{ftp} download is recursive or uses globbing, Wget will use the
@sc{ftp} @code{LIST} command to get a file listing for the directory
containing the desired file(s). It will try to analyze the listing,
treating it like Unix @code{ls -l} output, extracting the time-stamps.
The rest is exactly the same as for @sc{http}. Note that when
retrieving individual files from an @sc{ftp} server without using
globbing or recursion, listing files will not be downloaded (and thus
files will not be time-stamped) unless @samp{-N} is specified.

The assumption that every directory listing is a Unix-style listing may
sound extremely constraining, but in practice it is not, as many
non-Unix @sc{ftp} servers use the Unixoid listing format because most
(all?) of the clients understand it. Bear in mind that @sc{rfc959}
defines no standard way to get a file list, let alone the time-stamps.
We can only hope that a future standard will define this.

Another non-standard solution is the @code{MDTM} command, supported by
some @sc{ftp} servers (including the popular @code{wu-ftpd}), which
returns the exact time of the specified file. Wget may support this
command in the future.

@node Startup File, Examples, Time-Stamping, Top
@chapter Startup File
@cindex startup file
@cindex wgetrc
@cindex .wgetrc
@cindex startup
@cindex .netrc

Once you know how to change default settings of Wget through command
line arguments, you may wish to make some of those settings permanent.
You can do that in a convenient way by creating the Wget startup
file---@file{.wgetrc}.

While @file{.wgetrc} is the ``main'' initialization file, it is also
convenient to have a special facility for storing passwords. Thus Wget
reads and interprets the contents of @file{$HOME/.netrc}, if it finds
it. You can find the @file{.netrc} format in your system manuals.

Wget reads @file{.wgetrc} upon startup, recognizing a limited set of
commands.

@menu
* Wgetrc Location:: Location of various wgetrc files.
* Wgetrc Syntax:: Syntax of wgetrc.
* Wgetrc Commands:: List of available commands.
* Sample Wgetrc:: A wgetrc example.
@end menu

@node Wgetrc Location, Wgetrc Syntax, Startup File, Startup File
@section Wgetrc Location
@cindex wgetrc location
@cindex location of wgetrc

When initializing, Wget will look for a @dfn{global} startup file,
@file{/usr/local/etc/wgetrc} by default (or some prefix other than
@file{/usr/local}, if Wget was not installed there) and read commands
from there, if it exists.

Then it will look for the user's file. If the environment variable
@code{WGETRC} is set, Wget will try to load that file. Failing that, no
further attempts will be made.

If @code{WGETRC} is not set, Wget will try to load @file{$HOME/.wgetrc}.

The fact that the user's settings are loaded after the system-wide ones
means that in case of collision the user's wgetrc @emph{overrides} the
system-wide wgetrc (in @file{/usr/local/etc/wgetrc} by default).
Fascist admins, away!
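
For example, to point Wget at a one-off startup file for a single run,
you could set the variable on the command line, assuming a Bourne-style
shell (the path is illustrative):

@example
WGETRC=/tmp/test.wgetrc wget http://@var{site}/
@end example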

@node Wgetrc Syntax, Wgetrc Commands, Wgetrc Location, Startup File
@section Wgetrc Syntax
@cindex wgetrc syntax
@cindex syntax of wgetrc

The syntax of a wgetrc command is simple:

@example
variable = value
@end example

The @dfn{variable} will also be called @dfn{command}. Valid
@dfn{values} are different for different commands.

The commands are case-insensitive and underscore-insensitive. Thus
@samp{DIr__PrefiX} is the same as @samp{dirprefix}. Empty lines, lines
beginning with @samp{#} and lines containing white-space only are
discarded.

Commands that expect a comma-separated list will clear the list on an
empty command. So, if you wish to reset the rejection list specified in
the global @file{wgetrc}, you can do it with:

@example
reject =
@end example

@node Wgetrc Commands, Sample Wgetrc, Wgetrc Syntax, Startup File
@section Wgetrc Commands
@cindex wgetrc commands

The complete set of commands is listed below. Legal values are listed
after the @samp{=}. Simple Boolean values can be set or unset using
@samp{on} and @samp{off} or @samp{1} and @samp{0}. A fancier kind of
Boolean allowed in some cases is the @dfn{lockable Boolean}, which may
be set to @samp{on}, @samp{off}, @samp{always}, or @samp{never}. If an
option is set to @samp{always} or @samp{never}, that value will be
locked in for the duration of the Wget invocation---commandline options
will not override.
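
For instance, a line like the following in the global @file{wgetrc}
would lock passive @sc{ftp} off regardless of the command line (see the
@samp{passive_ftp} entry below for the rationale):

@example
passive_ftp = never
@end example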

Some commands take pseudo-arbitrary values. @var{address} values can be
hostnames or dotted-quad IP addresses. @var{n} can be any positive
integer, or @samp{inf} for infinity, where appropriate. @var{string}
values can be any non-empty string.

Most of these commands have commandline equivalents (@pxref{Invoking}),
though some of the more obscure or rarely used ones do not.

@table @asis
@item accept/reject = @var{string}
Same as @samp{-A}/@samp{-R} (@pxref{Types of Files}).

@item add_hostdir = on/off
Enable/disable host-prefixed file names. @samp{-nH} disables it.

@item continue = on/off
If set to on, force continuation of preexistent partially retrieved
files. See @samp{-c} before setting it.

@item background = on/off
Enable/disable going to background---the same as @samp{-b} (which
enables it).

@item backup_converted = on/off
Enable/disable saving pre-converted files with the suffix
@samp{.orig}---the same as @samp{-K} (which enables it).

@c @item backups = @var{number}
@c #### Document me!
@c
@item base = @var{string}
Consider relative @sc{url}s in @sc{url} input files forced to be
interpreted as @sc{html} as being relative to @var{string}---the same as
@samp{-B}.

@item bind_address = @var{address}
Bind to @var{address}, like the @samp{--bind-address} option.

@item cache = on/off
When set to off, disallow server-caching. See the @samp{-C} option.

@item convert_links = on/off
Convert non-relative links locally. The same as @samp{-k}.

@item cookies = on/off
When set to off, disallow cookies. See the @samp{--cookies} option.

@item load_cookies = @var{file}
Load cookies from @var{file}. See @samp{--load-cookies}.

@item save_cookies = @var{file}
Save cookies to @var{file}. See @samp{--save-cookies}.

@item cut_dirs = @var{n}
Ignore @var{n} remote directory components.

@item debug = on/off
Debug mode, same as @samp{-d}.

@item delete_after = on/off
Delete after download---the same as @samp{--delete-after}.

@item dir_prefix = @var{string}
Top of directory tree---the same as @samp{-P}.

@item dirstruct = on/off
Turning dirstruct on or off---the same as @samp{-x} or @samp{-nd},
respectively.

@item domains = @var{string}
Same as @samp{-D} (@pxref{Domain Acceptance}).

@item dot_bytes = @var{n}
Specify the number of bytes ``contained'' in a dot, as seen throughout
the retrieval (1024 by default). You can postfix the value with
@samp{k} or @samp{m}, representing kilobytes and megabytes,
respectively. With dot settings you can tailor the dot retrieval to
suit your needs, or you can use the predefined @dfn{styles}
(@pxref{Download Options}).

@item dots_in_line = @var{n}
Specify the number of dots that will be printed in each line throughout
the retrieval (50 by default).

@item dot_spacing = @var{n}
Specify the number of dots in a single cluster (10 by default).

@item dot_style = @var{string}
Specify the dot retrieval @dfn{style}, as with @samp{--dot-style}.
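
Put together, a custom dot display might be configured with a sketch
like this in @file{.wgetrc} (the values are illustrative):

@example
dot_bytes = 8k
dot_spacing = 16
dots_in_line = 48
@end example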

@item exclude_directories = @var{string}
Specify a comma-separated list of directories you wish to exclude from
download---the same as @samp{-X} (@pxref{Directory-Based Limits}).

@item exclude_domains = @var{string}
Same as @samp{--exclude-domains} (@pxref{Domain Acceptance}).

@item follow_ftp = on/off
Follow @sc{ftp} links from @sc{html} documents---the same as @samp{-f}.

@item follow_tags = @var{string}
Only follow certain HTML tags when doing a recursive retrieval, just like
@samp{--follow-tags}.

@item force_html = on/off
If set to on, force the input filename to be regarded as an @sc{html}
document---the same as @samp{-F}.

@item ftp_proxy = @var{string}
Use @var{string} as @sc{ftp} proxy, instead of the one specified in the
environment.

@item glob = on/off
Turn globbing on/off---the same as @samp{-g}.

@item header = @var{string}
Define an additional header, like @samp{--header}.

@item html_extension = on/off
Add a @samp{.html} extension to @samp{text/html} files without it, like
@samp{-E}.

@item http_passwd = @var{string}
Set @sc{http} password.

@item http_proxy = @var{string}
Use @var{string} as @sc{http} proxy, instead of the one specified in the
environment.

@item http_user = @var{string}
Set @sc{http} user to @var{string}.

@item ignore_length = on/off
When set to on, ignore the @code{Content-Length} header; the same as
@samp{--ignore-length}.

@item ignore_tags = @var{string}
Ignore certain HTML tags when doing a recursive retrieval, just like
@samp{-G} / @samp{--ignore-tags}.

@item include_directories = @var{string}
Specify a comma-separated list of directories you wish to follow when
downloading---the same as @samp{-I}.

@item input = @var{string}
Read the @sc{url}s from @var{string}, like @samp{-i}.

@item kill_longer = on/off
Consider data longer than specified in the @code{Content-Length} header
as invalid (and retry getting it). The default behaviour is to save as
much data as there is, provided it amounts to at least the value given
in @code{Content-Length}.

@item logfile = @var{string}
Set the logfile---the same as @samp{-o}.

@item login = @var{string}
Your user name on the remote machine, for @sc{ftp}. Defaults to
@samp{anonymous}.

@item mirror = on/off
Turn mirroring on/off. The same as @samp{-m}.

@item netrc = on/off
Turn reading netrc on or off.

@item noclobber = on/off
Same as @samp{-nc}.

@item no_parent = on/off
Disallow retrieving outside the directory hierarchy, like
@samp{--no-parent} (@pxref{Directory-Based Limits}).

@item no_proxy = @var{string}
Use @var{string} as the comma-separated list of domains to avoid in
proxy loading, instead of the one specified in the environment.

@item output_document = @var{string}
Set the output filename---the same as @samp{-O}.

@item page_requisites = on/off
Download all ancillary documents necessary for a single HTML page to
display properly---the same as @samp{-p}.

@item passive_ftp = on/off/always/never
Set passive @sc{ftp}---the same as @samp{--passive-ftp}. Some scripts
and @samp{.pm} (Perl module) files download files using @samp{wget
--passive-ftp}. If your firewall does not allow this, you can set
@samp{passive_ftp = never} to override the commandline.

@item passwd = @var{string}
Set your @sc{ftp} password to @var{string}. Without this setting, the
password defaults to @samp{username@@hostname.domainname}.

@item proxy_user = @var{string}
Set proxy authentication user name to @var{string}, like
@samp{--proxy-user}.

@item proxy_passwd = @var{string}
Set proxy authentication password to @var{string}, like
@samp{--proxy-passwd}.

@item referer = @var{string}
Set HTTP @samp{Referer:} header just like @samp{--referer}. (Note it
was the folks who wrote the @sc{http} spec who got the spelling of
``referrer'' wrong.)

@item quiet = on/off
Quiet mode---the same as @samp{-q}.

@item quota = @var{quota}
Specify the download quota, which is useful to put in the global
@file{wgetrc}. When download quota is specified, Wget will stop
retrieving after the download sum has become greater than quota. The
quota can be specified in bytes (default), kbytes (@samp{k} appended) or
mbytes (@samp{m} appended). Thus @samp{quota = 5m} will set the quota
to 5 mbytes. Note that the user's startup file overrides system
settings.

@item reclevel = @var{n}
Recursion level---the same as @samp{-l}.

@item recursive = on/off
Recursive on/off---the same as @samp{-r}.

@item relative_only = on/off
Follow only relative links---the same as @samp{-L} (@pxref{Relative
Links}).

@item remove_listing = on/off
If set to on, remove @sc{ftp} listings downloaded by Wget. Setting it
to off is the same as @samp{-nr}.

@item retr_symlinks = on/off
When set to on, retrieve symbolic links as if they were plain files; the
same as @samp{--retr-symlinks}.

@item robots = on/off
Use (or not) @file{/robots.txt} file (@pxref{Robots}). Be sure to know
what you are doing before changing the default (which is @samp{on}).

@item server_response = on/off
Choose whether or not to print the @sc{http} and @sc{ftp} server
responses---the same as @samp{-S}.

@item simple_host_check = on/off
Same as @samp{-nh} (@pxref{Host Checking}).

@item span_hosts = on/off
Same as @samp{-H}.

@item timeout = @var{n}
Set timeout value---the same as @samp{-T}.

@item timestamping = on/off
Turn timestamping on/off. The same as @samp{-N} (@pxref{Time-Stamping}).

@item tries = @var{n}
Set number of retries per @sc{url}---the same as @samp{-t}.

@item use_proxy = on/off
Turn proxy support on/off. The same as @samp{-Y}.

@item verbose = on/off
Turn verbose on/off---the same as @samp{-v}/@samp{-nv}.

@item wait = @var{n}
Wait @var{n} seconds between retrievals---the same as @samp{-w}.

@item waitretry = @var{n}
Wait up to @var{n} seconds between retries of failed retrievals
only---the same as @samp{--waitretry}. Note that this is turned on by
default in the global @file{wgetrc}.
@end table

@node Sample Wgetrc, , Wgetrc Commands, Startup File
@section Sample Wgetrc
@cindex sample wgetrc

This is the sample initialization file, as given in the distribution.
It is divided into two sections---one for global usage (suitable for the
global startup file), and one for local usage (suitable for
@file{$HOME/.wgetrc}). Be careful about the things you change.

Note that almost all the lines are commented out. For a command to have
any effect, you must remove the @samp{#} character at the beginning of
its line.

@example
@include sample.wgetrc.munged_for_texi_inclusion
@end example

@node Examples, Various, Startup File, Top
@chapter Examples
@cindex examples

The examples are classified into three sections for clarity. The first
section is a tutorial for beginners. The second section explains some
of the more complex program features. The third section contains advice
for mirror administrators, as well as even more complex features (that
some would call perverted).

@menu
* Simple Usage:: Simple, basic usage of the program.
* Advanced Usage:: Advanced techniques of usage.
* Guru Usage:: Mirroring and the hairy stuff.
@end menu

@node Simple Usage, Advanced Usage, Examples, Examples
@section Simple Usage

@itemize @bullet
@item
Say you want to download a @sc{url}. Just type:

@example
wget http://fly.srk.fer.hr/
@end example

The response will be something like:

@example
@group
--13:30:45--  http://fly.srk.fer.hr:80/en/
           => `index.html'
Connecting to fly.srk.fer.hr:80... connected!
HTTP request sent, awaiting response... 200 OK
Length: 4,694 [text/html]

    0K -> ....                                                 [100%]

13:30:46 (23.75 KB/s) - `index.html' saved [4694/4694]
@end group
@end example

@item
But what will happen if the connection is slow, and the file is lengthy?
The connection will probably fail before the whole file is retrieved,
more than once. In this case, Wget will try getting the file until it
either gets the whole of it, or exceeds the default number of retries
(this being 20). It is easy to change the number of tries to 45, to
ensure that the whole file will arrive safely:

@example
wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg
@end example

@item
Now let's leave Wget to work in the background, and write its progress
to log file @file{log}. It is tiring to type @samp{--tries}, so we
shall use @samp{-t}.

@example
wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &
@end example

The ampersand at the end of the line makes sure that Wget works in the
background. To unlimit the number of retries, use @samp{-t inf}.

@item
The usage of @sc{ftp} is just as simple. Wget will take care of login
and password.

@example
@group
$ wget ftp://gnjilux.srk.fer.hr/welcome.msg
--10:08:47--  ftp://gnjilux.srk.fer.hr:21/welcome.msg
           => `welcome.msg'
Connecting to gnjilux.srk.fer.hr:21... connected!
Logging in as anonymous ... Logged in!
==> TYPE I ... done.  ==> CWD not needed.
==> PORT ... done.    ==> RETR welcome.msg ... done.
Length: 1,340 (unauthoritative)

    0K -> .                                                    [100%]

10:08:48 (1.28 MB/s) - `welcome.msg' saved [1340]
@end group
@end example

@item
If you specify a directory, Wget will retrieve the directory listing,
parse it and convert it to @sc{html}. Try:

@example
wget ftp://prep.ai.mit.edu/pub/gnu/
lynx index.html
@end example
@end itemize

@node Advanced Usage, Guru Usage, Simple Usage, Examples
@section Advanced Usage

@itemize @bullet
@item
Would you like to read the list of @sc{url}s from a file?  Not a
problem:

@example
wget -i file
@end example

If you specify @samp{-} as file name, the @sc{url}s will be read from
standard input.

@item
Create a mirror image of the GNU @sc{www} site (with the same directory
structure the original has), with only one try per document, saving the
log of the activities to @file{gnulog}:

@example
wget -r -t1 http://www.gnu.ai.mit.edu/ -o gnulog
@end example

@item
Retrieve the first layer of Yahoo links:

@example
wget -r -l1 http://www.yahoo.com/
@end example

@item
Retrieve the @file{index.html} of @samp{www.lycos.com}, showing the
original server headers:

@example
wget -S http://www.lycos.com/
@end example

@item
Save the server headers with the file:

@example
wget -s http://www.lycos.com/
more index.html
@end example

@item
Retrieve the first two levels of @samp{wuarchive.wustl.edu}, saving them
to @file{/tmp}:

@example
wget -P/tmp -l2 ftp://wuarchive.wustl.edu/
@end example

@item
You want to download all the @sc{gif}s from an @sc{http} directory.
@samp{wget http://host/dir/*.gif} doesn't work, since @sc{http}
retrieval does not support globbing.  In that case, use:

@example
wget -r -l1 --no-parent -A.gif http://host/dir/
@end example

It is a bit of a kludge, but it works.  @samp{-r -l1} means to retrieve
recursively (@pxref{Recursive Retrieval}), with maximum depth of 1.
@samp{--no-parent} means that references to the parent directory are
ignored (@pxref{Directory-Based Limits}), and @samp{-A.gif} means to
download only the @sc{gif} files.  @samp{-A "*.gif"} would have worked
too.

@item
Suppose you were in the middle of downloading when Wget was
interrupted.  Now you do not want to clobber the files already present.
It would be:

@example
wget -nc -r http://www.gnu.ai.mit.edu/
@end example

@item
If you want to encode your own username and password to @sc{http} or
@sc{ftp}, use the appropriate @sc{url} syntax (@pxref{URL Format}).

@example
wget ftp://hniksic:mypassword@@jagor.srce.hr/.emacs
@end example

@item
If you do not like the default retrieval visualization (1K dots with 10
dots per cluster and 50 dots per line), you can customize it through dot
settings (@pxref{Wgetrc Commands}).  For example, many people like the
``binary'' style of retrieval, with 8K dots and 512K lines:

@example
wget --dot-style=binary ftp://prep.ai.mit.edu/pub/gnu/README
@end example

You can experiment with other styles, like:

@example
wget --dot-style=mega ftp://ftp.xemacs.org/pub/xemacs/xemacs-20.4/xemacs-20.4.tar.gz
wget --dot-style=micro http://fly.srk.fer.hr/
@end example

To make these settings permanent, put them in your @file{.wgetrc}, as
described before (@pxref{Sample Wgetrc}).
@end itemize

@node Guru Usage,  , Advanced Usage, Examples
@section Guru Usage

@cindex mirroring
@itemize @bullet
@item
If you wish Wget to keep a mirror of a page (or @sc{ftp}
subdirectories), use @samp{--mirror} (@samp{-m}), which is the shorthand
for @samp{-r -N}.  You can put Wget in the crontab file asking it to
recheck a site each Sunday:

@example
crontab
0 0 * * 0 wget --mirror ftp://ftp.xemacs.org/pub/xemacs/ -o /home/me/weeklog
@end example

@item
You may wish to do the same with someone's home page.  But you do not
want to download all those images---you're only interested in @sc{html}.

@example
wget --mirror -A.html http://www.w3.org/
@end example

@item
But what about mirroring the hosts networkologically close to you?  It
seems so awfully slow because of all that @sc{dns} resolving.  Just use
@samp{-D} (@pxref{Domain Acceptance}).

@example
wget -rN -Dsrce.hr http://www.srce.hr/
@end example

Now Wget will correctly find out that @samp{regoc.srce.hr} is the same
as @samp{www.srce.hr}, but will not even take into consideration the
link to @samp{www.mit.edu}.

@item
You have a presentation and would like the dumb absolute links to be
converted to relative ones?  Use @samp{-k}:

@example
wget -k -r @var{URL}
@end example

@cindex redirecting output
@item
You would like the output documents to go to standard output instead of
to files?  OK, but Wget will automatically shut up (turn on
@samp{--quiet}) to prevent mixing of Wget output and the retrieved
documents.

@example
wget -O - http://jagor.srce.hr/ http://www.srce.hr/
@end example

You can also combine the two options and make weird pipelines to
retrieve the documents from remote hotlists:

@example
wget -O - http://cool.list.com/ | wget --force-html -i -
@end example
@end itemize

@node Various, Appendices, Examples, Top
@chapter Various
@cindex various

This chapter contains all the stuff that could not fit anywhere else.

@menu
* Proxies::             Support for proxy servers.
* Distribution::        Getting the latest version.
* Mailing List::        Wget mailing list for announcements and discussion.
* Reporting Bugs::      How and where to report bugs.
* Portability::         The systems Wget works on.
* Signals::             Signal-handling performed by Wget.
@end menu

@node Proxies, Distribution, Various, Various
@section Proxies
@cindex proxies

@dfn{Proxies} are special-purpose @sc{http} servers designed to transfer
data from remote servers to local clients.  One typical use of proxies
is lightening network load for users behind a slow connection.  This is
achieved by channeling all @sc{http} and @sc{ftp} requests through the
proxy, which caches the transferred data.  When a cached resource is
requested again, the proxy will return the data from its cache.  Another
use for proxies is for companies that separate (for security reasons)
their internal networks from the rest of the Internet.  In order to
obtain information from the Web, their users connect and retrieve remote
data using an authorized proxy.

Wget supports proxies for both @sc{http} and @sc{ftp} retrievals.  The
standard way to specify proxy location, which Wget recognizes, is using
the following environment variables:

@table @code
@item http_proxy
This variable should contain the @sc{url} of the proxy for @sc{http}
connections.

@item ftp_proxy
This variable should contain the @sc{url} of the proxy for @sc{ftp}
connections.  It is quite common that @sc{http_proxy} and @sc{ftp_proxy}
are set to the same @sc{url}.

@item no_proxy
This variable should contain a comma-separated list of domain extensions
that the proxy should @emph{not} be used for.  For instance, if the
value of @code{no_proxy} is @samp{.mit.edu}, the proxy will not be used
to retrieve documents from MIT.
@end table
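
For example, on a Bourne-compatible shell you might point both variables
at the same caching proxy; the host name and port below are made up for
illustration:

@example
# Hypothetical proxy location; substitute your own.
http_proxy=http://proxy.company.com:8001/
ftp_proxy=http://proxy.company.com:8001/
no_proxy=.company.com
export http_proxy ftp_proxy no_proxy
@end example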

In addition to the environment variables, proxy location and settings
may be specified from within Wget itself.

@table @samp
@item -Y on/off
@itemx --proxy=on/off
@itemx proxy = on/off
This option may be used to turn proxy support on or off.  Proxy
support is on by default, provided that the appropriate environment
variables are set.

@item http_proxy = @var{URL}
@itemx ftp_proxy = @var{URL}
@itemx no_proxy = @var{string}
These startup file variables allow you to override the proxy settings
specified by the environment.
@end table
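
For instance, to make a single retrieval bypass an otherwise configured
proxy, you could turn proxy support off on the command line:

@example
wget -Y off http://fly.srk.fer.hr/
@end example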

Some proxy servers require authorization to enable you to use them.  The
authorization consists of a @dfn{username} and a @dfn{password}, which
must be sent by Wget.  As with @sc{http} authorization, several
authentication schemes exist.  For proxy authorization only the
@code{Basic} authentication scheme is currently implemented.

You may specify your username and password either through the proxy
@sc{url} or through the command-line options.  Assuming that the
company's proxy is located at @samp{proxy.company.com} at port 8001, a
proxy @sc{url} location containing authorization data might look like
this:

@example
http://hniksic:mypassword@@proxy.company.com:8001/
@end example

Alternatively, you may use the @samp{proxy-user} and
@samp{proxy-password} options, and the equivalent @file{.wgetrc}
settings @code{proxy_user} and @code{proxy_passwd} to set the proxy
username and password.
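
As an illustration, the equivalent @file{.wgetrc} setup might read as
follows; the values are, again, made up:

@example
http_proxy = http://proxy.company.com:8001/
proxy_user = hniksic
proxy_passwd = mypassword
@end example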

@node Distribution, Mailing List, Proxies, Various
@section Distribution
@cindex latest version

Like all GNU utilities, the latest version of Wget can be found at the
master GNU archive site prep.ai.mit.edu, and its mirrors.  For example,
Wget @value{VERSION} can be found at
@url{ftp://prep.ai.mit.edu/gnu/wget/wget-@value{VERSION}.tar.gz}.

@node Mailing List, Reporting Bugs, Distribution, Various
@section Mailing List
@cindex mailing list
@cindex list

Wget has its own mailing list at @email{wget@@sunsite.auc.dk}, thanks
to Karsten Thygesen.  The mailing list is for discussion of Wget
features and the web, reporting Wget bugs (those that you think may be
of interest to the public) and mailing announcements.  You are welcome
to subscribe.  The more people on the list, the better!

To subscribe, send mail to @email{wget-subscribe@@sunsite.auc.dk} with
the magic word @samp{subscribe} in the subject line.  Unsubscribe by
mailing to @email{wget-unsubscribe@@sunsite.auc.dk}.

The mailing list is archived at @url{http://fly.srk.fer.hr/archive/wget}.
An alternative archive is available at
@url{http://www.mail-archive.com/wget%40sunsite.auc.dk/}.

@node Reporting Bugs, Portability, Mailing List, Various
@section Reporting Bugs
@cindex bugs
@cindex reporting bugs
@cindex bug reports

@c man begin BUGS
You are welcome to send bug reports about GNU Wget to
@email{bug-wget@@gnu.org}.

Before actually submitting a bug report, please try to follow a few
simple guidelines.

@enumerate
@item
Please try to ascertain that the behaviour you see really is a bug.  If
Wget crashes, it's a bug.  If Wget does not behave as documented, it's a
bug.  If things behave strangely, but you are not sure about the way
they are supposed to work, it might well be a bug.

@item
Try to repeat the bug in as simple circumstances as possible.  E.g. if
Wget crashes on @samp{wget -rLl0 -t5 -Y0 http://yoyodyne.com -o
/tmp/log}, you should try to see if it will crash with a simpler set of
options.

Also, while I will probably be interested to know the contents of your
@file{.wgetrc} file, just dumping it into the debug message is probably
a bad idea.  Instead, you should first try to see if the bug repeats
with @file{.wgetrc} moved out of the way.  Only if it turns out that
@file{.wgetrc} settings affect the bug should you mail me the relevant
parts of the file.

@item
Please start Wget with the @samp{-d} option and send the log (or the
relevant parts of it).  If Wget was compiled without debug support,
recompile it.  It is @emph{much} easier to trace bugs with debug support
on; a sample invocation is shown after this list.

@item
If Wget has crashed, try to run it in a debugger, e.g. @code{gdb `which
wget` core} and type @code{where} to get the backtrace.

@item
Find where the bug is, fix it and send me the patches. :-)
@end enumerate
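
For instance, a minimal debug run might look like the following (the
@sc{url} is just an example).  Since the debug messages go through the
regular logging machinery, @samp{-o} captures them to a file:

@example
wget -d http://fly.srk.fer.hr/ -o /tmp/wget-debug.log
@end example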

@c man end

@node Portability, Signals, Reporting Bugs, Various
@section Portability
@cindex portability
@cindex operating systems

Since Wget uses GNU Autoconf for building and configuring, and avoids
using ``special'' ultra--mega--cool features of any particular Unix, it
should compile (and work) on all common Unix flavors.

Various Wget versions have been compiled and tested under many kinds of
Unix systems, including Solaris, Linux, SunOS, OSF (aka Digital Unix),
Ultrix, *BSD, IRIX, and others; refer to the file @file{MACHINES} in the
distribution directory for a comprehensive list.  If you compile it on
an architecture not listed there, please let me know so I can update it.

Wget should also compile on other Unix systems not listed in
@file{MACHINES}.  If it doesn't, please let me know.

Thanks to kind contributors, this version of Wget compiles and works on
Microsoft Windows 95 and Windows NT platforms.  It has been compiled
successfully using MS Visual C++ 4.0, Watcom, and Borland C compilers,
with Winsock as networking software.  Naturally, it lacks some of the
features available on Unix, but it should work as a substitute for
people stuck with Windows.  Note that the Windows port is
@strong{neither tested nor maintained} by me---all questions and
problems should be reported to the Wget mailing list at
@email{wget@@sunsite.auc.dk} where the maintainers will look at them.

@node Signals,  , Portability, Various
@section Signals
@cindex signal handling
@cindex hangup

Since the purpose of Wget is background work, it catches the hangup
signal (@code{SIGHUP}) and ignores it.  If the output was on standard
output, it will be redirected to a file named @file{wget-log}.
Otherwise, @code{SIGHUP} is ignored.  This is convenient when you wish
to redirect the output of Wget after having started it.

@example
$ wget http://www.ifi.uio.no/~larsi/gnus.tar.gz &
$ kill -HUP %%     # Redirect the output to wget-log
@end example

Other than that, Wget will not try to interfere with signals in any way.
@kbd{C-c}, @code{kill -TERM} and @code{kill -KILL} should kill it alike.

@node Appendices, Copying, Various, Top
@chapter Appendices

This chapter contains some references I consider useful.

@menu
* Robots::                  Wget as a WWW robot.
* Security Considerations:: Security with Wget.
* Contributors::            People who helped.
@end menu

@node Robots, Security Considerations, Appendices, Appendices
@section Robots
@cindex robots
@cindex robots.txt
@cindex server maintenance

It is extremely easy to make Wget wander aimlessly around a web site,
sucking all the available data in the process.  @samp{wget -r
@var{site}}, and you're set.  Great?  Not for the server admin.

While Wget is retrieving static pages, there's not much of a problem.
But for Wget, there is no real difference between the smallest static
page and the hardest, most demanding CGI or dynamic page.  For instance,
a site I know has a section handled by an, uh, bitchin' CGI script that
converts all the Info files to @sc{html}.  The script can and does bring
the machine to its knees without providing anything useful to the
downloader.

For such and similar cases various robot exclusion schemes have been
devised as a means for the server administrators and document authors to
protect chosen portions of their sites from the wandering of robots.

The more popular mechanism is the @dfn{Robots Exclusion Standard}
written by Martijn Koster et al. in 1994.  It is specified by placing a
file named @file{/robots.txt} in the server root, which the robots are
supposed to download and parse.  Wget supports this specification.
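
For reference, a @file{/robots.txt} that turns all robots away from a
@file{/cgi-bin/} subtree, while leaving the rest of the site open to
them, would look like this (the directory name is only an example):

@example
User-agent: *
Disallow: /cgi-bin/
@end example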

Norobots support is turned on only when retrieving recursively, and
@emph{never} for the first page.  Thus, you may issue:

@example
wget -r http://fly.srk.fer.hr/
@end example

First the index of fly.srk.fer.hr will be downloaded.  If Wget finds
anything worth downloading on the same host, only @emph{then} will it
load the robots, and decide whether or not to load the links after all.
@file{/robots.txt} is loaded only once per host.

Note that the exclusion standard discussed here has undergone some
revisions.  However, Wget supports only the first version of @sc{res},
the one written by Martijn Koster in 1994, available at
@url{http://info.webcrawler.com/mak/projects/robots/norobots.html}.  A
later version exists in the form of an internet draft
<draft-koster-robots-00.txt> titled ``A Method for Web Robots Control'',
which expired on June 4, 1997.  I am not aware if it ever made it to an
@sc{rfc}.  The text of the draft is available at
@url{http://info.webcrawler.com/mak/projects/robots/norobots-rfc.html}.
Wget does not yet support the new directives specified by this draft,
but we plan to add them.

This manual no longer includes the text of the old standard.

The second, lesser-known mechanism enables the author of an individual
document to specify whether they want the links from the file to be
followed by a robot.  This is achieved using the @code{META} tag, like
this:

@example
<meta name="robots" content="nofollow">
@end example

This is explained in some detail at
@url{http://info.webcrawler.com/mak/projects/robots/meta-user.html}.
Wget supports this method of robot exclusion in addition to the usual
@file{/robots.txt} exclusion.

@node Security Considerations, Contributors, Robots, Appendices
@section Security Considerations
@cindex security

When using Wget, you must be aware that it sends unencrypted passwords
through the network, which may present a security problem.  Here are the
main issues, and some solutions.

@enumerate
@item
The passwords on the command line are visible using @code{ps}.  If this
is a problem, avoid putting passwords on the command line---e.g. you
can use @file{.netrc} for this; a sample entry is shown after this list.

@item
Using the insecure @dfn{basic} authentication scheme, unencrypted
passwords are transmitted through the network routers and gateways.

@item
The @sc{ftp} passwords are also in no way encrypted.  There is no good
solution for this at the moment.

@item
Although the ``normal'' output of Wget tries to hide the passwords,
debugging logs show them, in all forms.  This problem is avoided by
being careful when you send debug logs (yes, even when you send them to
me).
@end enumerate
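
As mentioned above, @file{.netrc} can keep the passwords off the command
line.  A minimal entry might look like this---the host, login and
password are made up, and the file should be readable by you only:

@example
machine jagor.srce.hr login hniksic password mypassword
@end example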

@node Contributors,  , Security Considerations, Appendices
@section Contributors
@cindex contributors

@iftex
GNU Wget was written by Hrvoje Nik@v{s}i@'{c} @email{hniksic@@arsdigita.com}.
@end iftex
@ifinfo
GNU Wget was written by Hrvoje Niksic @email{hniksic@@arsdigita.com}.
@end ifinfo
However, its development could never have gone as far as it has, were it
not for the help of many people, either with bug reports, feature
proposals, patches, or letters saying ``Thanks!''.

Special thanks goes to the following people (no particular order):

@itemize @bullet
@item
Karsten Thygesen---donated system resources such as the mailing list,
web space, and @sc{ftp} space, along with a lot of time to make these
actually work.

@item
Shawn McHorse---bug reports and patches.

@item
Kaveh R. Ghazi---on-the-fly @code{ansi2knr}-ization.  Lots of
portability fixes.

@item
Gordon Matzigkeit---@file{.netrc} support.

@item
@iftex
Zlatko @v{C}alu@v{s}i@'{c}, Tomislav Vujec and Dra@v{z}en
Ka@v{c}ar---feature suggestions and ``philosophical'' discussions.
@end iftex
@ifinfo
Zlatko Calusic, Tomislav Vujec and Drazen Kacar---feature suggestions
and ``philosophical'' discussions.
@end ifinfo

@item
Darko Budor---initial port to Windows.

@item
Antonio Rosella---help and suggestions, plus the Italian translation.

@item
@iftex
Tomislav Petrovi@'{c}, Mario Miko@v{c}evi@'{c}---many bug reports and
suggestions.
@end iftex
@ifinfo
Tomislav Petrovic, Mario Mikocevic---many bug reports and suggestions.
@end ifinfo

@item
@iftex
Fran@,{c}ois Pinard---many thorough bug reports and discussions.
@end iftex
@ifinfo
Francois Pinard---many thorough bug reports and discussions.
@end ifinfo

@item
Karl Eichwalder---lots of help with internationalization and other
things.

@item
Junio Hamano---donated support for Opie and @sc{http} @code{Digest}
authentication.

@item
Brian Gough---a generous donation.
@end itemize

The following people have provided patches, bug/build reports, useful
suggestions, beta testing services, fan mail and all the other things
that make maintenance so much fun:

Tim Adam,
Adrian Aichner,
Martin Baehr,
Dieter Baron,
Roger Beeman and the Gurus at Cisco,
Dan Berger,
Mark Boyns,
John Burden,
Wanderlei Cavassin,
Gilles Cedoc,
Tim Charron,
Noel Cragg,
@iftex
Kristijan @v{C}onka@v{s},
@end iftex
@ifinfo
Kristijan Conkas,
@end ifinfo
John Daily,
Andrew Davison,
Andrew Deryabin,
Ulrich Drepper,
Marc Duponcheel,
@iftex
Damir D@v{z}eko,
@end iftex
@ifinfo
Damir Dzeko,
@end ifinfo
@iftex
Aleksandar Erkalovi@'{c},
@end iftex
@ifinfo
Aleksandar Erkalovic,
@end ifinfo
Andy Eskilsson,
Masashi Fujita,
Howard Gayle,
Marcel Gerrits,
Hans Grobler,
Mathieu Guillaume,
Dan Harkless,
Heiko Herold,
Karl Heuer,
HIROSE Masaaki,
Gregor Hoffleit,
Erik Magnus Hulthen,
Richard Huveneers,
Simon Josefsson,
@iftex
Mario Juri@'{c},
@end iftex
@ifinfo
Mario Juric,
@end ifinfo
Const Kaplinsky,
@iftex
Goran Kezunovi@'{c},
@end iftex
@ifinfo
Goran Kezunovic,
@end ifinfo
Robert Kleine,
Fila Kolodny,
Alexander Kourakos,
Martin Kraemer,
@tex
$\Sigma\acute{\iota}\mu o\varsigma\;
\Xi\varepsilon\nu\iota\tau\acute{\epsilon}\lambda\lambda\eta\varsigma$
(Simos KSenitellis),
@end tex
@ifinfo
Simos KSenitellis,
@end ifinfo
Hrvoje Lacko,
Daniel S. Lewart,
Dave Love,
Alexander V. Lukyanov,
Jordan Mendelson,
Lin Zhe Min,
Simon Munton,
Charlie Negyesi,
R. K. Owen,
Andrew Pollock,
Steve Pothier,
@iftex
Jan P@v{r}ikryl,
@end iftex
@ifinfo
Jan Prikryl,
@end ifinfo
Marin Purgar,
Keith Refson,
Tyler Riddle,
Tobias Ringstrom,
@c Texinfo doesn't grok @'{@i}, so we have to use TeX itself.
@tex
Juan Jos\'{e} Rodr\'{\i}gues,
@end tex
@ifinfo
Juan Jose Rodrigues,
@end ifinfo
Edward J. Sabol,
Heinz Salzmann,
Robert Schmidt,
Andreas Schwab,
Toomas Soome,
Tage Stabell-Kulo,
Sven Sternberger,
Markus Strasser,
Szakacsits Szabolcs,
Mike Thomas,
Russell Vincent,
Charles G Waldman,
Douglas E. Wegscheid,
Jasmin Zainul,
@iftex
Bojan @v{Z}drnja,
@end iftex
@ifinfo
Bojan Zdrnja,
@end ifinfo
Kristijan Zimmer.

Apologies to all who I accidentally left out, and many thanks to all the
subscribers of the Wget mailing list.

@node Copying, Concept Index, Appendices, Top
@chapter Copying
@cindex copying
@cindex GPL
@cindex GFDL

Wget is @dfn{free software}, where ``free'' refers to liberty, not
price.  As the GNU people like to say, think of ``free speech'' rather
than ``free beer''.  The exact legal distribution terms follow below,
but in short, you have the right (freedom) to run and change Wget and
distribute it to other people, and even---if you want---charge money for
any of these things.  The sole restriction is that you have to grant
your recipients the same rights.

This method of licensing software is also known as @dfn{open-source},
because it requires that the recipients always receive a program's
source code along with the program.

More specifically:

@quotation
This program is free software; you can redistribute it and/or modify it
under the terms of the GNU General Public License as published by the
Free Software Foundation; either version 2 of the License, or (at your
option) any later version.

This program is distributed in the hope that it will be useful, but
WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the GNU
General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
@end quotation

In addition to this, this manual is free in the same sense:

@quotation
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1 or
any later version published by the Free Software Foundation; with the
Invariant Sections being ``GNU General Public License'' and ``GNU Free
Documentation License'', with no Front-Cover Texts, and with no
Back-Cover Texts.  A copy of the license is included in the section
entitled ``GNU Free Documentation License''.
@end quotation

@c #### Maybe we should wrap these licenses in ifinfo?  Stallman says
@c that the GFDL needs to be present in the manual, and to me it would
@c suck to include the license for the manual and not the license for
@c the program.

The full texts of the GNU General Public License and of the GNU Free
Documentation License are available below.

@menu
* GNU General Public License::
* GNU Free Documentation License::
@end menu

@node GNU General Public License, GNU Free Documentation License, Copying, Copying
@section GNU General Public License
@center Version 2, June 1991

@display
Copyright @copyright{} 1989, 1991 Free Software Foundation, Inc.
675 Mass Ave, Cambridge, MA 02139, USA

Everyone is permitted to copy and distribute verbatim copies
of this license document, but changing it is not allowed.
@end display

@unnumberedsec Preamble

The licenses for most software are designed to take away your
freedom to share and change it.  By contrast, the GNU General Public
License is intended to guarantee your freedom to share and change free
software---to make sure the software is free for all its users.  This
General Public License applies to most of the Free Software
Foundation's software and to any other program whose authors commit to
using it.  (Some other Free Software Foundation software is covered by
the GNU Library General Public License instead.)  You can apply it to
your programs, too.

When we speak of free software, we are referring to freedom, not
price.  Our General Public Licenses are designed to make sure that you
have the freedom to distribute copies of free software (and charge for
this service if you wish), that you receive source code or can get it
if you want it, that you can change the software or use pieces of it
in new free programs; and that you know you can do these things.

To protect your rights, we need to make restrictions that forbid
anyone to deny you these rights or to ask you to surrender the rights.
These restrictions translate to certain responsibilities for you if you
distribute copies of the software, or if you modify it.

For example, if you distribute copies of such a program, whether
gratis or for a fee, you must give the recipients all the rights that
you have.  You must make sure that they, too, receive or can get the
source code.  And you must show them these terms so they know their
rights.

We protect your rights with two steps: (1) copyright the software, and
(2) offer you this license which gives you legal permission to copy,
distribute and/or modify the software.

Also, for each author's protection and ours, we want to make certain
that everyone understands that there is no warranty for this free
software.  If the software is modified by someone else and passed on, we
want its recipients to know that what they have is not the original, so
that any problems introduced by others will not reflect on the original
authors' reputations.

Finally, any free program is threatened constantly by software
patents.  We wish to avoid the danger that redistributors of a free
program will individually obtain patent licenses, in effect making the
program proprietary.  To prevent this, we have made it clear that any
patent must be licensed for everyone's free use or not licensed at all.

The precise terms and conditions for copying, distribution and
modification follow.

@iftex
@unnumberedsec TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
@end iftex
@ifinfo
@center TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION
@end ifinfo

@enumerate
@item
This License applies to any program or other work which contains
a notice placed by the copyright holder saying it may be distributed
under the terms of this General Public License.  The ``Program'', below,
refers to any such program or work, and a ``work based on the Program''
means either the Program or any derivative work under copyright law:
that is to say, a work containing the Program or a portion of it,
either verbatim or with modifications and/or translated into another
language.  (Hereinafter, translation is included without limitation in
the term ``modification''.)  Each licensee is addressed as ``you''.

Activities other than copying, distribution and modification are not
covered by this License; they are outside its scope.  The act of
running the Program is not restricted, and the output from the Program
is covered only if its contents constitute a work based on the
Program (independent of having been made by running the Program).
Whether that is true depends on what the Program does.

@item
You may copy and distribute verbatim copies of the Program's
source code as you receive it, in any medium, provided that you
conspicuously and appropriately publish on each copy an appropriate
copyright notice and disclaimer of warranty; keep intact all the
notices that refer to this License and to the absence of any warranty;
and give any other recipients of the Program a copy of this License
along with the Program.

You may charge a fee for the physical act of transferring a copy, and
you may at your option offer warranty protection in exchange for a fee.

@item
You may modify your copy or copies of the Program or any portion
of it, thus forming a work based on the Program, and copy and
distribute such modifications or work under the terms of Section 1
above, provided that you also meet all of these conditions:

@enumerate a
@item
You must cause the modified files to carry prominent notices
stating that you changed the files and the date of any change.

@item
You must cause any work that you distribute or publish, that in
whole or in part contains or is derived from the Program or any
part thereof, to be licensed as a whole at no charge to all third
parties under the terms of this License.

@item
If the modified program normally reads commands interactively
when run, you must cause it, when started running for such
interactive use in the most ordinary way, to print or display an
announcement including an appropriate copyright notice and a
notice that there is no warranty (or else, saying that you provide
a warranty) and that users may redistribute the program under
these conditions, and telling the user how to view a copy of this
License.  (Exception: if the Program itself is interactive but
does not normally print such an announcement, your work based on
the Program is not required to print an announcement.)
@end enumerate

These requirements apply to the modified work as a whole.  If
identifiable sections of that work are not derived from the Program,
and can be reasonably considered independent and separate works in
themselves, then this License, and its terms, do not apply to those
sections when you distribute them as separate works.  But when you
distribute the same sections as part of a whole which is a work based
on the Program, the distribution of the whole must be on the terms of
this License, whose permissions for other licensees extend to the
entire whole, and thus to each and every part regardless of who wrote it.

Thus, it is not the intent of this section to claim rights or contest
your rights to work written entirely by you; rather, the intent is to
exercise the right to control the distribution of derivative or
collective works based on the Program.

In addition, mere aggregation of another work not based on the Program
with the Program (or with a work based on the Program) on a volume of
a storage or distribution medium does not bring the other work under
the scope of this License.

@item
You may copy and distribute the Program (or a work based on it,
under Section 2) in object code or executable form under the terms of
Sections 1 and 2 above provided that you also do one of the following:

@enumerate a
@item
Accompany it with the complete corresponding machine-readable
source code, which must be distributed under the terms of Sections
1 and 2 above on a medium customarily used for software interchange; or,

@item
Accompany it with a written offer, valid for at least three
years, to give any third party, for a charge no more than your
cost of physically performing source distribution, a complete
machine-readable copy of the corresponding source code, to be
distributed under the terms of Sections 1 and 2 above on a medium
customarily used for software interchange; or,

@item
Accompany it with the information you received as to the offer
to distribute corresponding source code.  (This alternative is
allowed only for noncommercial distribution and only if you
received the program in object code or executable form with such
an offer, in accord with Subsection b above.)
@end enumerate

The source code for a work means the preferred form of the work for
making modifications to it.  For an executable work, complete source
code means all the source code for all modules it contains, plus any
associated interface definition files, plus the scripts used to
control compilation and installation of the executable.  However, as a
special exception, the source code distributed need not include
anything that is normally distributed (in either source or binary
form) with the major components (compiler, kernel, and so on) of the
operating system on which the executable runs, unless that component
itself accompanies the executable.

If distribution of executable or object code is made by offering
access to copy from a designated place, then offering equivalent
access to copy the source code from the same place counts as
distribution of the source code, even though third parties are not
compelled to copy the source along with the object code.

@item
You may not copy, modify, sublicense, or distribute the Program
except as expressly provided under this License.  Any attempt
otherwise to copy, modify, sublicense or distribute the Program is
void, and will automatically terminate your rights under this License.
However, parties who have received copies, or rights, from you under
this License will not have their licenses terminated so long as such
parties remain in full compliance.

@item
You are not required to accept this License, since you have not
signed it.  However, nothing else grants you permission to modify or
distribute the Program or its derivative works.  These actions are
prohibited by law if you do not accept this License.  Therefore, by
modifying or distributing the Program (or any work based on the
Program), you indicate your acceptance of this License to do so, and
all its terms and conditions for copying, distributing or modifying
the Program or works based on it.

@item
Each time you redistribute the Program (or any work based on the
Program), the recipient automatically receives a license from the
original licensor to copy, distribute or modify the Program subject to
these terms and conditions.  You may not impose any further
restrictions on the recipients' exercise of the rights granted herein.
You are not responsible for enforcing compliance by third parties to
this License.

@item
If, as a consequence of a court judgment or allegation of patent
infringement or for any other reason (not limited to patent issues),
conditions are imposed on you (whether by court order, agreement or
otherwise) that contradict the conditions of this License, they do not
excuse you from the conditions of this License.  If you cannot
distribute so as to satisfy simultaneously your obligations under this
License and any other pertinent obligations, then as a consequence you
may not distribute the Program at all.  For example, if a patent
license would not permit royalty-free redistribution of the Program by
all those who receive copies directly or indirectly through you, then
the only way you could satisfy both it and this License would be to
refrain entirely from distribution of the Program.

If any portion of this section is held invalid or unenforceable under
any particular circumstance, the balance of the section is intended to
apply and the section as a whole is intended to apply in other
circumstances.

It is not the purpose of this section to induce you to infringe any
patents or other property right claims or to contest validity of any
such claims; this section has the sole purpose of protecting the
integrity of the free software distribution system, which is
implemented by public license practices.  Many people have made
generous contributions to the wide range of software distributed
through that system in reliance on consistent application of that
system; it is up to the author/donor to decide if he or she is willing
to distribute software through any other system and a licensee cannot
impose that choice.

This section is intended to make thoroughly clear what is believed to
be a consequence of the rest of this License.

@item
If the distribution and/or use of the Program is restricted in
certain countries either by patents or by copyrighted interfaces, the
original copyright holder who places the Program under this License
may add an explicit geographical distribution limitation excluding
those countries, so that distribution is permitted only in or among
countries not thus excluded.  In such case, this License incorporates
the limitation as if written in the body of this License.

@item
The Free Software Foundation may publish revised and/or new versions
of the General Public License from time to time.  Such new versions will
be similar in spirit to the present version, but may differ in detail to
address new problems or concerns.

Each version is given a distinguishing version number.  If the Program
specifies a version number of this License which applies to it and ``any
later version'', you have the option of following the terms and conditions
either of that version or of any later version published by the Free
Software Foundation.  If the Program does not specify a version number of
this License, you may choose any version ever published by the Free Software
Foundation.

@item
If you wish to incorporate parts of the Program into other free
programs whose distribution conditions are different, write to the author
to ask for permission.  For software which is copyrighted by the Free
Software Foundation, write to the Free Software Foundation; we sometimes
make exceptions for this.  Our decision will be guided by the two goals
of preserving the free status of all derivatives of our free software and
of promoting the sharing and reuse of software generally.

@iftex
@heading NO WARRANTY
@end iftex
@ifinfo
@center NO WARRANTY
@end ifinfo
@cindex no warranty

@item
BECAUSE THE PROGRAM IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY
FOR THE PROGRAM, TO THE EXTENT PERMITTED BY APPLICABLE LAW.  EXCEPT WHEN
OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES
PROVIDE THE PROGRAM ``AS IS'' WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED
OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF
MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE.  THE ENTIRE RISK AS
TO THE QUALITY AND PERFORMANCE OF THE PROGRAM IS WITH YOU.  SHOULD THE
PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING,
REPAIR OR CORRECTION.

@item
IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR
REDISTRIBUTE THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES,
INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING
OUT OF THE USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED
TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY
YOU OR THIRD PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER
PROGRAMS), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE
POSSIBILITY OF SUCH DAMAGES.
@end enumerate

@iftex
@heading END OF TERMS AND CONDITIONS
@end iftex
@ifinfo
@center END OF TERMS AND CONDITIONS
@end ifinfo

@page
@unnumberedsec How to Apply These Terms to Your New Programs

If you develop a new program, and you want it to be of the greatest
possible use to the public, the best way to achieve this is to make it
free software which everyone can redistribute and change under these terms.

To do so, attach the following notices to the program.  It is safest
to attach them to the start of each source file to most effectively
convey the exclusion of warranty; and each file should have at least
the ``copyright'' line and a pointer to where the full notice is found.

@smallexample
@var{one line to give the program's name and an idea of what it does.}
Copyright (C) 19@var{yy} @var{name of author}

This program is free software; you can redistribute it and/or
modify it under the terms of the GNU General Public License
as published by the Free Software Foundation; either version 2
of the License, or (at your option) any later version.

This program is distributed in the hope that it will be useful,
but WITHOUT ANY WARRANTY; without even the implied warranty of
MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
GNU General Public License for more details.

You should have received a copy of the GNU General Public License
along with this program; if not, write to the Free Software
Foundation, Inc., 675 Mass Ave, Cambridge, MA 02139, USA.
@end smallexample

Also add information on how to contact you by electronic and paper mail.

If the program is interactive, make it output a short notice like this
when it starts in an interactive mode:

@smallexample
Gnomovision version 69, Copyright (C) 19@var{yy} @var{name of author}
Gnomovision comes with ABSOLUTELY NO WARRANTY; for details
type `show w'.  This is free software, and you are welcome
to redistribute it under certain conditions; type `show c'
for details.
@end smallexample

The hypothetical commands @samp{show w} and @samp{show c} should show
the appropriate parts of the General Public License.  Of course, the
commands you use may be called something other than @samp{show w} and
@samp{show c}; they could even be mouse-clicks or menu items---whatever
suits your program.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a ``copyright disclaimer'' for the program, if
necessary.  Here is a sample; alter the names:

@smallexample
@group
Yoyodyne, Inc., hereby disclaims all copyright
interest in the program `Gnomovision'
(which makes passes at compilers) written
by James Hacker.

@var{signature of Ty Coon}, 1 April 1989
Ty Coon, President of Vice
@end group
@end smallexample

This General Public License does not permit incorporating your program into
proprietary programs.  If your program is a subroutine library, you may
consider it more useful to permit linking proprietary applications with the
library.  If this is what you want to do, use the GNU Library General
Public License instead of this License.
@node GNU Free Documentation License, , GNU General Public License, Copying
|
|
|
|
@section GNU Free Documentation License
|
|
|
|
@center Version 1.1, March 2000
|
|
|
|
|
|
|
|
@display
|
|
|
|
Copyright (C) 2000 Free Software Foundation, Inc.
|
|
|
|
59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
|
|
|
|
|
|
|
|
Everyone is permitted to copy and distribute verbatim copies
|
|
|
|
of this license document, but changing it is not allowed.
|
|
|
|
@end display
|
|
|
|
@sp 1
|
|
|
|
@enumerate 0
|
|
|
|
@item
|
|
|
|
PREAMBLE
|
|
|
|
|
|
|
|
The purpose of this License is to make a manual, textbook, or other
|
|
|
|
written document ``free'' in the sense of freedom: to assure everyone
|
|
|
|
the effective freedom to copy and redistribute it, with or without
|
|
|
|
modifying it, either commercially or noncommercially. Secondarily,
|
|
|
|
this License preserves for the author and publisher a way to get
|
|
|
|
credit for their work, while not being considered responsible for
|
|
|
|
modifications made by others.
|
|
|
|
|
|
|
|
This License is a kind of ``copyleft'', which means that derivative
|
|
|
|
works of the document must themselves be free in the same sense. It
|
|
|
|
complements the GNU General Public License, which is a copyleft
|
|
|
|
license designed for free software.
|
|
|
|
|
|
|
|
We have designed this License in order to use it for manuals for free
|
|
|
|
software, because free software needs free documentation: a free
|
|
|
|
program should come with manuals providing the same freedoms that the
|
|
|
|
software does. But this License is not limited to software manuals;
|
|
|
|
it can be used for any textual work, regardless of subject matter or
|
|
|
|
whether it is published as a printed book. We recommend this License
|
|
|
|
principally for works whose purpose is instruction or reference.
|
|
|
|
|
|
|
|

@sp 1
@item
APPLICABILITY AND DEFINITIONS

This License applies to any manual or other work that contains a
notice placed by the copyright holder saying it can be distributed
under the terms of this License.  The ``Document'', below, refers to any
such manual or work.  Any member of the public is a licensee, and is
addressed as ``you''.

A ``Modified Version'' of the Document means any work containing the
Document or a portion of it, either copied verbatim, or with
modifications and/or translated into another language.

A ``Secondary Section'' is a named appendix or a front-matter section of
the Document that deals exclusively with the relationship of the
publishers or authors of the Document to the Document's overall subject
(or to related matters) and contains nothing that could fall directly
within that overall subject.  (For example, if the Document is in part a
textbook of mathematics, a Secondary Section may not explain any
mathematics.)  The relationship could be a matter of historical
connection with the subject or with related matters, or of legal,
commercial, philosophical, ethical or political position regarding
them.

The ``Invariant Sections'' are certain Secondary Sections whose titles
are designated, as being those of Invariant Sections, in the notice
that says that the Document is released under this License.

The ``Cover Texts'' are certain short passages of text that are listed,
as Front-Cover Texts or Back-Cover Texts, in the notice that says that
the Document is released under this License.

A ``Transparent'' copy of the Document means a machine-readable copy,
represented in a format whose specification is available to the
general public, whose contents can be viewed and edited directly and
straightforwardly with generic text editors or (for images composed of
pixels) generic paint programs or (for drawings) some widely available
drawing editor, and that is suitable for input to text formatters or
for automatic translation to a variety of formats suitable for input
to text formatters.  A copy made in an otherwise Transparent file
format whose markup has been designed to thwart or discourage
subsequent modification by readers is not Transparent.  A copy that is
not ``Transparent'' is called ``Opaque''.

Examples of suitable formats for Transparent copies include plain
ASCII without markup, Texinfo input format, LaTeX input format, SGML
or XML using a publicly available DTD, and standard-conforming simple
HTML designed for human modification.  Opaque formats include
PostScript, PDF, proprietary formats that can be read and edited only
by proprietary word processors, SGML or XML for which the DTD and/or
processing tools are not generally available, and the
machine-generated HTML produced by some word processors for output
purposes only.

The ``Title Page'' means, for a printed book, the title page itself,
plus such following pages as are needed to hold, legibly, the material
this License requires to appear in the title page.  For works in
formats which do not have any title page as such, ``Title Page'' means
the text near the most prominent appearance of the work's title,
preceding the beginning of the body of the text.
@sp 1
@item
VERBATIM COPYING

You may copy and distribute the Document in any medium, either
commercially or noncommercially, provided that this License, the
copyright notices, and the license notice saying this License applies
to the Document are reproduced in all copies, and that you add no other
conditions whatsoever to those of this License.  You may not use
technical measures to obstruct or control the reading or further
copying of the copies you make or distribute.  However, you may accept
compensation in exchange for copies.  If you distribute a large enough
number of copies you must also follow the conditions in section 3.

You may also lend copies, under the same conditions stated above, and
you may publicly display copies.
@sp 1
@item
COPYING IN QUANTITY

If you publish printed copies of the Document numbering more than 100,
and the Document's license notice requires Cover Texts, you must enclose
the copies in covers that carry, clearly and legibly, all these Cover
Texts: Front-Cover Texts on the front cover, and Back-Cover Texts on
the back cover.  Both covers must also clearly and legibly identify
you as the publisher of these copies.  The front cover must present
the full title with all words of the title equally prominent and
visible.  You may add other material on the covers in addition.
Copying with changes limited to the covers, as long as they preserve
the title of the Document and satisfy these conditions, can be treated
as verbatim copying in other respects.

If the required texts for either cover are too voluminous to fit
legibly, you should put the first ones listed (as many as fit
reasonably) on the actual cover, and continue the rest onto adjacent
pages.

If you publish or distribute Opaque copies of the Document numbering
more than 100, you must either include a machine-readable Transparent
copy along with each Opaque copy, or state in or with each Opaque copy
a publicly-accessible computer-network location containing a complete
Transparent copy of the Document, free of added material, which the
general network-using public has access to download anonymously at no
charge using public-standard network protocols.  If you use the latter
option, you must take reasonably prudent steps, when you begin
distribution of Opaque copies in quantity, to ensure that this
Transparent copy will remain thus accessible at the stated location
until at least one year after the last time you distribute an Opaque
copy (directly or through your agents or retailers) of that edition to
the public.

It is requested, but not required, that you contact the authors of the
Document well before redistributing any large number of copies, to give
them a chance to provide you with an updated version of the Document.
@sp 1
@item
MODIFICATIONS

You may copy and distribute a Modified Version of the Document under
the conditions of sections 2 and 3 above, provided that you release
the Modified Version under precisely this License, with the Modified
Version filling the role of the Document, thus licensing distribution
and modification of the Modified Version to whoever possesses a copy
of it.  In addition, you must do these things in the Modified Version:

A. Use in the Title Page (and on the covers, if any) a title distinct
from that of the Document, and from those of previous versions
(which should, if there were any, be listed in the History section
of the Document).  You may use the same title as a previous version
if the original publisher of that version gives permission.@*
B. List on the Title Page, as authors, one or more persons or entities
responsible for authorship of the modifications in the Modified
Version, together with at least five of the principal authors of the
Document (all of its principal authors, if it has less than five).@*
C. State on the Title page the name of the publisher of the
Modified Version, as the publisher.@*
D. Preserve all the copyright notices of the Document.@*
E. Add an appropriate copyright notice for your modifications
adjacent to the other copyright notices.@*
F. Include, immediately after the copyright notices, a license notice
giving the public permission to use the Modified Version under the
terms of this License, in the form shown in the Addendum below.@*
G. Preserve in that license notice the full lists of Invariant Sections
and required Cover Texts given in the Document's license notice.@*
H. Include an unaltered copy of this License.@*
I. Preserve the section entitled ``History'', and its title, and add to
it an item stating at least the title, year, new authors, and
publisher of the Modified Version as given on the Title Page.  If
there is no section entitled ``History'' in the Document, create one
stating the title, year, authors, and publisher of the Document as
given on its Title Page, then add an item describing the Modified
Version as stated in the previous sentence.@*
J. Preserve the network location, if any, given in the Document for
public access to a Transparent copy of the Document, and likewise
the network locations given in the Document for previous versions
it was based on.  These may be placed in the ``History'' section.
You may omit a network location for a work that was published at
least four years before the Document itself, or if the original
publisher of the version it refers to gives permission.@*
K. In any section entitled ``Acknowledgements'' or ``Dedications'',
preserve the section's title, and preserve in the section all the
substance and tone of each of the contributor acknowledgements
and/or dedications given therein.@*
L. Preserve all the Invariant Sections of the Document,
unaltered in their text and in their titles.  Section numbers
or the equivalent are not considered part of the section titles.@*
M. Delete any section entitled ``Endorsements''.  Such a section
may not be included in the Modified Version.@*
N. Do not retitle any existing section as ``Endorsements''
or to conflict in title with any Invariant Section.@*
@sp 1
If the Modified Version includes new front-matter sections or
appendices that qualify as Secondary Sections and contain no material
copied from the Document, you may at your option designate some or all
of these sections as invariant.  To do this, add their titles to the
list of Invariant Sections in the Modified Version's license notice.
These titles must be distinct from any other section titles.

You may add a section entitled ``Endorsements'', provided it contains
nothing but endorsements of your Modified Version by various
parties--for example, statements of peer review or that the text has
been approved by an organization as the authoritative definition of a
standard.

You may add a passage of up to five words as a Front-Cover Text, and a
passage of up to 25 words as a Back-Cover Text, to the end of the list
of Cover Texts in the Modified Version.  Only one passage of
Front-Cover Text and one of Back-Cover Text may be added by (or
through arrangements made by) any one entity.  If the Document already
includes a cover text for the same cover, previously added by you or
by arrangement made by the same entity you are acting on behalf of,
you may not add another; but you may replace the old one, on explicit
permission from the previous publisher that added the old one.

The author(s) and publisher(s) of the Document do not by this License
give permission to use their names for publicity for or to assert or
imply endorsement of any Modified Version.
@sp 1
@item
COMBINING DOCUMENTS

You may combine the Document with other documents released under this
License, under the terms defined in section 4 above for modified
versions, provided that you include in the combination all of the
Invariant Sections of all of the original documents, unmodified, and
list them all as Invariant Sections of your combined work in its
license notice.

The combined work need only contain one copy of this License, and
multiple identical Invariant Sections may be replaced with a single
copy.  If there are multiple Invariant Sections with the same name but
different contents, make the title of each such section unique by
adding at the end of it, in parentheses, the name of the original
author or publisher of that section if known, or else a unique number.
Make the same adjustment to the section titles in the list of
Invariant Sections in the license notice of the combined work.

In the combination, you must combine any sections entitled ``History''
in the various original documents, forming one section entitled
``History''; likewise combine any sections entitled ``Acknowledgements'',
and any sections entitled ``Dedications''.  You must delete all sections
entitled ``Endorsements.''
@sp 1
@item
COLLECTIONS OF DOCUMENTS

You may make a collection consisting of the Document and other documents
released under this License, and replace the individual copies of this
License in the various documents with a single copy that is included in
the collection, provided that you follow the rules of this License for
verbatim copying of each of the documents in all other respects.

You may extract a single document from such a collection, and distribute
it individually under this License, provided you insert a copy of this
License into the extracted document, and follow this License in all
other respects regarding verbatim copying of that document.
@sp 1
@item
AGGREGATION WITH INDEPENDENT WORKS

A compilation of the Document or its derivatives with other separate
and independent documents or works, in or on a volume of a storage or
distribution medium, does not as a whole count as a Modified Version
of the Document, provided no compilation copyright is claimed for the
compilation.  Such a compilation is called an ``aggregate'', and this
License does not apply to the other self-contained works thus compiled
with the Document, on account of their being thus compiled, if they
are not themselves derivative works of the Document.

If the Cover Text requirement of section 3 is applicable to these
copies of the Document, then if the Document is less than one quarter
of the entire aggregate, the Document's Cover Texts may be placed on
covers that surround only the Document within the aggregate.
Otherwise they must appear on covers around the whole aggregate.
@sp 1
@item
TRANSLATION

Translation is considered a kind of modification, so you may
distribute translations of the Document under the terms of section 4.
Replacing Invariant Sections with translations requires special
permission from their copyright holders, but you may include
translations of some or all Invariant Sections in addition to the
original versions of these Invariant Sections.  You may include a
translation of this License provided that you also include the
original English version of this License.  In case of a disagreement
between the translation and the original English version of this
License, the original English version will prevail.
@sp 1
@item
TERMINATION

You may not copy, modify, sublicense, or distribute the Document except
as expressly provided for under this License.  Any other attempt to
copy, modify, sublicense or distribute the Document is void, and will
automatically terminate your rights under this License.  However,
parties who have received copies, or rights, from you under this
License will not have their licenses terminated so long as such
parties remain in full compliance.
@sp 1
@item
FUTURE REVISIONS OF THIS LICENSE

The Free Software Foundation may publish new, revised versions
of the GNU Free Documentation License from time to time.  Such new
versions will be similar in spirit to the present version, but may
differ in detail to address new problems or concerns.  See
http://www.gnu.org/copyleft/.

Each version of the License is given a distinguishing version number.
If the Document specifies that a particular numbered version of this
License ``or any later version'' applies to it, you have the option of
following the terms and conditions either of that specified version or
of any later version that has been published (not as a draft) by the
Free Software Foundation.  If the Document does not specify a version
number of this License, you may choose any version ever published (not
as a draft) by the Free Software Foundation.

@end enumerate

@unnumberedsec ADDENDUM: How to use this License for your documents

To use this License in a document you have written, include a copy of
the License in the document and put the following copyright and
license notices just after the title page:

@smallexample
@group

Copyright (C) @var{year} @var{your name}.
Permission is granted to copy, distribute and/or modify this document
under the terms of the GNU Free Documentation License, Version 1.1
or any later version published by the Free Software Foundation;
with the Invariant Sections being @var{list their titles}, with the
Front-Cover Texts being @var{list}, and with the Back-Cover Texts being @var{list}.
A copy of the license is included in the section entitled ``GNU
Free Documentation License''.
@end group
@end smallexample
If you have no Invariant Sections, write ``with no Invariant Sections''
instead of saying which ones are invariant.  If you have no
Front-Cover Texts, write ``no Front-Cover Texts'' instead of
``Front-Cover Texts being @var{list}''; likewise for Back-Cover Texts.

If your document contains nontrivial examples of program code, we
recommend releasing these examples in parallel under your choice of
free software license, such as the GNU General Public License,
to permit their use in free software.
@node Concept Index, , Copying, Top
@unnumbered Concept Index
@printindex cp

@contents

@bye