
cmdline-docs: more conversion

Daniel Stenberg 2016-11-16 14:20:36 +01:00
parent c3c1e96185
commit 342aa4797e
36 changed files with 477 additions and 8 deletions

View File

@ -0,0 +1,18 @@
Long: connect-to
Arg: <HOST1:PORT1:HOST2:PORT2>
Help: Connect to host
Added: 7.49.0
See-also: resolve header
---
For a request to the given HOST1:PORT1 pair, connect to HOST2:PORT2 instead.
This option is suitable to direct requests at a specific server, e.g. at a
specific cluster node in a cluster of servers. This option is only used to
establish the network connection. It does NOT affect the hostname/port that is
used for TLS/SSL (e.g. SNI, certificate verification) or for the application
protocols. "HOST1" and "PORT1" may be the empty string, meaning "any
host/port". "HOST2" and "PORT2" may also be the empty string, meaning "use the
request's original host/port".
This option can be used many times to add many connect rules.
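As an illustration, a request for example.com port 443 could be directed at a
specific backend node (the backend host name here is only a placeholder):
curl --connect-to example.com:443:backend1.example.com:443 https://example.com/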

View File

@ -0,0 +1,32 @@
Long: ftp-port
Arg: <address>
Help: Use PORT instead of PASV
Short: P
Protocols: FTP
See-also: ftp-pasv disable-eprt
---
Reverses the default initiator/listener roles when connecting with FTP. This
option makes curl use active mode. curl then tells the server to connect back
to the client's specified address and port, while passive mode asks the server
to set up an IP address and port for it to connect to. <address> should be one
of:
.RS
.IP interface
i.e "eth0" to specify which interface's IP address you want to use (Unix only)
.IP "IP address"
i.e "192.168.10.1" to specify the exact IP address
.IP "host name"
i.e "my.host.domain" to specify the machine
.IP "-"
make curl pick the same IP address that is already used for the control
connection
.RE
If this option is used several times, the last one will be used. Disable the
use of PORT with --ftp-pasv. Disable the attempt to use the EPRT command
instead of PORT by using --disable-eprt. EPRT is really PORT++.
Since 7.19.5, you can append \&":[start]-[end]\&" to the right of the address,
to tell curl what TCP port range to use. That means you specify a port range,
from a lower to a higher number. A single number works as well, but do note
that it increases the risk of failure since the port may not be available.
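For instance, letting the data connection reuse the control connection's IP
address (the FTP host and file name are only placeholders):
curl -P - ftp://ftp.example.com/file.txt -O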

View File

@ -1,4 +1,3 @@
-Short:
 Long: http1.1
 Tags: Versions
 Protocols: HTTP

View File

@ -1,4 +1,3 @@
-Short:
 Long: http2-prior-knowledge
 Tags: Versions
 Protocols: HTTP

View File

@ -1,4 +1,3 @@
-Short:
 Long: http2
 Tags: Versions
 Protocols: HTTP

View File

@ -0,0 +1,10 @@
Long: mail-auth
Arg: <address>
Protocols: SMTP
Help: Originator address of the original email
Added: 7.25.0
See-also: mail-rcpt mail-from
---
Specify a single address. This will be used to specify the authentication
address (identity) of a submitted message that is being relayed to another
server.

View File

@ -0,0 +1,8 @@
Long: mail-from
Arg: <address>
Help: Mail from this address
Protocols: SMTP
Added: 7.20.0
See-also: mail-rcpt mail-auth
---
Specify a single address that the given mail should get sent from.

View File

@ -0,0 +1,19 @@
Long: mail-rcpt
Arg: <address>
Help: Mail to this address
Protocols: SMTP
Added: 7.20.0
---
Specify a single address, user name or mailing list name. Repeat this
option several times to send to multiple recipients.
When performing a mail transfer, the recipient should be specified as a valid
email address to send the mail to.
When performing an address verification (VRFY command), the recipient should be
specified as the user name or user name and domain (as per Section 3.5 of
RFC5321). (Added in 7.34.0)
When performing a mailing list expand (EXPN command), the recipient should be
specified using the mailing list name, such as "Friends" or "London-Office".
(Added in 7.34.0)
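A minimal SMTP send could look like this, assuming a mail server at
mail.example.com and a message already stored in mail.txt (both placeholders):
curl smtp://mail.example.com --mail-from sender@example.com --mail-rcpt receiver@example.com --upload-file mail.txt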

View File

@ -0,0 +1,12 @@
Long: max-filesize
Arg: <bytes>
Help: Maximum file size to download
See-also: limit-rate
---
Specify the maximum size (in bytes) of a file to download. If the file
requested is larger than this value, the transfer will not start and curl will
return with exit code 63.
\fBNOTE:\fP The file size is not always known prior to download, and for such
files this option has no effect even if the file transfer ends up being larger
than this given limit. This concerns both FTP and HTTP transfers.
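For example, to refuse any download reported as larger than 500000 bytes (the
URL is only a placeholder):
curl --max-filesize 500000 https://example.com/big-file.bin -O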

View File

@ -0,0 +1,11 @@
Long: max-redirs
Arg: <num>
Help: Maximum number of redirects allowed
Protocols: HTTP
---
Set the maximum number of redirections to follow. When --location is used,
this option can be used to prevent curl from following redirections \&"in
absurdum". By default, the limit is set to 50 redirections. Set this option to
-1 to make it unlimited.
If this option is used several times, the last one will be used.
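For example, to follow at most three redirects (the URL is only a
placeholder):
curl --location --max-redirs 3 https://example.com/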

View File

@ -0,0 +1,27 @@
Long: metalink
Help: Process given URLs as metalink XML file
Added: 7.27.0
Requires: metalink
---
This option can tell curl to parse and process a given URI as a Metalink file
(both version 3 and 4 (RFC 5854) are supported) and make use of the mirrors
listed within for failover if there are errors (such as the file or server not
being available). It will also verify the hash of the file after the download
completes. The Metalink file itself is downloaded and processed in memory and
not stored in the local file system.
Example to use a remote Metalink file:
curl --metalink http://www.example.com/example.metalink
To use a Metalink file in the local file system, use FILE protocol (file://):
curl --metalink file://example.metalink
Please note that if FILE protocol is disabled, there is no way to use a local
Metalink file at the time of this writing. Also note that if --metalink and
--include are used together, --include will be ignored. This is because
including headers in the response will break the Metalink parser, and if the
headers are included in the file described in the Metalink file, the hash
check will fail.

View File

@ -0,0 +1,15 @@
Long: negotiate
Help: Use HTTP Negotiate (SPNEGO) authentication
Protocols: HTTP
See-also: basic ntlm anyauth proxy-negotiate
---
Enables Negotiate (SPNEGO) authentication.
This option requires a library built with GSS-API or SSPI support. Use
--version to see if your curl supports GSS-API/SSPI or SPNEGO.
When using this option, you must also provide a fake --user option to activate
the authentication code properly. Sending a '-u :' is enough as the user name
and password from the --user option aren't actually used.
If this option is used several times, only the first one is used.
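A typical invocation, with an empty --user as described above (the host name
is only a placeholder):
curl --negotiate -u : https://sso.example.com/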

View File

@ -0,0 +1,12 @@
Long: netrc-file
Help: Specify FILE for netrc
Arg: <filename>
Added: 7.21.5
Mutexed: netrc
---
This option is similar to --netrc, except that you provide the path (absolute
or relative) to the netrc file that Curl should use. You can only specify one
netrc file per invocation. If several --netrc-file options are provided,
the last one will be used.
It will abide by --netrc-optional if specified.
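A sketch using an alternate netrc path (the path is only a placeholder):
curl --netrc-file $HOME/.netrc-work https://example.com/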

View File

@ -0,0 +1,7 @@
Long: netrc-optional
Help: Use either .netrc or URL
Mutexed: netrc
See-also: netrc-file
---
Very similar to --netrc, but this option makes the .netrc usage \fBoptional\fP
and not mandatory as the --netrc option does.

17
docs/cmdline-opts/netrc.d Normal file
View File

@ -0,0 +1,17 @@
Long: netrc
Short: n
Help: Must read .netrc for user name and password
---
Makes curl scan the \fI.netrc\fP (\fI_netrc\fP on Windows) file in the user's
home directory for login name and password. This is typically used for FTP on
Unix. If used with HTTP, curl will enable user authentication. See
\fInetrc(5)\fP and \fIftp(1)\fP for details on the file format. Curl will not
complain if that file doesn't have the right permissions (it should not be
either world- or group-readable). The environment variable "HOME" is used to
find the home directory.
A quick and very simple example of how to set up a \fI.netrc\fP to allow curl
to FTP to the machine host.domain.com with user name \&'myself' and password
\&'secret' should look similar to:
.B "machine host.domain.com login myself password secret"

View File

@ -1,9 +1,7 @@
-Short:
 Long: no-alpn
-Tags:
+Tags: HTTP/2
 Protocols: HTTPS
 Added: 7.36.0
-Mutexed:
 See-also: no-npn http2
 Requires: TLS
 Help: Disable the ALPN TLS extension

View File

@ -0,0 +1,11 @@
Long: no-buffer
Short: N
Help: Disable buffering of the output stream
---
Disables the buffering of the output stream. In normal work situations, curl
will use a standard buffered output stream that will have the effect that it
will output the data in chunks, not necessarily exactly when the data arrives.
Using this option will disable that buffering.
Note that this is the negated option name documented. You can thus use
--buffer to enforce the buffering.

View File

@ -0,0 +1,8 @@
Long: no-keepalive
Help: Disable TCP keepalive on the connection
---
Disables the use of keepalive messages on the TCP connection. curl otherwise
enables them by default.
Note that this is the negated option name documented. You can thus use
--keepalive to enforce keepalive.

View File

@ -1,6 +1,5 @@
-Short:
 Long: no-npn
-Tags: Versions
+Tags: Versions HTTP/2
 Protocols: HTTPS
 Added: 7.36.0
 Mutexed:

View File

@ -0,0 +1,13 @@
Long: no-sessionid
Help: Disable SSL session-ID reusing
Protocols: TLS
Added: 7.16.0
---
Disable curl's use of SSL session-ID caching. By default all transfers are
done using the cache. Note that while nothing should ever get hurt by
attempting to reuse SSL session-IDs, there seem to be broken SSL
implementations in the wild that may require you to disable this in order for
you to succeed.
Note that this is the negated option name documented. You can thus use
--sessionid to enforce session-ID caching.

View File

@ -0,0 +1,11 @@
Long: noproxy
Arg: <no-proxy-list>
Help: List of hosts which do not use proxy
Added: 7.19.4
---
Comma-separated list of hosts which do not use a proxy, if one is specified.
The only wildcard is a single * character, which matches all hosts, and
effectively disables the proxy. Each name in this list is matched as either
a domain which contains the hostname, or the hostname itself. For example,
local.com would match local.com, local.com:80, and www.local.com, but not
www.notlocal.com.
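For instance, to go directly to www.example.com while everything else passes
through a proxy (the proxy address is only a placeholder):
curl --noproxy www.example.com -x http://proxy.example.com:8080 http://www.example.com/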

View File

@ -0,0 +1,7 @@
Long: ntlm-wb
Help: Use HTTP NTLM authentication with winbind
Protocols: HTTP
See-also: ntlm proxy-ntlm
---
Enables NTLM much in the style --ntlm does, but hands over the authentication
to the separate binary ntlmauth application that is executed when needed.

18
docs/cmdline-opts/ntlm.d Normal file
View File

@ -0,0 +1,18 @@
Long: ntlm
Help: Use HTTP NTLM authentication
Mutexed: basic negotiate digest anyauth
See-also: proxy-ntlm
Protocols: HTTP
Requires: TLS
---
Enables NTLM authentication. The NTLM authentication method was designed by
Microsoft and is used by IIS web servers. It is a proprietary protocol,
reverse-engineered by clever people and implemented in curl based on their
efforts. This kind of behavior should not be endorsed, you should encourage
everyone who uses NTLM to switch to a public and documented authentication
method instead, such as Digest.
If you want to enable NTLM for your proxy authentication, then use
--proxy-ntlm.
If this option is used several times, only the first one is used.
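A basic invocation could look like this (credentials and host name are only
placeholders):
curl --ntlm -u user:password https://intranet.example.com/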

View File

@ -0,0 +1,11 @@
Long: oauth2-bearer
Help: OAuth 2 Bearer Token
Protocols: IMAP POP3 SMTP
---
Specify the Bearer Token for OAUTH 2.0 server authentication. The Bearer Token
is used in conjunction with the user name which can be specified as part of
the --url or --user options.
The Bearer Token and user name are formatted according to RFC 6750.
If this option is used several times, the last one will be used.
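For example, fetching IMAP mail with a bearer token, with the user name given
in the URL (token value and host name are only placeholders):
curl --oauth2-bearer "mF_9.B5f-4.1JqM" imap://user@mail.example.com/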

View File

@ -0,0 +1,32 @@
Long: output
Arg: <file>
Short: o
Help: Write to file instead of stdout
See-also: remote-name remote-name-all remote-header-name
---
Write output to <file> instead of stdout. If you are using {} or [] to fetch
multiple documents, you can use '#' followed by a number in the <file>
specifier. That variable will be replaced with the current string for the URL
being fetched. Like in:
curl http://{one,two}.example.com -o "file_#1.txt"
or use several variables like:
curl http://{site,host}.host[1-5].com -o "#1_#2"
You may use this option as many times as the number of URLs you have. For
example, if you specify two URLs on the same command line, you can use it like
this:
curl -o aa example.com -o bb example.net
and the order of the -o options and the URLs doesn't matter, just that the
first -o is for the first URL and so on, so the above command line can also be
written as
curl example.com example.net -o aa -o bb
See also the --create-dirs option to create the local directories
dynamically. Specifying the output as '-' (a single dash) will force the
output to be done to stdout.

8
docs/cmdline-opts/pass.d Normal file
View File

@ -0,0 +1,8 @@
Long: pass
Arg: <phrase>
Help: Pass phrase for the private key
Protocols: SSH TLS
---
Passphrase for the private key.
If this option is used several times, the last one will be used.

View File

@ -0,0 +1,7 @@
Long: path-as-is
Help: Do not squash .. sequences in URL path
Added: 7.42.0
---
Tell curl to not handle sequences of /../ or /./ in the given URL
path. Normally curl will squash or merge them according to standards but with
this option set you tell it not to do that.
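For example, the following asks for the path exactly as written, without
squashing the dot-dot sequences (the URL is purely illustrative):
curl --path-as-is https://example.com/../../etc/passwd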

View File

@ -0,0 +1,11 @@
Long: post301
Help: Do not switch to GET after following a 301
Protocols: HTTP
See-also: post302 post303 location
Added: 7.17.1
---
Tells curl to respect RFC 7230/6.4.2 and not convert POST requests into GET
requests when following a 301 redirection. The non-RFC behaviour is ubiquitous
in web browsers, so curl does the conversion by default to maintain
consistency. However, a server may require a POST to remain a POST after such
a redirection. This option is meaningful only when using --location.
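For example, to keep a POST a POST across a 301 redirect (URL and data are
only placeholders):
curl --post301 --location --data "name=value" https://example.com/form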

View File

@ -0,0 +1,11 @@
Long: post302
Help: Do not switch to GET after following a 302
Protocols: HTTP
See-also: post301 post303 location
Added: 7.19.1
---
Tells curl to respect RFC 7230/6.4.2 and not convert POST requests into GET
requests when following a 302 redirection. The non-RFC behaviour is ubiquitous
in web browsers, so curl does the conversion by default to maintain
consistency. However, a server may require a POST to remain a POST after such
a redirection. This option is meaningful only when using --location.

View File

@ -0,0 +1,11 @@
Long: post303
Help: Do not switch to GET after following a 303
Protocols: HTTP
See-also: post302 post301 location
Added: 7.26.0
---
Tells curl to respect RFC 7230/6.4.2 and not convert POST requests into GET
requests when following a 303 redirection. The non-RFC behaviour is ubiquitous
in web browsers, so curl does the conversion by default to maintain
consistency. However, a server may require a POST to remain a POST after such
a redirection. This option is meaningful only when using --location.

View File

@ -0,0 +1,18 @@
Long: proto-default
Help: Use PROTOCOL for any URL missing a scheme
Arg: <protocol>
Added: 7.45.0
---
Tells curl to use \fIprotocol\fP for any URL missing a scheme name.
Example:
curl --proto-default https ftp.mozilla.org
An unknown or unsupported protocol causes error
\fICURLE_UNSUPPORTED_PROTOCOL\fP (1).
This option does not change the default proxy protocol (http).
Without this option curl would make a guess based on the host, see --url for
details.

View File

@ -0,0 +1,17 @@
Long: proto-redir
Arg: <protocols>
Help: Enable/disable PROTOCOLS on redirect
Added: 7.20.2
---
Tells curl to limit what protocols it may use on redirect. Protocols denied by
--proto are not overridden by this option. See --proto for how protocols are
represented.
Example, allow only HTTP and HTTPS on redirect:
curl --proto-redir -all,http,https http://example.com
By default curl will allow all protocols on redirect except several disabled
for security reasons: Since 7.19.4 FILE and SCP are disabled, and since 7.40.0
SMB and SMBS are also disabled. Specifying \fIall\fP or \fI+all\fP enables all
protocols on redirect, including those disabled for security.

43
docs/cmdline-opts/proto.d Normal file
View File

@ -0,0 +1,43 @@
Long: proto
Arg: <protocols>
Help: Enable/disable PROTOCOLS
See-also: proto-redir proto-default
Added: 7.20.2
---
Tells curl to limit what protocols it may use in the transfer. Protocols are
evaluated left to right, are comma separated, and are each a protocol name or
'all', optionally prefixed by zero or more modifiers. Available modifiers are:
.RS
.TP 3
.B +
Permit this protocol in addition to protocols already permitted (this is
the default if no modifier is used).
.TP
.B -
Deny this protocol, removing it from the list of protocols already permitted.
.TP
.B =
Permit only this protocol (ignoring the list already permitted), though
subject to later modification by subsequent entries in the comma separated
list.
.RE
.IP
For example:
.RS
.TP 15
.B --proto -ftps
uses the default protocols, but disables ftps
.TP
.B --proto -all,https,+http
only enables http and https
.TP
.B --proto =http,https
also only enables http and https
.RE
Unknown protocols produce a warning. This allows scripts to safely rely on
being able to disable potentially dangerous protocols, without relying upon
support for that protocol being built into curl to avoid an error.
This option can be used multiple times, in which case the effect is the same
as concatenating the protocols into one instance of the option.

View File

@ -0,0 +1,20 @@
Long: proxy-header
Arg: <header>
Help: Pass custom header LINE to proxy
Protocols: HTTP
Added: 7.37.0
---
Extra header to include in the request when sending HTTP to a proxy. You may
specify any number of extra headers. This is the equivalent option to --header
but is for proxy communication only, such as in CONNECT requests, when you
want a header sent to the proxy that differs from what is sent to the actual
remote host.
curl will make sure that each header you add/replace is sent with the proper
end-of-line marker, you should thus \fBnot\fP add that as a part of the header
content: do not add newlines or carriage returns, they will only mess things
up for you.
Headers specified with this option will not be included in requests that curl
knows will not be sent to a proxy.
This option can be used multiple times to add/replace/remove multiple headers.
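An illustrative invocation (the proxy address and header are only
placeholders):
curl --proxy-header "X-Proxy-Note: hello" -x http://proxy.example.com:3128 https://example.com/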

View File

@ -0,0 +1,9 @@
Long: proxytunnel
Help: Operate through an HTTP proxy tunnel (using CONNECT)
See-also: proxy
---
When an HTTP proxy is used (--proxy), this option will cause non-HTTP protocols
to attempt to tunnel through the proxy instead of merely using it to do
HTTP-like operations. The tunnel approach is made with the HTTP proxy CONNECT
request and requires that the proxy allows direct connect to the remote port
number curl wants to tunnel through to.
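For instance, tunneling an FTP download through an HTTP proxy (host names are
only placeholders):
curl --proxytunnel -x http://proxy.example.com:3128 ftp://ftp.example.com/file.txt -O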

View File

@ -0,0 +1,21 @@
Long: remote-name
Short: O
Help: Write output to a file named as the remote file
---
Write output to a local file named like the remote file we get. (Only the file
part of the remote file is used, the path is cut off.)
The file will be saved in the current working directory. If you want the file
saved in a different directory, make sure you change the current working
directory before invoking curl with this option.
The remote file name to use for saving is extracted from the given URL,
nothing else, and if it already exists it will be overwritten. If you want the
server to be able to choose the file name refer to --remote-header-name which
can be used in addition to this option. If the server chooses a file name and
that name already exists it will not be overwritten.
There is no URL decoding done on the file name. If it has %20 or other URL
encoded parts of the name, they will end up as-is as file name.
You may use this option as many times as the number of URLs you have.
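For example, the following saves the file as logo.png in the current working
directory (the URL is only a placeholder):
curl -O https://example.com/images/logo.png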