
cmdline-opts: first test version of a new man page generator kit

See MANPAGE.md for the description of how this works. Each command line
option is now described in a separate .d file.
Daniel Stenberg 2016-11-13 23:40:12 +01:00
parent ebf985c159
commit 050aa80309
15 changed files with 583 additions and 0 deletions

docs/cmdline-opts/MANPAGE.md Normal file

@@ -0,0 +1,47 @@
# curl man page generator
This is the curl man page generator. It generates a single nroff man page as
output from the set of source files in this directory.
There is one source file for each supported command line option. The format is
described below.
## Option files
Each command line option is described in a file named `<long name>.d`, where
the option name is written without any leading dashes. For example, the file
for the -v, --verbose option is named `verbose.d`.
Each file has a set of meta-data and a body of text.
### Meta-data
    Short: (single letter, without dash)
    Long: (long form name, without dashes)
    Arg: (the argument the option takes)
    Magic: (description of "magic" options)
    Tags: (space separated list)
    Protocols: (space separated list for which protocols this option works)
    Added: (version number in which this was added)
    Mutexed: (space separated list of options this overrides)
    Requires: (space separated list of features this option requires)
    See-also: (space separated list of related options)
    --- (end of meta-data)
### Body
The body of the description. Only refer to options by their long form, like
--verbose. The output generator will replace such references with the correct
markup that shows both the short and long versions.
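For illustration, a complete option file for a made-up `--example` option
(not one of the files in this commit) could look like this:

    Short: E
    Long: example
    Arg: <data>
    Protocols: HTTP
    See-also: verbose
    ---
    Pass <data> along to the server. If this option is used several times,
    the last one will be used.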
## Header
`page-header` is the nroff-formatted file that is emitted before the
generated options.
## Generate
`perl gen.pl`
This command outputs an nroff file meant to become `curl.1`, the full curl
man page.
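For example, run by hand from this directory (a sketch; the build system may
invoke it differently), the output can be redirected into place:

    perl gen.pl > curl.1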

docs/cmdline-opts/cookie-jar.d Normal file

@@ -0,0 +1,23 @@
Short: c
Long: cookie-jar
Arg: <filename>
Protocols: HTTP
---
Specify to which file you want curl to write all cookies after a completed
operation. Curl writes all cookies from its in-memory cookie storage to the
given file at the end of operations. If no cookies are known, no data will be
written. The file will be written using the Netscape cookie file format. If
you set the file name to a single dash, "-", the cookies will be written to
stdout.
This command line option will activate the cookie engine that makes curl
record and use cookies. Another way to activate it is to use the --cookie
option.
If the cookie jar can't be created or written to, the whole curl operation
won't fail or even report an error clearly. Using --verbose will display a
warning, but that is the only visible feedback you get about this possibly
lethal situation.
If this option is used several times, the last specified file name will be
used.
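For illustration (hypothetical URL and file name), saving all cookies from a
transfer could look like:

curl --cookie-jar saved-cookies.txt https://example.com/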

docs/cmdline-opts/cookie.d Normal file

@@ -0,0 +1,35 @@
Short: b
Long: cookie
Arg: <name=data>
Protocols: HTTP
---
Pass the data to the HTTP server in the Cookie header. It is supposedly
the data previously received from the server in a "Set-Cookie:" line. The
data should be in the format "NAME1=VALUE1; NAME2=VALUE2".
If no '=' symbol is used in the argument, it is instead treated as a filename
to read previously stored cookies from. This option also activates the cookie
engine which will make curl record incoming cookies, which may be handy if
you're using this in combination with the --location option or doing multiple
URL transfers in the same invocation.
The file format of the file to read cookies from should be plain HTTP headers
(Set-Cookie style) or the Netscape/Mozilla cookie file format.
The file specified with --cookie is only used as input. No cookies will be
written to the file. To store cookies, use the --cookie-jar option.
Exercise caution if you are using this option and multiple transfers may
occur. If you use the NAME1=VALUE1; format, or in a file use the Set-Cookie
format and don't specify a domain, then the cookie is sent for any domain
(even after redirects are followed) and cannot be modified by a server-set
cookie. If the cookie engine is enabled and a server sets a cookie of the same
name then both will be sent on a future transfer to that server, likely not
what you intended. To address these issues set a domain in Set-Cookie (doing
that will include sub domains) or use the Netscape format.
If this option is used several times, the last one will be used.
Users very often want to both read cookies from a file and write updated
cookies back to a file, so using both --cookie and --cookie-jar in the same
command line is common.
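As an illustration of that read-and-write-back pattern (hypothetical URL and
file name):

curl --cookie cookies.txt --cookie-jar cookies.txt https://example.com/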

docs/cmdline-opts/gen.pl Executable file

@@ -0,0 +1,216 @@
#!/usr/bin/perl

my $some_dir=".";
opendir(my $dh, $some_dir) || die "Can't opendir $some_dir: $!";
my @s = grep { /\.d$/ && -f "$some_dir/$_" } readdir($dh);
closedir $dh;

my %optshort;
my %optlong;

# get the long name version, return the man page string
sub manpageify {
    my ($k)=@_;
    my $l;
    if($optlong{$k} ne "") {
        # both short + long
        $l = "\\fI-".$optlong{$k}.", --$k\\fP";
    }
    else {
        # only long
        $l = "\\fI--$k\\fP";
    }
    return $l;
}

# print a description, turning --long option references into man page markup
sub printdesc {
    my @desc = @_;
    for my $d (@desc) {
        # skip lines starting with space (examples)
        if($d =~ /^[^ ]/) {
            for my $k (keys %optlong) {
                my $l = manpageify($k);
                $d =~ s/--$k(\s)/$l$1/;
            }
        }
        print $d;
    }
}

# output the full man page section for a single option file
sub single {
    my ($f)=@_;
    open(F, "<$f");
    my $short;
    my $long;
    my $tags;
    my $added;
    my $protocols;
    my $arg;
    my $mutexed;
    my $requires;
    my $seealso;
    my $magic; # cmdline special option
    while(<F>) {
        if(/^Short: (.)/i) {
            $short=$1;
        }
        elsif(/^Long: (.*)/i) {
            $long=$1;
        }
        elsif(/^Added: (.*)/i) {
            $added=$1;
        }
        elsif(/^Tags: (.*)/i) {
            $tags=$1;
        }
        elsif(/^Arg: (.*)/i) {
            $arg=$1;
        }
        elsif(/^Magic: (.*)/i) {
            $magic=$1;
        }
        elsif(/^Mutexed: (.*)/i) {
            $mutexed=$1;
        }
        elsif(/^Protocols: (.*)/i) {
            $protocols=$1;
        }
        elsif(/^See-also: (.*)/i) {
            $seealso=$1;
        }
        elsif(/^Requires: (.*)/i) {
            $requires=$1;
        }
        elsif(/^---/) {
            last;
        }
    }
    my @desc;
    while(<F>) {
        push @desc, $_;
    }
    close(F);

    my $opt;
    if(defined($short) && $long) {
        $opt = "-$short, --$long";
    }
    elsif($short && !$long) {
        $opt = "-$short";
    }
    elsif($long && !$short) {
        $opt = "--$long";
    }
    if($arg) {
        $opt .= " $arg";
    }

    print ".IP \"$opt\"\n";
    my $o;
    if($protocols) {
        $o++;
        print "($protocols) ";
    }
    if(!$arg && !$mutexed && !$magic) {
        $o++;
        print "[Boolean] ";
    }
    if($magic) {
        $o++;
        print "[cmdline control] ";
    }
    print "\n" if($o);

    printdesc(@desc);
    undef @desc;

    my @foot;
    if($seealso) {
        my @m=split(/ /, $seealso);
        my $mstr;
        for my $k (@m) {
            my $l = manpageify($k);
            $mstr .= sprintf "%s$l", $mstr?" and ":"";
        }
        push @foot, "See also $mstr. ";
    }
    if($requires) {
        my $l = manpageify($long);
        push @foot, "$l requires that the underlying libcurl".
            " was built to support $requires. ";
    }
    if($mutexed) {
        my @m=split(/ /, $mutexed);
        my $mstr;
        for my $k (@m) {
            my $l = manpageify($k);
            $mstr .= sprintf "%s$l", $mstr?" and ":"";
        }
        push @foot, "This option overrides $mstr. ";
    }
    if($added) {
        push @foot, "Added in $added. ";
    }
    if($foot[0]) {
        print "\n";
        print @foot;
        print "\n";
    }
}

# pick out only the short and long names from an option file
sub getshortlong {
    my ($f)=@_;
    open(F, "<$f");
    my $short;
    my $long;
    while(<F>) {
        if(/^Short: (.)/i) {
            $short=$1;
        }
        elsif(/^Long: (.*)/i) {
            $long=$1;
        }
        elsif(/^---/) {
            last;
        }
    }
    close(F);
    if($short) {
        $optshort{$short}=$long;
    }
    if($long) {
        $optlong{$long}=$short;
    }
}

sub indexoptions {
    foreach my $f (@s) {
        getshortlong($f);
    }
}

# print the nroff page header through the description formatter
sub header {
    open(F, "<page-header");
    my @d;
    while(<F>) {
        push @d, $_;
    }
    close(F);
    printdesc(@d);
}

#------------------------------------------------------------------------
# learn all existing options
indexoptions();

# show the page header
header();

# output docs for all options
foreach my $f (sort @s) {
    single($f);
}
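As a rough sketch of the result, the verbose.d file in this commit would make
the script emit output along these lines (description abbreviated; --trace and
--trace-ascii get the long-only markup because their .d files are not part of
this commit yet):

.IP "-v, --verbose"
Makes curl verbose during the operation. Useful for debugging and seeing
what's going on "under the hood". [...]

This option overrides \fI--trace\fP and \fI--trace-ascii\fP.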

docs/cmdline-opts/http1.0.d Normal file

@@ -0,0 +1,9 @@
Short: 0
Long: http1.0
Tags: Versions
Protocols: HTTP
Added:
Mutexed: http1.1 http2
---
Tells curl to use HTTP version 1.0 instead of using its internally preferred
HTTP version.

docs/cmdline-opts/http1.1.d Normal file

@@ -0,0 +1,8 @@
Short:
Long: http1.1
Tags: Versions
Protocols: HTTP
Added: 7.33.0
Mutexed: http1.0 http2
---
Tells curl to use HTTP version 1.1.

docs/cmdline-opts/http2-prior-knowledge.d Normal file

@@ -0,0 +1,12 @@
Short:
Long: http2-prior-knowledge
Tags: Versions
Protocols: HTTP
Added: 7.49.0
Mutexed: http1.1 http1.0 http2
Requires: HTTP/2
---
Tells curl to issue its non-TLS HTTP requests using HTTP/2 without HTTP/1.1
Upgrade. It requires prior knowledge that the server supports HTTP/2 straight
away. HTTPS requests will still do HTTP/2 the standard way with negotiated
protocol version in the TLS handshake.
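For illustration (hypothetical URL), this is typically aimed at a cleartext
HTTP server already known to speak HTTP/2:

curl --http2-prior-knowledge http://example.com/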

docs/cmdline-opts/http2.d Normal file

@@ -0,0 +1,10 @@
Short:
Long: http2
Tags: Versions
Protocols: HTTP
Added: 7.33.0
Mutexed: http1.1 http1.0 http2-prior-knowledge
Requires: HTTP/2
See-also: no-alpn
---
Tells curl to use HTTP version 2.

docs/cmdline-opts/next.d Normal file

@@ -0,0 +1,19 @@
Short: :
Long: next
Tags:
Protocols:
Added: 7.36.0
Magic: divider
---
Tells curl to use a separate operation for the following URL and associated
options. This allows you to send several URL requests, each with their own
specific options, for example different user names or custom requests for
each.
--next will reset all local options and only global ones will have their
values survive over to the operation following the --next instruction. Global
options include --verbose and --fail-early.
For example, you can do both a GET and a POST in a single command line:
curl www1.example.com --next -d postthis www2.example.com

docs/cmdline-opts/no-alpn.d Normal file

@@ -0,0 +1,12 @@
Short:
Long: no-alpn
Tags:
Protocols: HTTPS
Added: 7.36.0
Mutexed:
See-also: no-npn http2
Requires: TLS
---
Disable the ALPN TLS extension. ALPN is enabled by default if libcurl was built
with an SSL library that supports ALPN. ALPN is used by a libcurl that supports
HTTP/2 to negotiate HTTP/2 support with the server during https sessions.

docs/cmdline-opts/no-npn.d Normal file

@@ -0,0 +1,12 @@
Short:
Long: no-npn
Tags: Versions
Protocols: HTTPS
Added: 7.36.0
Mutexed:
See-also: no-alpn http2
Requires: TLS
---
Disable the NPN TLS extension. NPN is enabled by default if libcurl was built
with an SSL library that supports NPN. NPN is used by a libcurl that supports
HTTP/2 to negotiate HTTP/2 support with the server during https sessions.

docs/cmdline-opts/page-header Normal file

@@ -0,0 +1,138 @@
.\" **************************************************************************
.\" * _ _ ____ _
.\" * Project ___| | | | _ \| |
.\" * / __| | | | |_) | |
.\" * | (__| |_| | _ <| |___
.\" * \___|\___/|_| \_\_____|
.\" *
.\" * Copyright (C) 1998 - 2016, Daniel Stenberg, <daniel@haxx.se>, et al.
.\" *
.\" * This software is licensed as described in the file COPYING, which
.\" * you should have received as part of this distribution. The terms
.\" * are also available at https://curl.haxx.se/docs/copyright.html.
.\" *
.\" * You may opt to use, copy, modify, merge, publish, distribute and/or sell
.\" * copies of the Software, and permit persons to whom the Software is
.\" * furnished to do so, under the terms of the COPYING file.
.\" *
.\" * This software is distributed on an "AS IS" basis, WITHOUT WARRANTY OF ANY
.\" * KIND, either express or implied.
.\" *
.\" **************************************************************************
.\"
.TH curl 1 "30 Nov 2014" "Curl 7.40.0" "Curl Manual"
.SH NAME
curl \- transfer a URL
.SH SYNOPSIS
.B curl [options]
.I [URL...]
.SH DESCRIPTION
.B curl
is a tool to transfer data from or to a server, using one of the supported
protocols (DICT, FILE, FTP, FTPS, GOPHER, HTTP, HTTPS, IMAP, IMAPS, LDAP,
LDAPS, POP3, POP3S, RTMP, RTSP, SCP, SFTP, SMB, SMBS, SMTP, SMTPS, TELNET
and TFTP). The command is designed to work without user interaction.
curl offers a busload of useful tricks like proxy support, user
authentication, FTP upload, HTTP post, SSL connections, cookies, file transfer
resume, Metalink, and more. As you will see below, the number of features will
make your head spin!
curl is powered by libcurl for all transfer-related features. See
\fIlibcurl(3)\fP for details.
.SH URL
The URL syntax is protocol-dependent. You'll find a detailed description in
RFC 3986.
You can specify multiple URLs or parts of URLs by writing part sets within
braces as in:
http://site.{one,two,three}.com
or you can get sequences of alphanumeric series by using [] as in:
ftp://ftp.example.com/file[1-100].txt
ftp://ftp.example.com/file[001-100].txt (with leading zeros)
ftp://ftp.example.com/file[a-z].txt
Nested sequences are not supported, but you can use several ones next to each
other:
http://example.com/archive[1996-1999]/vol[1-4]/part{a,b,c}.html
You can specify any amount of URLs on the command line. They will be fetched
in a sequential manner in the specified order.
You can specify a step counter for the ranges to get every Nth number or
letter:
http://example.com/file[1-100:10].txt
http://example.com/file[a-z:2].txt
When using [] or {} sequences when invoking curl from a command line prompt,
you probably have to put the full URL within double quotes to keep the shell
from interfering with it. This also goes for other characters treated as
special, like for example '&', '?' and '*'.
Provide the IPv6 zone index in the URL with an escaped percentage sign and the
interface name. Like in
http://[fe80::3%25eth0]/
If you specify a URL without a protocol:// prefix, curl will attempt to guess what
protocol you might want. It will then default to HTTP but try other protocols
based on often-used host name prefixes. For example, for host names starting
with "ftp." curl will assume you want to speak FTP.
curl will do its best to use what you pass to it as a URL. It is not trying to
validate it as a syntactically correct URL by any means but is instead
\fBvery\fP liberal with what it accepts.
curl will attempt to re-use connections for multiple file transfers, so that
getting many files from the same server will not do multiple connects /
handshakes. This improves speed. Of course this is only done on files
specified on a single command line and cannot be used between separate curl
invocations.
.SH "PROGRESS METER"
curl normally displays a progress meter during operations, indicating the
amount of transferred data, transfer speeds and estimated time left, etc. The
progress meter displays the number of bytes and the speeds are in bytes per
second. The suffixes (k, M, G, T, P) are 1024 based. For example 1k is 1024
bytes. 1M is 1048576 bytes.
curl displays this data to the terminal by default, so if you invoke curl to
do an operation and it is about to write data to the terminal, it
\fIdisables\fP the progress meter as otherwise it would mess up the output
mixing progress meter and response data.
If you want a progress meter for HTTP POST or PUT requests, you need to
redirect the response output to a file, using shell redirect (>), -o [file] or
similar.
This is not the case for FTP upload, as that operation does not spit out
any response data to the terminal.
If you prefer a progress "bar" instead of the regular meter, --progress-bar is
your friend.
.SH OPTIONS
Options start with one or two dashes. Many of the options require an
additional value next to them.
The short "single-dash" form of the options, -d for example, may be used with
or without a space between it and its value, although a space is a recommended
separator. The long "double-dash" form, --data for example, requires a space
between it and its value.
Short version options that don't need any additional values can be used
immediately next to each other, like for example you can specify all the
options -O, -L and -v at once as -OLv.
In general, all boolean options are enabled with --\fBoption\fP and yet again
disabled with --\fBno-\fPoption. That is, you use the exact same option name
but prefix it with "no-". However, in this list we mostly only list and show
the --option version of them. (This concept with --no options was added in
7.19.0. Previously most options were toggled on/off on repeated use of the
same command line option.)

docs/cmdline-opts/progress-bar.d Normal file

@@ -0,0 +1,12 @@
Short: #
Long: progress-bar
Tags:
Protocols:
---
Make curl display transfer progress as a simple progress bar instead of the
standard, more informational, meter.
This progress bar draws a single line of '#' characters across the screen and
shows a percentage if the transfer size is known. For transfers without a
known size, it will instead output one '#' character for every 1024 bytes
transferred.
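For illustration (hypothetical URL), combined with writing the response to a
file so the progress output is not mixed with response data:

curl --progress-bar -O https://example.com/file.tar.gz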

docs/cmdline-opts/tlsv1.d Normal file

@@ -0,0 +1,13 @@
Short: 1
Long: tlsv1
Tags: Versions
Protocols: SSL
Added:
Mutexed: tlsv1.1 tlsv1.2
Requires: TLS
See-also: http1.1 http2
---
Forces curl to use TLS version 1.x when negotiating with a remote TLS server.
You can use options --tlsv1.0, --tlsv1.1, --tlsv1.2, and --tlsv1.3 to control
the TLS version more precisely (if the SSL backend in use supports such a
level of control).

docs/cmdline-opts/verbose.d Normal file

@@ -0,0 +1,17 @@
Short: v
Long: verbose
Mutexed: trace trace-ascii
---
Makes curl verbose during the operation. Useful for debugging and seeing
what's going on "under the hood". A line starting with '>' means "header data"
sent by curl, '<' means "header data" received by curl that is hidden in
normal cases, and a line starting with '*' means additional info provided by
curl.
If you only want HTTP headers in the output, --include might be the option
you're looking for.
If you think this option still doesn't give you enough details, consider using
--trace or --trace-ascii instead.
Use --silent to make curl really quiet.
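A typical debugging invocation (hypothetical URL) might be:

curl --verbose https://example.com/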