/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
"                                  _   _ ____  _     \n"
"  Project                     ___| | | | _ \\| |     \n"
"                             / __| | | | |_) | |    \n"
"                            | (__| |_| |  _ <| |___ \n"
"                             \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
" curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
" HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
" curl [options] url\n"
"\n"
"DESCRIPTION\n"
" curl is a client to get documents/files from servers, using\n"
" any of the supported protocols. The command is designed to\n"
" work without user interaction or any kind of interactivity.\n"
"\n"
" curl offers a busload of useful tricks like proxy support,\n"
" user authentication, ftp upload, HTTP post, SSL (https:)\n"
" connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
" The URL syntax is protocol dependent. You'll find a detailed\n"
" description in RFC 2396.\n"
"\n"
" You can specify multiple URLs or parts of URLs by writing\n"
" part sets within braces as in:\n"
"\n"
" http://site.{one,two,three}.com\n"
"\n"
" or you can get sequences of alphanumeric series by using []\n"
" as in:\n"
"\n"
" ftp://ftp.numericals.com/file[1-100].txt\n"
" ftp://ftp.numericals.com/file[001-100].txt (with leading\n"
" zeros)\n"
" ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
" It is possible to specify up to 9 sets or series for a URL,\n"
" but no nesting is supported at the moment:\n"
"\n"
" http://www.any.org/archive[1996-1999]/vol-\n"
" ume[1-4]part{a,b,c,index}.html\n"
"\n"
"OPTIONS\n"
" -a/--append\n"
" (FTP) When used in an ftp upload, this will tell curl to\n"
" append to the target file instead of overwriting it. If\n"
" the file doesn't exist, it will be created.\n"
"\n"
" If this option is used twice, the second one will dis-\n"
" able append mode again.\n"
"\n"
" -A/--user-agent <agent string>\n"
" (HTTP) Specify the User-Agent string to send to the\n"
" HTTP server. Some badly done CGIs fail if it's not set\n"
" to \"Mozilla/4.0\". To encode blanks in the string,\n"
" surround the string with single quote marks. This can\n"
" also be set with the -H/--header flag of course.\n"
"\n"
" If this option is used more than once, the last one\n"
" will be the one to be used.\n"
"\n"
" -b/--cookie <name=data>\n"
" (HTTP) Pass the data to the HTTP server as a cookie. It\n"
" is supposedly the data previously received from the\n"
" server in a \"Set-Cookie:\" line. The data should be in\n"
" the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
" If no '=' letter is used in the line, it is treated as\n"
" a filename to use to read previously stored cookie\n"
" lines from, which should be used in this session if\n"
" they match. Using this method also activates the\n"
" \"cookie parser\" which will make curl record incoming\n"
" cookies too, which may be handy if you're using this in\n"
" combination with the -L/--location option. The file\n"
" format of the file to read cookies from should be plain\n"
" HTTP headers or the Netscape cookie file format.\n"
"\n"
" NOTE that the file specified with -b/--cookie is only\n"
" used as input. No cookies will be stored in the file.\n"
" To store cookies, save the HTTP headers to a file using\n"
" -D/--dump-header!\n"
"\n"
" If this option is used more than once, the last one\n"
" will be the one to be used.\n"
"\n"
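" For example (www.example.com and headers.txt are just\n"
" placeholder names), cookies received in a first request\n"
" can be saved with -D and passed back in a second request\n"
" with -b:\n"
"\n"
" curl -D headers.txt http://www.example.com/\n"
" curl -b headers.txt http://www.example.com/\n"
"\n"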
" -B/--use-ascii\n"
" Use ASCII transfer when getting an FTP file or LDAP\n"
" info. For FTP, this can also be enforced by using a\n"
" URL that ends with \";type=A\". This option causes data\n"
" sent to stdout to be in text mode for win32 systems.\n"
"\n"
" If this option is used twice, the second one will dis-\n"
" able ASCII usage.\n"
"\n"
" -c/--continue\n"
" Deprecated. Use '-C -' instead. Continue/Resume a pre-\n"
" vious file transfer. This instructs curl to continue\n"
" appending data on the file where it was previously\n"
" left, possibly because of a broken connection to the\n"
" server. There must be a named physical file to append\n"
" to for this to work. Note: Upload resume is depending\n"
" on a command named SIZE not always present in all ftp\n"
" servers! Upload resume is for FTP only. HTTP resume is\n"
" only possible with HTTP/1.1 or later servers.\n"
"\n"
" -C/--continue-at <offset>\n"
" Continue/Resume a previous file transfer at the given\n"
" offset. The given offset is the exact number of bytes\n"
" that will be skipped counted from the beginning of the\n"
" source file before it is transferred to the destination.\n"
" If used with uploads, the ftp server command SIZE will\n"
" not be used by curl. Upload resume is for FTP only.\n"
" HTTP resume is only possible with HTTP/1.1 or later\n"
" servers.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -d/--data <data>\n"
" (HTTP) Sends the specified data in a POST request to\n"
" the HTTP server. Note that the data is sent exactly as\n"
" specified with no extra processing (with all newlines\n"
" cut off). The data is expected to be \"url-encoded\".\n"
" This will cause curl to pass the data to the server\n"
" using the content-type application/x-www-form-urlen-\n"
" coded. Compare to -F. If more than one -d/--data option\n"
" is used on the same command line, the data pieces spec-\n"
" ified will be merged together with a separating &-let-\n"
" ter. Thus, using '-d name=daniel -d skill=lousy' would\n"
" generate a post chunk that looks like\n"
" 'name=daniel&skill=lousy'.\n"
"\n"
" If you start the data with the letter @, the rest\n"
" should be a file name to read the data from, or - if\n"
" you want curl to read the data from stdin. The con-\n"
" tents of the file must already be url-encoded. Multiple\n"
" files can also be specified.\n"
"\n"
" To post data purely binary, you should instead use the\n"
" --data-binary option.\n"
"\n"
" -d/--data is the same as --data-ascii.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" --data-ascii <data>\n"
" (HTTP) This is an alias for the -d/--data option.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" --data-binary <data>\n"
" (HTTP) This posts data in a similar manner as --data-\n"
" ascii does, although when using this option the entire\n"
" contents of the posted data is kept as-is. If you want\n"
" to post a binary file without the strip-newlines fea-\n"
" ture of the --data-ascii option, this is for you.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -D/--dump-header <file>\n"
" (HTTP/FTP) Write the HTTP headers to this file. Write\n"
" the FTP file info to this file if -I/--head is used.\n"
"\n"
" This option is handy to use when you want to store the\n"
" cookies that an HTTP site sends to you. The cookies\n"
" could then be read in a second curl invocation by using\n"
" the -b/--cookie option!\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -e/--referer <URL>\n"
" (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
" server. This can also be set with the -H/--header flag\n"
" of course. When used with -L/--location you can append\n"
" \";auto\" to the referer URL to make curl automatically\n"
" set the previous URL when it follows a Location:\n"
" header. The \";auto\" string can be used alone, even if\n"
" you don't set an initial referer.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -E/--cert <certificate[:password]>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
" file when getting a file with HTTPS. The certificate\n"
" must be in PEM format. If the optional password isn't\n"
" specified, it will be queried for on the terminal. Note\n"
" that this certificate is the private key and the pri-\n"
" vate certificate concatenated!\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" --cacert <CA certificate>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
" file to verify the peer. The certificate must be in PEM\n"
" format.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -f/--fail\n"
" (HTTP) Fail silently (no output at all) on server\n"
" errors. This is mostly done to better enable\n"
" scripts etc. to deal with failed attempts. In\n"
" normal cases when an HTTP server fails to deliver a doc-\n"
" ument, it returns an HTML document stating so (which\n"
" often also describes why and more). This flag will\n"
" prevent curl from outputting that and fail silently\n"
" instead.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable silent failure.\n"
"\n"
" -F/--form <name=content>\n"
" (HTTP) This lets curl emulate a filled in form in which\n"
" a user has pressed the submit button. This causes curl\n"
" to POST data using the content-type multipart/form-data\n"
" according to RFC1867. This enables uploading of binary\n"
" files etc. To force the 'content' part to be a file,\n"
" prefix the file name with an @ sign. To just get the\n"
" content part from a file, prefix the file name with the\n"
" letter <. The difference between @ and < is then that @\n"
" makes a file get attached in the post as a file upload,\n"
" while the < makes a text field and just gets the con-\n"
" tents for that text field from a file.\n"
"\n"
" Example, to send your password file to the server, where\n"
" 'password' is the name of the form-field to which\n"
" /etc/passwd will be the input:\n"
"\n"
" curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
" To read the file's content from stdin instead of a file,\n"
" use - where the file name should've been. This goes for\n"
);
puts(
" both @ and < constructs.\n"
"\n"
" This option can be used multiple times.\n"
"\n"
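" As another illustration of the difference between @ and <\n"
" (the file names and site below are just placeholders), this\n"
" would send portrait.jpg as a file upload, but insert the\n"
" contents of story.txt as the value of a plain text field:\n"
"\n"
" curl -F \"picture=@portrait.jpg\" -F \"story=<story.txt\" www.example.com\n"
"\n"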
\n" " (HTTP) Extra header to use when getting a web page. You\n" " may specify any number of extra headers. Note that if\n" " you should add a custom header that has the same name\n" " as one of the internal ones curl would use, your exter­\n" " nally set header will be used instead of the internal\n" " one. This allows you to make even trickier stuff than\n" " curl would normally do. You should not replace inter­\n" " nally set headers without knowing perfectly well what\n" " you're doing. Replacing an internal header with one\n" " without content on the right side of the colon will\n" " prevent that header from appearing.\n" "\n" " This option can be used multiple times.\n" "\n" " -i/--include\n" " (HTTP) Include the HTTP-header in the output. The HTTP-\n" " header includes things like server-name, date of the\n" " document, HTTP-version and more...\n" " If this option is used twice, the second will again\n" " disable header include.\n" "\n" " --interface \n" " Perform an operation using a specified interface. You\n" " can enter interface name, IP address or host name. An\n" " example could look like:\n" "\n" " curl --interface eth0:1 http://www.netscape.com/\n" "\n" " If this option is used serveral times, the last one\n" " will be used.\n" "\n" " -I/--head\n" " (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n" " feature the command HEAD which this uses to get nothing\n" " but the header of a document. When used on a FTP file,\n" " curl displays the file size only.\n" "\n" " If this option is used twice, the second will again\n" " disable header only.\n" "\n" " --krb4 \n" " (FTP) Enable kerberos4 authentication and use. The\n" " level must be entered and should be one of 'clear',\n" " 'safe', 'confidential' or 'private'. Should you use a\n" " level that is not one of these, 'private' will instead\n" " be used.\n" "\n" " If this option is used serveral times, the last one\n" " will be used.\n" "\n" " -K/--config \n" " Specify which config file to read curl arguments from.\n" " The config file is a text file in which command line\n" " arguments can be written which then will be used as if\n" " they were written on the actual command line. Options\n" " and their parameters must be specified on the same con­\n" " fig file line. If the parameter is to contain white\n" " spaces, the parameter must be inclosed within quotes.\n" " If the first column of a config line is a '#' charac­\n" " ter, the rest of the line will be treated as a comment.\n" "\n" " Specify the filename as '-' to make curl read the file\n" " from stdin.\n" "\n" " This option can be used multiple times.\n" "\n" " -l/--list-only\n" " (FTP) When listing an FTP directory, this switch forces\n" " a name-only view. Especially useful if you want to\n" " machine-parse the contents of an FTP directory since\n" " the normal directory view doesn't use a standard look\n" " or format.\n" "\n" " If this option is used twice, the second will again\n" " disable list only.\n" "\n" " -L/--location\n" " (HTTP/HTTPS) If the server reports that the requested\n" " page has a different location (indicated with the\n" " header line Location:) this flag will let curl attempt\n" " to reattempt the get on the new place. If used together\n" " with -i or -I, headers from all requested pages will be\n" " shown. 
" -l/--list-only\n"
" (FTP) When listing an FTP directory, this switch forces\n"
" a name-only view. Especially useful if you want to\n"
" machine-parse the contents of an FTP directory since\n"
" the normal directory view doesn't use a standard look\n"
" or format.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable list only.\n"
"\n"
" -L/--location\n"
" (HTTP/HTTPS) If the server reports that the requested\n"
" page has a different location (indicated with the\n"
" header line Location:) this flag will let curl attempt\n"
" the get again on the new place. If used together\n"
" with -i or -I, headers from all requested pages will be\n"
" shown. If this flag is used when making an HTTP POST,\n"
" curl will automatically switch to GET after the initial\n"
" POST has been done.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable location following.\n"
"\n"
" -m/--max-time <seconds>\n"
" Maximum time in seconds that you allow the whole opera-\n"
" tion to take. This is useful for preventing your batch\n"
" jobs from hanging for hours due to slow networks or\n"
" links going down. This doesn't work fully in win32\n"
" systems.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -M/--manual\n"
" Manual. Display the huge help text.\n"
"\n"
" -n/--netrc\n"
" Makes curl scan the .netrc file in the user's home\n"
" directory for login name and password. This is typi-\n"
" cally used for ftp on unix. If used with http, curl\n"
" will enable user authentication. See netrc(4) for\n"
" details on the file format. Curl will not complain if\n"
" that file doesn't have the right permissions (it should\n"
" not be world nor group readable). The environment\n"
" variable \"HOME\" is used to find the home directory.\n"
"\n"
" A quick and very simple example of how to setup a\n"
" .netrc to allow curl to ftp to the machine\n"
" host.domain.com with user name 'myself' and password\n"
" 'secret' should look similar to:\n"
"\n"
" machine host.domain.com login myself password secret\n"
"\n"
" If this option is used twice, the second will again\n"
" disable netrc usage.\n"
"\n"
" -N/--no-buffer\n"
" Disables the buffering of the output stream. In normal\n"
" work situations, curl will use a standard buffered out-\n"
" put stream that will have the effect that it will out-\n"
" put the data in chunks, not necessarily exactly when\n"
" the data arrives. Using this option will disable that\n"
" buffering.\n"
"\n"
" If this option is used twice, the second will again\n"
" switch on buffering.\n"
"\n"
" -o/--output <file>\n"
" Write output to <file> instead of stdout. If you are\n"
" using {} or [] to fetch multiple documents, you can use\n"
" '#' followed by a number in the <file> specifier. That\n"
" variable will be replaced with the current string for\n"
" the URL being fetched. Like in:\n"
"\n"
" curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
" or use several variables like:\n"
"\n"
" curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -O/--remote-name\n"
" Write output to a local file named like the remote file\n"
" we get. (Only the file part of the remote file is used,\n"
" the path is cut off.)\n"
"\n"
" -p/--proxytunnel\n"
" When an HTTP proxy is used, this option will cause non-\n"
" HTTP protocols to attempt to tunnel through the proxy\n"
" instead of merely using it to do HTTP-like operations.\n"
" The tunnel approach is made with the HTTP proxy CONNECT\n"
" request and requires that the proxy allows direct con-\n"
" nect to the remote port number curl wants to tunnel\n"
" through to.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable proxy tunnel.\n"
"\n"
" -P/--ftpport <address>\n"
\n" " (FTP) Reverses the initiator/listener roles when con­\n" " necting with ftp. This switch makes Curl use the PORT\n" " command instead of PASV. In practice, PORT tells the\n" " server to connect to the client's specified address and\n" " port, while PASV asks the server for an ip address and\n" " port to connect to.
" -q If used as the first parameter on the command line, the\n"
" $HOME/.curlrc file will not be read and used as a con-\n"
" fig file.\n"
"\n"
" -Q/--quote <command>\n"
" (FTP) Send an arbitrary command to the remote FTP\n"
" server, by using the QUOTE command of the server. Not\n"
" all servers support this command, and the set of QUOTE\n"
" commands are server specific! Quote commands are sent\n"
" BEFORE the transfer is taking place. To make commands\n"
" take place after a successful transfer, prefix them\n"
" with a dash '-'. You may specify any number of commands\n"
" to be run before and after the transfer. If the server\n"
" returns failure for one of the commands, the entire\n"
" operation will be aborted.\n"
"\n"
" This option can be used multiple times.\n"
"\n"
" -r/--range <range>\n"
" (HTTP/FTP) Retrieve a byte range (i.e. a partial docu-\n"
" ment) from an HTTP/1.1 or FTP server. Ranges can be\n"
" specified in a number of ways.\n"
"\n"
" 0-499 specifies the first 500 bytes\n"
"\n"
" 500-999 specifies the second 500 bytes\n"
"\n"
" -500 specifies the last 500 bytes\n"
"\n"
" 9500 specifies the bytes from offset 9500 and for-\n"
" ward\n"
"\n"
" 0-0,-1 specifies the first and last byte only(*)(H)\n"
"\n"
" 500-700,600-799\n"
" specifies 300 bytes from offset 500(H)\n"
"\n"
" 100-199,500-599\n"
" specifies two separate 100-byte ranges(*)(H)\n"
"\n"
" (*) = NOTE that this will cause the server to reply with a\n"
" multipart response!\n"
"\n"
" You should also be aware that many HTTP/1.1 servers do not\n"
" have this feature enabled, so that when you attempt to get a\n"
" range, you'll instead get the whole document.\n"
"\n"
" FTP range downloads only support the simple syntax 'start-\n"
" stop' (optionally with one of the numbers omitted). It\n"
" depends on the non-RFC command SIZE.\n"
"\n"
" If this option is used several times, the last one will be\n"
" used.\n"
"\n"
" -s/--silent\n"
" Silent mode. Don't show progress meter or error mes-\n"
" sages. Makes Curl mute.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable mute.\n"
"\n"
" -S/--show-error\n"
" When used with -s it makes curl show an error message\n"
" if it fails.\n"
"\n"
" If this option is used twice, the second will again\n"
);
puts(
" disable show error.\n"
"\n"
" -t/--upload\n"
" Deprecated. Use '-T -' instead. Transfer the stdin\n"
" data to the specified file. Curl will read everything\n"
" from stdin until EOF and store with the supplied name.\n"
" If this is used on an http(s) server, the PUT command\n"
" will be used.\n"
"\n"
" -T/--upload-file <file>\n"
" Like -t, but this transfers the specified local file.\n"
" If there is no file part in the specified URL, Curl\n"
" will append the local file name. NOTE that you must use\n"
" a trailing / on the last directory to really prove to\n"
" Curl that there is no file name or curl will think that\n"
" your last directory name is the remote file name to\n"
" use. That will most likely cause the upload operation\n"
" to fail. If this is used on an http(s) server, the PUT\n"
" command will be used.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
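" For instance (ftp.example.com, incoming and localfile are\n"
" placeholder names), the trailing slash below tells curl\n"
" that there is no remote file name, so the local name is\n"
" appended:\n"
"\n"
" curl -T localfile ftp://ftp.example.com/incoming/\n"
"\n"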
" -u/--user <user:password>\n"
" Specify user and password to use when fetching. See\n"
" README.curl for detailed examples of how to use this.\n"
" If no password is specified, curl will ask for it\n"
" interactively.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -U/--proxy-user <user:password>\n"
" Specify user and password to use for Proxy authentica-\n"
" tion. If no password is specified, curl will ask for it\n"
" interactively.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" --url <URL>\n"
" Set the URL to fetch. This option is mostly handy when\n"
" you want to specify a URL in a config file.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -v/--verbose\n"
" Makes the fetching more verbose/talkative. Mostly\n"
" usable for debugging. Lines starting with '>' mean\n"
" data sent by curl, '<' means data received by curl that\n"
" is hidden in normal cases and lines starting with '*'\n"
" mean additional info provided by curl.\n"
"\n"
" If this option is used twice, the second will again\n"
" disable verbose.\n"
"\n"
" -V/--version\n"
" Displays the full version of curl, libcurl and other\n"
" 3rd party libraries linked with the executable.\n"
"\n"
" -w/--write-out <format>\n"
" Defines what to display after a completed and success-\n"
" ful operation. The format is a string that may contain\n"
" plain text mixed with any number of variables. The\n"
" string can be specified as \"string\", to get it read\n"
" from a particular file you specify it \"@filename\" and\n"
" to tell curl to read the format from stdin you write\n"
" \"@-\".\n"
"\n"
" The variables present in the output format will be sub-\n"
" stituted by the value or text that curl thinks fit, as\n"
" described below. All variables are specified like\n"
" %{variable_name} and to output a normal % you just\n"
" write them like %%. You can output a newline by using\n"
" \\n, a carriage return with \\r and a tab space with \\t.\n"
" NOTE: The %-letter is a special letter in the\n"
" win32-environment, where all occurrences of % must be\n"
" doubled when using this option.\n"
"\n"
" Available variables are at this point:\n"
"\n"
" url_effective The URL that was fetched last. This is\n"
" mostly meaningful if you've told curl to\n"
" follow location: headers.\n"
"\n"
" http_code The numerical code that was found in the\n"
" last retrieved HTTP(S) page.\n"
"\n"
" time_total The total time, in seconds, that the\n"
" full operation lasted. The time will be\n"
" displayed with millisecond resolution.\n"
"\n"
" time_namelookup\n"
" The time, in seconds, it took from the\n"
" start until the name resolving was com-\n"
" pleted.\n"
"\n"
" time_connect The time, in seconds, it took from the\n"
" start until the connect to the remote\n"
" host (or proxy) was completed.\n"
"\n"
" time_pretransfer\n"
" The time, in seconds, it took from the\n"
" start until the file transfer is just\n"
" about to begin. This includes all pre-\n"
" transfer commands and negotiations that\n"
" are specific to the particular proto-\n"
" col(s) involved.\n"
"\n"
" size_download The total amount of bytes that were\n"
" downloaded.\n"
"\n"
" size_upload The total amount of bytes that were\n"
" uploaded.\n"
"\n"
" size_header The total amount of bytes of the down-\n"
" loaded headers.\n"
"\n"
" size_request The total amount of bytes that were sent\n"
" in the HTTP request.\n"
"\n"
" speed_download The average download speed that curl\n"
" measured for the complete download.\n"
"\n"
" speed_upload The average upload speed that curl mea-\n"
" sured for the complete upload.\n"
"\n"
" If this option is used several times, the last one will be\n"
" used.\n"
"\n"
" -x/--proxy <proxyhost[:port]>\n"
" Use specified proxy. If the port number is not speci-\n"
" fied, it is assumed to be 1080.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -X/--request <command>\n"
" (HTTP) Specifies a custom request to use when communi-\n"
" cating with the HTTP server. The specified request\n"
" will be used instead of the standard GET. Read the HTTP\n"
" 1.1 specification for details and explanations.\n"
"\n"
" (FTP) Specifies a custom FTP command to use instead of\n"
" LIST when doing file lists with ftp.\n"
"\n"
" If this option is used several times, the last one\n"
" will be used.\n"
"\n"
" -y/--speed-time