/* NEVER EVER edit this manually, fix the mkhelp script instead! */
#include <stdio.h>
void hugehelp(void)
{
puts (
"                                  _   _ ____  _     \n"
"                  Project    ___| | | | _ \\| |    \n"
"                            / __| | | | |_) | |   \n"
"                           | (__| |_| |  _ <| |___ \n"
"                            \\___|\\___/|_| \\_\\_____|\n"
"NAME\n"
"       curl - get a URL with FTP, TELNET, LDAP, GOPHER, DICT, FILE,\n"
"       HTTP or HTTPS syntax.\n"
"\n"
"SYNOPSIS\n"
"       curl [options] url\n"
"\n"
"DESCRIPTION\n"
"       curl is a client to get documents/files from servers, using\n"
"       any of the supported protocols. The command is designed to\n"
"       work without user interaction or any kind of interactivity.\n"
"\n"
"       curl offers a busload of useful tricks like proxy support,\n"
"       user authentication, ftp upload, HTTP post, SSL (https:)\n"
"       connections, cookies, file transfer resume and more.\n"
"\n"
"URL\n"
"       The URL syntax is protocol dependent. You'll find a detailed\n"
"       description in RFC 2396.\n"
"\n"
"       You can specify multiple URLs or parts of URLs by writing\n"
"       part sets within braces as in:\n"
"\n"
"        http://site.{one,two,three}.com\n"
"\n"
"       or you can get sequences of alphanumeric series by using []\n"
"       as in:\n"
"\n"
"        ftp://ftp.numericals.com/file[1-100].txt\n"
"        ftp://ftp.numericals.com/file[001-100].txt (with leading\n"
"        zeros)\n"
"        ftp://ftp.letters.com/file[a-z].txt\n"
"\n"
"       It is possible to specify up to 9 sets or series for a URL,\n"
"       but no nesting is supported at the moment:\n"
"\n"
"        http://www.any.org/archive[1996-1999]/volume[1-4]part{a,b,c,index}.html\n"
"\n"
"OPTIONS\n"
"       -a/--append\n"
"              (FTP) When used in an ftp upload, this will tell curl\n"
"              to append to the target file instead of overwriting\n"
"              it. If the file doesn't exist, it will be created.\n"
"\n"
"       -A/--user-agent <agent string>\n"
"              (HTTP) Specify the User-Agent string to send to the\n"
"              HTTP server. Some badly done CGIs fail if it's not\n"
"              set to \"Mozilla/4.0\". To encode blanks in the\n"
"              string, surround the string with single quote marks.\n"
"              This can also be set with the -H/--header flag of\n"
"              course.\n"
"\n"
"       -b/--cookie <name=data>\n"
"              (HTTP) Pass the data to the HTTP server as a cookie.\n"
"              It is supposedly the data previously received from\n"
"              the server in a \"Set-Cookie:\" line. The data should\n"
"              be in the format \"NAME1=VALUE1; NAME2=VALUE2\".\n"
"\n"
"              If no '=' letter is used in the line, it is treated\n"
"              as a filename to use to read previously stored cookie\n"
"              lines from, which should be used in this session if\n"
"              they match. Using this method also activates the\n"
"              \"cookie parser\" which will make curl record incoming\n"
"              cookies too, which may be handy if you're using this\n"
"              in combination with the -L/--location option. The\n"
"              file format of the file to read cookies from should\n"
"              be plain HTTP headers or the netscape cookie file\n"
"              format.\n"
"\n"
"              NOTE that the file specified with -b/--cookie is only\n"
"              used as input. No cookies will be stored in the file.\n"
"              To store cookies, save the HTTP headers to a file\n"
"              using -D/--dump-header!\n"
"\n"
"       -B/--ftp-ascii\n"
"              (FTP/LDAP) Use ASCII transfer when getting an FTP\n"
"              file or LDAP info. For FTP, this can also be enforced\n"
"              by using an URL that ends with \";type=A\".\n"
"\n"
"       -c/--continue\n"
"              Continue/Resume a previous file transfer. This\n"
"              instructs curl to continue appending data on the file\n"
"              where it was previously left, possibly because of a\n"
"              broken connection to the server. There must be a\n"
"              named physical file to append to for this to work.\n"
"              Note: Upload resume depends on a command named SIZE,\n"
"              which is not present in all ftp servers! Upload\n"
"              resume is for FTP only. HTTP resume is only possible\n"
"              with HTTP/1.1 or later servers.\n"
"\n"
"       -C/--continue-at <offset>\n"
"              Continue/Resume a previous file transfer at the given\n"
"              offset.\n"
"              The given offset is the exact number of bytes that\n"
"              will be skipped counted from the beginning of the\n"
"              source file before it is transferred to the\n"
"              destination. If used with uploads, the ftp server\n"
"              command SIZE will not be used by curl. Upload resume\n"
"              is for FTP only. HTTP resume is only possible with\n"
"              HTTP/1.1 or later servers.\n"
"\n"
"       -d/--data <data>\n"
"              (HTTP) Sends the specified data in a POST request to\n"
"              the HTTP server. Note that the data is sent exactly\n"
"              as specified with no extra processing. The data is\n"
"              expected to be \"url-encoded\". This will cause curl\n"
"              to pass the data to the server using the content-type\n"
"              application/x-www-form-urlencoded. Compare to -F.\n"
"\n"
"              If you start the data with the letter @, the rest\n"
"              should be a file name to read the data from, or - if\n"
"              you want curl to read the data from stdin. The\n"
"              contents of the file must already be url-encoded.\n"
"\n"
"       -D/--dump-header <file>\n"
"              (HTTP/FTP) Write the HTTP headers to this file. Write\n"
"              the FTP file info to this file if -I/--head is used.\n"
"\n"
"              This option is handy to use when you want to store\n"
"              the cookies that a HTTP site sends to you. The\n"
"              cookies could then be read in a second curl invoke by\n"
"              using the -b/--cookie option!\n"
"\n"
"       -e/--referer <URL>\n"
"              (HTTP) Sends the \"Referer Page\" information to the\n"
"              HTTP server. Some badly done CGIs fail if it's not\n"
"              set. This can also be set with the -H/--header flag\n"
"              of course.\n"
"\n"
"       -E/--cert <certificate[:password]>\n"
"              (HTTPS) Tells curl to use the specified certificate\n"
"              file when getting a file with HTTPS. The certificate\n"
"              must be in PEM format. If the optional password isn't\n"
"              specified, it will be queried for on the terminal.\n"
"              Note that this certificate is the private key and the\n"
"              private certificate concatenated!\n"
"\n"
"       -f/--fail\n"
"              (HTTP) Fail silently (no output at all) on server\n"
"              errors.\n"
"              This is mostly done to better enable scripts etc to\n"
"              deal with failed attempts. In normal cases when a\n"
"              HTTP server fails to deliver a document, it returns a\n"
"              HTML document stating so (which often also describes\n"
"              why and more). This flag will prevent curl from\n"
"              outputting that and fail silently instead.\n"
"\n"
"       -F/--form <name=content>\n"
"              (HTTP) This lets curl emulate a filled in form in\n"
"              which a user has pressed the submit button. This\n"
"              causes curl to POST data using the content-type\n"
"              multipart/form-data according to RFC1867. This\n"
"              enables uploading of binary files etc. To force the\n"
"              'content' part to be read from a file, prefix the\n"
"              file name with an @ sign. Example, to send your\n"
"              password file to the server, where 'password' is the\n"
"              name of the form-field to which /etc/passwd will be\n"
"              the input:\n"
"\n"
"              curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
"              To read the file's content from stdin instead of a\n"
"              file, use - where the file name should've been.\n"
"\n"
"       -h/--help\n"
"              Usage help.\n"
"\n"
"       -H/--header <header>\n"
"              (HTTP) Extra header to use when getting a web page.\n"
"              You may specify any number of extra headers. Note\n"
"              that if you should add a custom header that has the\n"
"              same name as one of the internal ones curl would use,\n"
"              your externally set header will be used instead of\n"
"              the internal one. This allows you to make even\n"
"              trickier stuff than curl would normally do. You\n"
"              should not replace internally set headers without\n"
"              knowing perfectly well what you're doing.\n"
"\n"
"       -i/--include\n"
"              (HTTP) Include the HTTP-header in the output. The\n"
"              HTTP-header includes things like server-name, date of\n"
"              the document, HTTP-version and more...\n"
"\n"
"       -I/--head\n"
"              (HTTP/FTP) Fetch the HTTP-header only! HTTP-servers\n"
"              feature the command HEAD which this uses to get\n"
"              nothing but the header of a document. When used on a\n"
"              FTP file, curl displays the file size only.\n"
"\n"
"       -K/--config <config file>\n"
"              Specify which config file to read curl arguments\n"
"              from. The config file is a text file in which command\n"
"              line arguments can be written which then will be used\n"
"              as if they were written on the actual command line.\n"
"              If the first column of a config line is a '#'\n"
"              character, the rest of the line will be treated as a\n"
"              comment.\n"
"\n"
"              Specify the filename as '-' to make curl read the\n"
"              file from stdin.\n"
"\n"
"       -l/--list-only\n"
"              (FTP) When listing an FTP directory, this switch\n"
"              forces a name-only view. Especially useful if you\n"
"              want to machine-parse the contents of an FTP\n"
"              directory since the normal directory view doesn't use\n"
"              a standard look or format.\n"
"\n"
"       -L/--location\n"
"              (HTTP/HTTPS) If the server reports that the requested\n"
"              page has a different location (indicated with the\n"
"              header line Location:) this flag will make curl\n"
"              attempt the request again on the new place.\n"
"              If used together with -i or -I, headers from all\n"
"              requested pages will be shown.\n"
"\n"
"       -m/--max-time <seconds>\n"
"              Maximum time in seconds that you allow the whole\n"
"              operation to take. This is useful for preventing your\n"
"              batch jobs from hanging for hours due to slow\n"
"              networks or links going down. This doesn't work\n"
"              properly in win32 systems.\n"
"\n"
"       -M/--manual\n"
"              Manual. Display the huge help text.\n"
"\n"
"       -n/--netrc\n"
"              Makes curl scan the .netrc file in the user's home\n"
"              directory for login name and password. This is\n"
"              typically used for ftp on unix. If used with http,\n"
"              curl will enable user authentication. See netrc(4)\n"
"              for details on the file format. Curl will not\n"
"              complain if that file doesn't have the right\n"
"              permissions (it should not be world nor group\n"
"              readable). The environment variable \"HOME\" is used\n"
"              to find the home directory.\n"
"\n"
"              A quick and very simple example of how to setup a\n"
"              .netrc to allow curl to ftp to the machine\n"
"              host.domain.com with user name 'myself' and password\n"
"              'secret' should look similar to:\n"
"\n"
"              machine host.domain.com user myself password secret\n"
"\n"
"       -N/--no-buffer\n"
"              Disables the buffering of the output stream. In\n"
"              normal work situations, curl will use a standard\n"
"              buffered output stream that will have the effect that\n"
"              it will output the data in chunks, not necessarily\n"
"              exactly when the data arrives. Using this option will\n"
"              disable that buffering.\n"
"\n"
"       -o/--output <file>\n"
"              Write output to <file> instead of stdout. If you are\n"
"              using {} or [] to fetch multiple documents, you can\n"
"              use #[num] in the <file> specifier. That variable\n"
"              will be replaced with the current string for the URL\n"
"              being fetched. Like in:\n"
"\n"
"              curl http://{one,two}.site.com -o \"file_#1.txt\"\n"
"\n"
"              or use several variables like:\n"
"\n"
"              curl http://{site,host}.host[1-5].com -o \"#1_#2\"\n"
"\n"
"       -O/--remote-name\n"
"              Write output to a local file named like the remote\n"
"              file we get.\n"
"              (Only the file part of the remote file is used, the\n"
"              path is cut off.)\n"
"\n"
"       -P/--ftpport <address>\n"
"              (FTP) Reverses the initiator/listener roles when\n"
"              connecting with ftp. This switch makes Curl use the\n"
"              PORT command instead of PASV. In practice, PORT tells\n"
"              the server to connect to the client's specified\n"
"              address and port, while PASV asks the server for an\n"
"              ip address and port to connect to.\n"
"              <address> should be one of:\n"
"\n"
"              interface i.e. \"eth0\" to specify which interface's\n"
"                        IP address you want to use (Unix only)\n"
"\n"
"              IP address i.e. \"192.168.10.1\" to specify exact IP\n"
"                        number\n"
"\n"
"              host name i.e. \"my.host.domain\" to specify machine\n"
"\n"
"              -         (any single-letter string) to make it pick\n"
"                        the machine's default\n"
"\n"
"       -q     If used as the first parameter on the command line,\n"
"              the $HOME/.curlrc file will not be read and used as a\n"
"              config file.\n"
"\n"
"       -Q/--quote <command>\n"
"              (FTP) Send an arbitrary command to the remote FTP\n"
"              server, by using the QUOTE command of the server. Not\n"
"              all servers support this command, and the set of\n"
"              QUOTE commands are server specific! Quote commands\n"
"              are sent BEFORE the transfer is taking place. To make\n"
"              commands take place after a successful transfer,\n"
"              prefix them with a dash '-'. You may specify any\n"
"              number of commands to be run before and after the\n"
"              transfer. If the server returns failure for one of\n"
"              the commands, the entire operation will be aborted.\n"
"\n"
"       -r/--range <range>\n"
"              (HTTP/FTP) Retrieve a byte range (i.e. a partial\n"
"              document) from a HTTP/1.1 or FTP server.\n"
"              Ranges can be specified in a number of ways.\n"
"\n"
"              0-499     specifies the first 500 bytes\n"
"\n"
"              500-999   specifies the second 500 bytes\n"
"\n"
"              -500      specifies the last 500 bytes\n"
"\n"
"              9500-     specifies the bytes from offset 9500 and\n"
"                        forward\n"
"\n"
"              0-0,-1    specifies the first and last byte only(*)(H)\n"
"\n"
"              500-700,600-799\n"
"                        specifies 300 bytes from offset 500(H)\n"
"\n"
"              100-199,500-599\n"
"                        specifies two separate 100 bytes ranges(*)(H)\n"
"\n"
"       (*) = NOTE that this will cause the server to reply with a\n"
"       multipart response!\n"
"\n"
"       You should also be aware that many HTTP/1.1 servers do not\n"
"       have this feature enabled, so that when you attempt to get a\n"
"       range, you'll instead get the whole document.\n"
"\n"
"       FTP range downloads only support the simple syntax 'start-\n"
"       stop' (optionally with one of the numbers omitted). It\n"
"       depends on the non-RFC command SIZE.\n"
"\n"
"       -s/--silent\n"
"              Silent mode. Don't show progress meter or error\n"
"              messages. Makes Curl mute.\n"
"\n"
"       -S/--show-error\n"
"              When used with -s it makes curl show an error message\n"
"              if it fails.\n"
"\n"
"       -t/--upload\n"
"              Transfer the stdin data to the specified file. Curl\n"
"              will read everything from stdin until EOF and store\n"
"              with the supplied name. If this is used on a http(s)\n"
"              server, the PUT command will be used.\n"
"\n"
"       -T/--upload-file <file>\n"
"              Like -t, but this transfers the specified local file.\n"
"              If there is no file part in the specified URL, Curl\n"
"              will append the local file name. NOTE that you must\n"
"              use a trailing / on the last directory to really\n"
"              prove to Curl that there is no file name or curl will\n"
"              think that your last directory name is the remote\n"
"              file name to use. That will most likely cause the\n"
"              upload operation to fail. If this is used on a\n"
"              http(s) server, the PUT command will be used.\n"
"\n"
"       -u/--user <user:password>\n"
"              Specify user and password to use when fetching.\n"
"              See README.curl for detailed examples of how to use\n"
"              this. If no password is specified, curl will ask for\n"
"              it interactively.\n"
"\n"
"       -U/--proxy-user <user:password>\n"
"              Specify user and password to use for Proxy\n"
"              authentication. If no password is specified, curl\n"
"              will ask for it interactively.\n"
"\n"
"       -v/--verbose\n"
"              Makes the fetching more verbose/talkative. Mostly\n"
"              usable for debugging. Lines starting with '>' mean\n"
"              data sent by curl, '<' means data received by curl\n"
"              that is hidden in normal cases and lines starting\n"
"              with '*' mean additional info provided by curl.\n"
"\n"
"       -V/--version\n"
"              Displays the full version of curl, libcurl and other\n"
"              3rd party libraries linked with the executable.\n"
"\n"
"       -w/--write-out <format>\n"
"              Defines what to display after a completed and\n"
"              successful operation. The format is a string that may\n"
"              contain plain text mixed with any number of\n"
"              variables. The string can be specified as \"string\",\n"
"              to get read from a particular file you specify it\n"
"              \"@filename\" and to tell curl to read the format from\n"
"              stdin you write \"@-\".\n"
"\n"
"              The variables present in the output format will be\n"
"              substituted by the value or text that curl thinks\n"
"              fit, as described below. All variables are specified\n"
"              like %{variable_name} and to output a normal % you\n"
"              just write them like %%. You can output a newline by\n"
"              using \\n, a carriage return with \\r and a tab space\n"
"              with \\t.\n"
"\n"
"              NOTE: The %-letter is a special letter in the\n"
"              win32-environment, where all occurrences of % must be\n"
"              doubled when using this option.\n"
"\n"
"              Available variables are at this point:\n"
"\n"
"              url_effective  The URL that was fetched last. This is\n"
"                             mostly meaningful if you've told curl\n"
"                             to follow location: headers.\n"
"\n"
"              http_code      The numerical code that was found in\n"
"                             the last retrieved HTTP(S) page.\n"
"\n"
"              time_total     The total time, in seconds, that the\n"
"                             full operation lasted.\n"
"              The time will be displayed with millisecond\n"
"              resolution.\n"
"\n"
"              time_namelookup\n"
"                             The time, in seconds, it took from the\n"
"                             start until the name resolving was\n"
"                             completed.\n"
"\n"
"              time_connect   The time, in seconds, it took from the\n"
"                             start until the connect to the remote\n"
"                             host (or proxy) was completed.\n"
"\n"
"              time_pretransfer\n"
"                             The time, in seconds, it took from the\n"
"                             start until the file transfer is just\n"
"                             about to begin. This includes all pre-\n"
"                             transfer commands and negotiations\n"
"                             that are specific to the particular\n"
"                             protocol(s) involved.\n"
"\n"
"              size_download  The total amount of bytes that were\n"
"                             downloaded.\n"
"\n"
"              size_upload    The total amount of bytes that were\n"
"                             uploaded.\n"
"\n"
"              speed_download The average download speed that curl\n"
"                             measured for the complete download.\n"
"\n"
"              speed_upload   The average upload speed that curl\n"
"                             measured for the complete upload.\n"
"\n"
"       -x/--proxy <proxyhost[:port]>\n"
"              Use specified proxy. If the port number is not\n"
"              specified, it is assumed at port 1080.\n"
"\n"
"       -X/--request <command>\n"
"              (HTTP) Specifies a custom request to use when\n"
"              communicating with the HTTP server. The specified\n"
"              request will be used instead of the standard GET.\n"
"              Read the HTTP 1.1 specification for details and\n"
"              explanations.\n"
"\n"
"              (FTP) Specifies a custom FTP command to use instead\n"
"              of LIST when doing file lists with ftp.\n"
"\n"
"       -y/--speed-time