updates and fixes

Daniel Stenberg 2000-03-16 11:32:53 +00:00
parent 90030a49c7
commit 5992252b3d
8 changed files with 138 additions and 34 deletions


@@ -1,4 +1,10 @@
                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|
CONTRIBUTE
To Think About When Contributing Source Code

FAQ

@@ -1,4 +1,4 @@
Date: 15 March 2000

Frequently Asked Questions about Curl
@@ -29,3 +29,12 @@ Date: 19 November 1999
I am very interested in once and for all getting some kind of report or
README file from those who have used libcurl in a threaded environment,
since I haven't and I get this question more and more frequently!
4. Why doesn't my posting using -F work?

You can't simply use -F or -d at your choice. The web server that will
receive your post expects one of the formats. If the form you're trying to
"fake" sets the type to 'multipart/form-data', then and only then must you
use the -F type. In most common cases you should use -d, which causes a
posting with the type 'application/x-www-form-urlencoded'.
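As a rough illustration only (the URLs and field names below are made up,
not taken from any real form), a urlencoded post and a multipart formpost
could look something like:

        curl -d "name=daniel&phone=123456" http://www.example.com/guestbook.cgi

        curl -F "name=daniel" -F "picture=@dog.gif" http://www.example.com/upload.cgi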


@@ -1,7 +1,16 @@
                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|
FEATURES
Misc
 - full URL syntax
 - custom maximum download time
 - custom least download speed acceptable
 - custom output result after completion
 - multiple URLs
 - guesses protocol from host name unless specified
 - uses .netrc
@@ -21,6 +30,7 @@ HTTP
 - follow redirects
 - custom HTTP request
 - cookie get/send
 - understands the Netscape cookie file
 - custom headers (that can replace internally generated headers)
 - custom user-agent string
 - custom referer string

FILES

@@ -1,3 +1,4 @@
BUGS
CHANGES
CONTRIBUTE
FEATURES

INSTALL

@@ -6,6 +6,32 @@
How To Compile

Curl has been compiled and built on numerous different operating systems. The
way to proceed is mainly divided into two different ways: the unix way or the
windows way.

If you're using Windows (95, 98, NT) or OS/2, you should continue reading from
the Win32 header below. All other systems should be capable of being installed
as described under the UNIX header.
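As a quick sketch only (the authoritative steps are the ones given in the
UNIX section further down), the typical unix-style build boils down to
something like:

        ./configure
        make
        make install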
PORTS
=====
Just to show off, this is a probably incomplete list of known hardware and
operating systems that curl has been compiled for:
Sparc Solaris 2.4, 2.5, 2.5.1, 2.6, 7
Sparc SunOS 4.1.*
i386 Linux 1.3, 2.0, 2.2
MIPS IRIX
HP-PA HP-UX
PowerPC Mac OS X
- Ultrix
i386 OpenBSD
m68k OpenBSD
i386 Windows 95, 98, NT
i386 OS/2
m68k AmigaOS 3
UNIX
====

README

@@ -26,3 +26,19 @@ README
Sweden  -- ftp://ftp.sunet.se/pub/www/utilities/curl/
Germany -- ftp://ftp.fu-berlin.de/pub/unix/network/curl/
China   -- http://www.pshowing.com/curl/
To download the very latest source off the CVS server do this:

        cvs -d :pserver:cvs@curl.sourceforge.net/curl login

(just press enter when asked for password)

        cvs -d :pserver:cvs@curl.sourceforge.net/curl co .

(now, you'll get all the latest sources downloaded into your current
directory. Note that this does not create a directory named curl or
anything)

        cvs -d :pserver:cvs@curl.sourceforge.net/curl logout

(you're off the hook!)


@@ -122,33 +122,37 @@ UPLOADING
FTP

Upload all data on stdin to a specified ftp site:

        curl -t ftp://ftp.upload.com/myfile

Upload data from a specified file, login with user and password:

        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/myfile

Upload a local file to the remote site, and use the local file name remote
too:

        curl -T uploadfile -u user:passwd ftp://ftp.upload.com/
Upload a local file to get appended to the remote file using ftp:

        curl -T localfile -a ftp://ftp.upload.com/remotefile

NOTE: Curl does not support ftp upload through a proxy! The reason for this
is simply that proxies are seldom configured to allow this and that no
author has supplied code that makes it possible!
HTTP

Upload all data on stdin to a specified http site:

        curl -t http://www.upload.com/myfile

Note that the http server must've been configured to accept PUT before this
can be done successfully.

For other ways to do http data upload, see the POST section below.

VERBOSE / DEBUG
@@ -457,9 +461,9 @@ FTP and firewalls
HTTPS

Secure HTTP requires SSL libraries to be installed and used when curl is
built. If that is done, curl is capable of retrieving and posting documents
using the HTTPS protocol.

Example:
@@ -472,9 +476,10 @@ HTTPS
browsers (Netscape and MSIE both use the so called PKCS#12 format). If you
want curl to use the certificates you use with your (favourite) browser, you
may need to download/compile a converter that can convert your browser's
formatted certificates to PEM formatted ones. This kind of converter is
included in recent versions of OpenSSL, and for older versions Dr Stephen
N. Henson has written a patch for SSLeay that adds this functionality. You
can get his patch (that requires an SSLeay installation) from his site at:

        http://www.drh-consultancy.demon.co.uk/
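Purely as an illustration (this command is not from the manual and the file
names are made up), a PKCS#12 to PEM conversion with the OpenSSL pkcs12 tool
could look something like:

        openssl pkcs12 -in mycert.p12 -out mycert.pem -clcerts

The resulting PEM file is then the kind of certificate file the example
below expects.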
Example on how to automatically retrieve a document using a certificate with
@@ -601,6 +606,34 @@ ENVIRONMENT VARIABLES
The usage of the -x/--proxy flag overrides the environment variables.
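For illustration only (the proxy host names below are made up, and HTTP_PROXY
is assumed to be one of the variables listed earlier in this section), the
override could look like this in a Bourne-style shell:

        HTTP_PROXY=http://firewall.example.com:8080
        export HTTP_PROXY
        curl -x http://other-proxy.example.com:8080 http://curl.haxx.nu/

The transfer above goes through other-proxy.example.com, since -x takes
precedence over the environment variable.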
NETRC
Unix introduced the .netrc concept a long time ago. It is a way for a user
to specify name and password for commonly visited ftp sites in a file so
that you don't have to type them in each time you visit those sites. You
realize this is a big security risk if someone else gets hold of your
passwords, so therefore most unix programs won't read this file unless it is
only readable by yourself (curl doesn't care though).

Curl supports .netrc files if told so (using the -n/--netrc option). This is
not restricted to ftp only; curl can use it for all protocols where
authentication is used.
A very simple .netrc file could look something like:
machine curl.haxx.nu login iamdaniel password mysecret
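With an entry like the one above in place in ~/.netrc, a transfer that picks
up the name and password from it could (as a minimal sketch) look like:

        curl -n ftp://curl.haxx.nu/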
CUSTOM OUTPUT
To better allow script programmers to get to know about the progress of
curl, the -w/--write-out option was introduced. Using this, you can specify
what information from the previous transfer you want to extract.
To display the number of bytes downloaded together with some text and an
ending newline:

        curl -w 'We downloaded %{size_download} bytes\n' www.download.com
MAILING LIST

We have an open mailing list to discuss curl, its development and things

TODO

@@ -24,18 +24,17 @@ TODO
* HTTP Pipelining/persistent connections
 - We should introduce HTTP "pipelining". Curl could be able to request
   several HTTP documents in one connect. It would be the beginning for
   supporting more advanced functions in the future, like web site
   mirroring. This will require that the urlget() function supports several
   documents from a single HTTP server, which it doesn't today.

 - When curl supports fetching several documents from the same server using
   pipelining, I'd like to offer that function to the command line. Does
   anyone have a good idea how? The current way of specifying one URL with
   the output sent to stdout or a file gets in the way. Imagine a syntax
   that supports "additional documents from the same server" in a way
   similar to:
        curl <main URL> --more-doc <path> --more-doc <path>
@@ -52,12 +51,11 @@ TODO
And some friendly person's server source code is available at

        http://hopf.math.nwu.edu/digestauth/index.html
Then there's the Apache mod_digest source code too of course. It seems as
if Netscape doesn't support this, and not many servers do, although this is
a far better authentication method than the more common "Basic". Basic
sends the password in cleartext over the network, while this "Digest"
method uses a challenge-response protocol which increases security quite a
lot.
* Different FTP Upload Through Web Proxy

I don't know any web proxies that allow CONNECT through on port 21, but

@@ -88,3 +86,8 @@ TODO
(http://search.ietf.org/internet-drafts/draft-murray-auth-ftp-ssl-05.txt)

* HTTP POST resume using Range:
* Make curl capable of verifying the server's certificate when connecting
with HTTPS://.
* Make the timeout work as expected!