                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|

                                  History of Changes
Daniel (14 March 2001)
- I increased the interface number for libcurl as I've removed the low level
functions from the interface. I also took this opportunity to rename the
Curl_strequal function to curl_strequal and Curl_strnequal to curl_strnequal,
as they're public libcurl functions (even if they're still undocumented).
This means older programs will not be able to use the new libcurl as just a
drop-in replacement.
- Jörn Hartroth updated stuff for win32 compiles:
o config-win32.h was fixed for socklen_t
o lib/ssluse.c had a bad #endif placement
o lib/file.c was made to compile on win32 again
o lib/Makefile.m32 was updated with the new files
Daniel (13 March 2001)
- It only took an hour or so before Jörn Hartroth found a problem in the
chunked transfer-encoding. Given his fine example-site, I could easily spot
the problem and when I re-read the spec (the part I have pasted in the top
of the http_chunks.h file), I realized I had made my state-machine slightly
wrong and didn't expect/handle the trailing CRLF that comes after the data
in each chunk (and those extra two bytes sure feel wasted).
Had to modify test case 34 to match this as well.
Version 7.7-beta2
Daniel (13 March 2001)
- Added the policy stuff to the curl_easy_setopt man page for the two supported
policies.
- Implemented some support for the CURLOPT_CLOSEPOLICY option. The policies
CURLCLOSEPOLICY_LEAST_RECENTLY_USED and CURLCLOSEPOLICY_OLDEST are now
supported, and the "least recently used" is used as default if no policy
is chosen.
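A rough sketch of how an application could pick a policy (illustration only,
with a made-up URL and no error checking):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      /* when the connection cache is full, close the least recently used one */
      curl_easy_setopt(curl, CURLOPT_CLOSEPOLICY,
                       CURLCLOSEPOLICY_LEAST_RECENTLY_USED);
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }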
Daniel (12 March 2001)
- Added CURLOPT_RANDOM_FILE and CURLOPT_EGDSOCKET to libcurl for seeding the
SSL random engine. The random seeding support was also brought to the curl
client with the new options --random-file <file> and --egd-file <file>. I
need some people to really test this to know that they work as they are
supposed to. Remember
that libcurl now informs (if verbose is on) if the random seed is considered
weak (HTTPS connections).
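In libcurl terms, the seeding could look something like this (just a sketch;
the paths are made-up examples, not defaults):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      /* read random seed data from a file, or from an EGD socket */
      curl_easy_setopt(curl, CURLOPT_RANDOM_FILE, "/tmp/random-seed");
      curl_easy_setopt(curl, CURLOPT_EGDSOCKET, "/tmp/egd-pool");
      curl_easy_setopt(curl, CURLOPT_URL, "https://www.example.com/");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }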
- Made the chunked transfer-encoding engine detect badly formatted data lengths
and return an error if so (we can't possibly extract sensible data if this is
the case). Added a test case that detects this. Number 36. Now there are 60
test cases.
- Added 5 new libcurl options to curl/curl.h that can be used to control the
persistent connection support in libcurl. They're also documented (fairly
thoroughly) in the curl_easy_setopt.3 man page. Three of them are now
implemented, although not really tested at this point... Anyway, the newly
implemented options are named CURLOPT_MAXCONNECTS, CURLOPT_FRESH_CONNECT,
CURLOPT_FORBID_REUSE. The ones still left to write code for are:
CURLOPT_CLOSEPOLICY and its related option CURLOPT_CLOSEFUNCTION.
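A sketch of what using the implemented ones could look like (illustrative
only; made-up URL, no error checking):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
      curl_easy_setopt(curl, CURLOPT_MAXCONNECTS, 5L);   /* cache at most 5 connections */
      curl_easy_setopt(curl, CURLOPT_FRESH_CONNECT, 1L); /* don't re-use an old connection */
      curl_easy_setopt(curl, CURLOPT_FORBID_REUSE, 1L);  /* close this connection when done */
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }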
- Made curl (the actual command line tool) use the new libcurl 7.7 persistent
connection support by re-using the same curl handle for every specified file
transfer and after some more test case tweaking we have 100% test case OK.
I made some test cases return HTTP/1.0 now to make sure that works as well.
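The principle for an application is simply to keep the handle around between
transfers, roughly like this (made-up URLs, no error checking):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/one.html");
      curl_easy_perform(curl);   /* first transfer opens a connection */
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/two.html");
      curl_easy_perform(curl);   /* second transfer may re-use that connection */
      curl_easy_cleanup(curl);   /* closes whatever connections remain */
    }
    return 0;
  }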
- Had to add 'Connection: close' to the headers of a bunch of test cases so
that curl behaves "old-style" since the test http server doesn't do multiple
connections... Now I get 100% test case OK.
- The curl.haxx.se site, the main curl mailing list and my personal email are
all dead today due to power blackout in the area where the main servers are
located. Horrible.
- I've made persistence work over a squid HTTP proxy. I find it disturbing
that it uses headers that aren't present in any HTTP standard though
(Proxy-Connection:) and that makes me feel that I'm now on the edge of what
the standard actually defines. I need to get this code exercised on a lot
of different HTTP proxies before I feel safe.
Now I'm facing the problem that my test suite servers (both FTP and HTTP)
don't support persistent connections while libcurl now does them. I have
to fix the test servers to get all the test cases to run OK.
Daniel (8 March 2001)
- Guenole Bescon reported that libcurl did output errors to stderr even if
MUTE and NOPROGRESS were set. It turned out to be a bug that happens if
there's an error and no ERRORBUFFER is set. This is now corrected.
Version 7.7-beta1
Daniel (8 March 2001)
- "Transfer-Encoding: chunked" is no longer any trouble for libcurl. I've
added two source files and I've run some test downloads that look fine.
- HTTP HEAD works too, even on 1.1 servers.
Daniel (5 March 2001)
- The current 57 test cases now pass OK. This suggests that libcurl works
in the old style, with one connection per handle. The test suite doesn't
handle multiple connections yet so there are no test cases for this.
- I patched the telnet.c heavily to not use any global variables anymore. It
should make it a lot nicer library-wise.
- The file:// support was modified slightly to use the internal connect-first-
then-do approach.
Daniel (4 March 2001)
- More bugs erased.
Version 7.7-alpha2
Daniel (4 March 2001)
- Now, there's even a basic check that a re-used connection is still alive
before it is assumed so. A few first tests have proven that libcurl will
then re-connect instead of re-use the dead connection!
Daniel (2 March 2001)
- Now they work intermixed as well. Major coolness!
- More fiddling around, my 'tiny' client I have for testing purposes now has
proved to download both FTP and HTTP with persistent connections. They do
not work intermixed yet though.
Daniel (1 March 2001)
- Wilfredo Sanchez pointed out a minor spelling mistake in a man page and that
curl_slist_append() should take a const char * as second argument. It does
now.
Daniel (22 February 2001)
- The persistent connections start to look good for HTTP. On a subsequent
request, it seems that libcurl now can pick an already existing connection
if a suitable one exists, or it opens a new one.
- Douglas R. Horner mailed me corrections to the curl_formparse() man page
that I applied.
Daniel (20 February 2001)
- Added the docs/examples/win32sockets.c file for our windows friends.
- Linus Nielsen Feltzing provided brand new TELNET functionality and
improvements:
* Negotiation is now passive. Curl does not negotiate until the peer does.
* Possibility to set negotiation options on the command line, currently only
XDISPLOC, TTYPE and NEW_ENVIRON (called NEW_ENV).
* Now sends the USER environment variable if the -u switch is used.
* Use -t to set telnet options (Linus even updated the man page, awesome!)
- Haven't done changes this big to curl for a while. Moved around a lot of
struct fields and stuff to make multiple connections get connection specific
data in separate structs so that they can co-exist in a nice way. See the
mailing lists for discussions around how this is gonna be implemented. Docs
and more will follow.
Studied the HTTP RFC to find out better how persistent connections should
work. Seems cool enough.
Daniel (19 February 2001)
- Bob Schader brought me two files that help set up a MS VC++ libcurl project
easier. He also provided me with an up-to-date libcurl.def file.
- I moved a bunch of prototypes from the public <curl/curl.h> file to the
library private urldata.h. This is because of the upcoming changes. The
low level interface is no longer being planned to become reality.
Daniel (15 February 2001)
- CURLOPT_POST is not required anymore. Just setting the POST string with
CURLOPT_POSTFIELDS will switch on the HTTP POST. Most other things in
libcurl already work this way, i.e. they require only the parameter to
switch on a feature so I think this works well with the rest. Setting a NULL
string switches off the POST again.
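So a plain POST could now be set up like this (sketch only, made-up URL and
data, error checking skipped):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/form.cgi");
      /* setting the post data is enough to make it a POST, no CURLOPT_POST needed */
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=daniel&project=curl");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }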
- Excellent suggestions from Rich Gray, Rick Jones, Johan Nilsson and Bjorn
Reese helped me define how to incorporate persistent connections into
libcurl in a very smooth way. If done right, no changes may have to be made
to older programs and they will just start using persistent connections when
applicable!
Daniel (13 February 2001)
- Changed the word 'timeouted' to 'timed out' in two different error messages.
Suggested by Larry Fahnoe.
Version 7.6.1
Daniel (9 February 2001)
- Frank Reid and Cain Hopwood provided information and research around a HTTPS
PUT/upload problem we seem to have. No solution found yet.
Daniel (8 February 2001)
- An interesting discussion is how to specify an empty password without having
curl ask for it interactively? The current implementation takes an empty
password as a request for a password prompt. However, I still want to
support a blank user field. Thus, today if you enter "-u :" (without user
and password) curl will prompt for the password. Tricky. How would you
specify you want the prompt otherwise?
- Made the netrc parse result possible to use for other protocols than FTP and
HTTP (such as the upcoming TELNET fixes).
- The previously mentioned "MSVC++ problems" turned out to be a non-issue.
- Added a HTTP file upload code example in the docs/examples/ section on
request.
- Adjusted the FTP response fix slightly.
Version 7.6.1-pre3
Daniel (7 February 2001)
- SM found a flaw in the response reading function for FTP that could make
libcurl not get out of the loop properly when it should, if libcurl got -1
returned when reading the socket.
- I found a similar mistake in http.c when using a proxy and reading the
results from the proxy connection.
Daniel (6 February 2001)
- A friendly person named "SM" (nntp at iname.com) pointed out that the VC
makefile in src/ needed the libpath set for the debug build to work.
- Daniel Gehriger stepped in to assist with the VC++ stuff Robert Weaver
brought up yesterday.
Daniel (5 February 2001)
- Jun-ichiro itojun Hagino brought a big patch that brings IPv6-awareness to
a bunch of different areas within libcurl.
- Robert Weaver told me about the problems the MS VC++ 6.0 compiler has with
the 'static' keyword on a number of libcurl functions. I might need to add a
patch that redefines static when libcurl is compiled with that compiler.
How do I know when VC++ compiles, anyone?
Daniel (4 February 2001)
- curl_easy_getinfo() was extended with two new options:
CURLINFO_CONTENT_LENGTH_DOWNLOAD and CURLINFO_CONTENT_LENGTH_UPLOAD. They
return the full assumed content length of the transfer in the given
direction. The CURLINFO_CONTENT_LENGTH_DOWNLOAD will be the Content-Length:
size of a HTTP download. Added descriptions to the man page as well. This
was done after discussions with Bob Schader.
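Used roughly like this (a sketch with a made-up URL; it assumes the value
comes back as a double, and skips error checking):
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      double length;
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/big.file");
      curl_easy_perform(curl);
      /* the assumed download size, typically taken from the Content-Length: header */
      if(curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD, &length) == CURLE_OK)
        printf("content length: %.0f\n", length);
      curl_easy_cleanup(curl);
    }
    return 0;
  }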
Daniel (3 February 2001)
- Ingo Ralf Blum provided another fix that makes curl build under the more
recent cygwin installations. It seems they've changed the preset defines to
not include WIN32 anymore.
Version 7.6.1-pre2
Daniel (31 January 2001)
- Curl_read() and curl_read() now return a ssize_t for the size, as it had to
be able to return -1. The telnet support crashed due to this and there was a
possibility of weird behaviour all over. Linus Nielsen Feltzing helped me
find this.
- Added a configure.in check for a working getaddrinfo() if IPv6 is requested.
I also made the configure script feature --enable-debug which sets a couple
of compiler options when used. It assumes gcc.
Daniel (30 January 2001)
- I finally took a stab at the long-term FIXME item I've had on myself, and
now libcurl will properly work when doing a HTTP range-request that follows
a Location:. Previously that would make libcurl fail saying that the server
doesn't seem to support range requests.
Daniel (29 January 2001)
- I added a test case for the HTTP PUT resume thing (test case 33).
Version 7.6.1-pre1
Daniel (29 January 2001)
- Yet another Content-Range change. Ok now? Bob Schader checks from his end
and it works for him.
Daniel (27 January 2001)
- So the HTTP PUT resume fix wasn't good. There should apparently be a
Content-Range header when resuming a PUT.
- I noticed I broke the download-check that verifies that a resumed HTTP
download is actually resumed. It got broken because of my new 'httpreq' field
in the main curl struct. I should get slapped. I added a test case for
this now, so I won't be able to ruin this again without noticing.
- Added a test case for content-length verifying when downloading HTTP.
- Made the progress meter title say if the transfer is being resumed. It
makes the output slightly better for resumes.
- When dealing with Location: and HTTP return codes, libcurl will now attempt
to follow the spirit of RFC2616 better. It means that when POSTing to a
URL that is being followed to a second place, the standard will judge on
what to do. All HTTP codes except 303 and 305 will cause curl to make a
second POST operation. 303 will make a GET and 305 is not yet supported.
I also wrote two test cases for this POST/GET/Location stuff.
Version 7.6
Daniel (26 January 2001)
- Lots of mails back and forth with Bob Schader finally made me add a small
piece of code in the HTTP engine so that HTTP upload resume works. You can
now do an operation like 'curl -T file -C <offset> <URL>' and curl will PUT
the ending part of the file starting at the given offset to the specified URL.
Version 7.6-pre4
Daniel (25 January 2001)
- I took hold of Rick Jones' question why we don't use recv() and send() for
reading/writing to the sockets and I've now modified the sread() and
swrite() macros to use them instead. If nothing else, they could be tested
in the next beta-round coming right up.
- Jeff Morrow found a problem with libcurl's usage of SSL_read() and supplied
his research results in how to fix this. It turns out we have to invoke the
function several times in some cases. The same goes for the SSL_write().
I made some rather drastic changes all over libcurl to make all writes and
reads get done on one single place so that this repeated-attempts thing
would only have to be implemented at one point.
- Rick Jones spotted that the 'total time' counter really didn't measure the
total time very accurately at subsecond levels.
- Johan Nilsson pointed out the need to more clearly specify that the timeout
value you set for a download is for the *entire* download. There's currently
no option available that sets a timeout for the connection phase only.
Daniel (24 January 2001)
- Ingo Ralf Blum submitted a series of patches required to get curl to compile
properly with cygwin.
- Robert Weaver posted a fix for the win32 section of the curl_getenv() code
that corrected a potential memory leak.
- Added comments in a few files in a sudden attempt to make the sources more
easy to read and understand!
Daniel (23 January 2001)
- Added simple IPv6 detection in the configure script and made the version
string add 'ipv6' to the enable section in that case. ENABLE_IPV6 will be
set if curl is compiled with IPv6 support enabled.
- Added a parser for IPv6-style specified IP-addresses in a URL. Thus, when
IPv6 gets enabled soon, we can use URLs like '[0::1]:80'...
- Made the URL globbing in the client possible to fail silently if there's an
error in the globbing. It makes it almost intuitive, so when you don't
follow the syntax rules, globbing is simply switched off and the raw string
is used instead.
I still think we'll get problems with IPv6-style IP-addresses when we *want*
globbing on parts of the URL as the initial part of the URL will for sure
seriously confuse the globber.
Daniel (22 January 2001)
- Björn Stenberg supplied a progress meter patch that makes it look better even
during slow starts. Previously it made some silly assumptions...
- Added two FTP tests for -Q and -Q - stuff since it was being discussed on
the mailing list. Had to correct the ftpserver.pl too as it bugged slightly.
Daniel (19 January 2001)
- Made the Location: parsers deal with any-length URLs. Thus I removed the last
code that restricts the length of URLs that curl supports.
- Added a --globoff test case (#28) and it quickly identified a memory problem
in src/main.c that I took care of.
Version 7.6-pre3
Daniel (17 January 2001)
- Made the two former files lib/download.c and lib/highlevel.c become the new
lib/transfer.c which makes more sense. I also did the rename from Transfer()
to Curl_Transfer() in the other source files that use the transfer function
in the spirit of using Curl_ prefix for library-scoped global symbols.
Daniel (11 January 2001)
- Added -g/--globoff that switches OFF the URL globbing and thus enables {}[]
letters to be part of the URL. Do note that RFC2396 section 2.4.3 explicitly
says that these letters should be escaped. This was posted as a feature request by
Jorge Gutierrez and as a bug by Terry.
- Short options to curl that require parameters can now be specified without
having the option and its parameter space-separated. -ofile works as well as
-o file. -m20 is equal to -m 20. Do note that this goes for single-letter
options only, verbose --long-style options still must be separated with
space from their parameters.
Daniel (8 January 2001)
- Francis Dagenais reported that the SCO compiler still fails when compiling
curl due to that getpass_r() prototype. I've now put it around #ifndef
HAVE_GETPASS_R in an attempt to please the SCO systems.
- Made some minor corrections to get the client to cleanup properly and I made
the separator work again when getting multiple globbed URLs to stdout.
- Worked with Loic Dachary to get the make dist and make distcheck work
correctly. The 'maketgz' script is now using the automake generated 'make
dist' when creating release archives. Loic successfully made 'make rpms'
automatically build RPMs!
Loic Dachary (6 January 2001)
- Automated generation of rpm packages, no need to be root.
- make distcheck generates a proper distribution (EXTRA_DIST
in all Makefile.am modified to match FILES).
Daniel (5 January 2001)
- Huge client-side hack: now multiple URLs are supported. Any number of URLs
can be specified on the command line, and they'll all be downloaded. There
must be a corresponding -o or -O for each URL or the data will be written to
stdout. This needs more testing, time to release a 7.6-pre package.
- The krb4 support was broken in the release. Fixed now.
- Huge internal symbol rename operation. All non-static but still lib-internal
symbols should now be prefixed with 'Curl_' to prevent collisions with other
libs. All public symbols should be prefixed with 'curl_' and the rest should
be static and thus invisible to the outside world. I updated the INTERNALS
document to say this as well.
Version 7.5.2
Daniel (4 January 2001)
- As Kevin P Roth suggested, I've added text to the man page for every command
line option and what happens when you specify that option more than
once. That hasn't been exactly crystal clear before.
- Made the configure script possible to run from outside the source-tree. For
odd reasons I can't build curl properly outside though. It has to do with
curl's dependencies on libcurl...
- Cut off all older (dated 1999 and earlier) CHANGES entries from this file.
The older piece is named CHANGES.0 and is added to the CVS repository in
case anyone would need it.
- I added another file 'CVS-INFO' to the CVS. It contains information about
files in the CVS that aren't included in release archives and how to build
curl when you get the sources off CVS.
- Updated CONTRIBUTE and FAQ due to the new license.
Daniel (3 January 2001)
- Renamed README.libcurl to LIBCURL
- Changed headers in all sources files to the new dual license concept of
curl: use the MIT/X derivate license *or* MPL. The LEGAL file was updated
accordingly and the MPL 1.1 and MIT/X derivate licenses are now part of the
release archive.
Daniel (30 December 2000)
- Made all FTP commands get sent with the trailing CRLF in one single write()
as splitting them up seems to confuse at least some firewalls (FW-1 being
one major).
Daniel (19 December 2000)
- Added file descriptor and FILE handle leak detection to the memdebug system
and thus I found and removed a file descriptor leakage in the ftp parts
that happened when you did PORTed downloads.
- Added an include <stdio.h> in <curl/curl.h> since it uses FILE *.
Daniel (12 December 2000)
- Multiple URL downloads with -O was still bugging. Not anymore I think or
hope, or at least I've tried... :-O
- Francois Petitjean fixed another -O problem
Version 7.5.1
Daniel (11 December 2000)
- Cleaned up a few of the makefiles to use unix-style newlines only. As Kevin
P Roth found out, at least one CVS client behaved wrongly when it found
different newline conventions within the same file.
- Albert Chin-A-Young corrected the LDFLAGS use in the configure script for
the SSL stuff.
Daniel (6 December 2000)
- Massimo Squillace correctly described how libcurl could use session ids when
doing SSL connections.
- James Griffiths found out that curl would crash if the file you specify with
-o is shorter than the URL! This took some hours to fully hunt down, but it
is fixed now.
Daniel (5 December 2000)
- Jaepil Kim sent us makefiles that build curl using the free windows borland
compiler. The root makefile now accepts 'make borland' to build curl with
that compiler.
- Stefan Radman pointed out that the test makefiles didn't use the PERL
variable that the configure scripts figure out. Actually, you still need
perl in the path for the test suite to run ok.
- Rich Gray found numerous portability problems:
* The SCO compiler got an error on the getpass_r() prototype in getpass.h
since the curl one differed from the SCO one
* The HPUX compiler got an error because of how curl did the sigaction
stuff and used a define HPUX doesn't have (or need).
* A few more problems remain to be researched.
- Paul Harrington experienced a core dump using https. Not much details yet.
Daniel (4 December 2000)
- Jörn Hartroth fixed a problem with multiple URLs and -o/-O.
Version 7.5
Daniel (1 December 2000)
- Craig Davison gave us his updates on the VC++ makefiles, so now curl should
build fine with the Microsoft compiler on windows too.
- Fixed the libcurl versioning so that we don't ruin old programs when
releasing new shared library interfaces.
Daniel (30 November 2000)
- Renamed docs/README.curl to docs/MANUAL to better reflect what the document
actually contains.
Daniel (29 November 2000)
- I removed a bunch of '#if 0' sections from the code. They only make things
harder to follow. After all, we do have all older versions in the CVS.
Version 7.5-pre5
Daniel (28 November 2000)
- I filled in more error codes in the man page error code list that had been
lagging.
- James Griffiths mailed me a fine patch that introduces the CURLOPT_MAXREDIRS
libcurl option. When used, it'll prevent location following more than the
set number of times. It is useful to break out of endless redirect-loops.
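A small usage sketch (made-up URL; CURLOPT_FOLLOWLOCATION is what enables
the following in the first place):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/redirected");
      curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
      curl_easy_setopt(curl, CURLOPT_MAXREDIRS, 5L); /* give up after five Location: hops */
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }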
Daniel (27 November 2000)
- Added two test cases for file://.
Daniel (22 November 2000)
- Added the libcurl CURLOPT_FILETIME setopt, when set it tries to get the
modified time of the remote document. This is a special option since it
involves an extra set of commands on FTP servers. (Using the MDTM command
which is not in the RFC959)
curl_easy_getinfo() got a corresponding CURLINFO_FILETIME to get the time
after a transfer. It'll return a zero if CURLOPT_FILETIME wasn't used or if
the time wasn't possible to get.
--head/-I used on a FTP server will now present a 'Last-Modified:' header
if curl could get the time of the specified file.
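In code, something like this (sketch only; made-up URL, assumes the time
comes back as a long, no error checking):
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      long filetime;
      curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
      curl_easy_setopt(curl, CURLOPT_FILETIME, 1L); /* ask for the remote modified time */
      curl_easy_perform(curl);
      if(curl_easy_getinfo(curl, CURLINFO_FILETIME, &filetime) == CURLE_OK && filetime)
        printf("remote file time: %ld\n", filetime); /* zero means it wasn't available */
      curl_easy_cleanup(curl);
    }
    return 0;
  }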
- Added the option '--cacert [file]' to curl, which allows a specified PEM
file to be used to verify the peer's certificate when doing HTTPS
connections. This has been requested, rather recently by Hulka Bohuslav but
others have asked for it before as well.
Daniel (21 November 2000)
- Numerous fixes the test suite has brought into the daylight:
* curl_unescape() could return a too long string
* on ftp transfer failures, there could be memory leaks
* ftp CWD could use bad directory names
* memdebug now uses the mprintf() routines for better portability
* free(NULL) removed when doing resumed transfers
- Added a bunch of test cases for FTP.
- General cleanups to make less warnings with gcc -Wall -pedantic.
- I made the tests/ftpserver.pl work with the most commonly used ftp
operations. PORT, PASV, RETR, STOR, LIST, SIZE, USER, PASS all work now. Now
all I have to do is integrate the ftp server doings in the runtests.pl
script so that ftp tests can be run the same way http tests already run.
Daniel (20 November 2000)
- Made libcurl capable of dealing with any-length URLs. The former limit of
4096 bytes was a bit annoying when people wanted to use curl to really make
life tough on a web server. Now, the command line limit is the most annoying
but that can be circumvented by using a config file.
NOTE: there is still a 4096-byte limit on URLs extracted from Location:
headers.
- Corrected the spelling of 'resolve' in two error messages.
- Alexander Kourakos posted a bug report and a patch that corrected it! It
turned out that lynx and wget support lowercase environment variable names
where curl only looked for the uppercase versions. Now curl will use the
lowercase versions if they exist, but if they don't, it'll use the uppercase
versions.
Daniel (17 November 2000)
- curl_formfree() was added. How come no one missed that one before? I ran the
test suite with the malloc debug enabled and got lots of "nice" warnings on
memory leaks. The most serious one was this. There were also leaks in the
cookie handling, and a few errors when curl failed to connect and similar
things. More test cases were added to cover up and to verify that these
problems have been removed.
- Mucho updated config file parser (I'm dead tired of all the bug reports and
weird behaviour I get on the former one). It works slightly differently now,
although I doubt many people will notice the differences. The main
difference being that if you use options that require parameters, they must
both be specified on the same line. With this new parser, you can also
specify long options without '--' and you may separate options and
parameters with : or =. A config file line could now look like:
user-agent = "foobar and something"
Parameters within quotes may contain spaces. Without quotes, they're
expected to be a single non-space word.
Had to patch the command line argument parser a little to make this work.
- Added --url as an option to allow the URL to be specified this way. It makes
way nicer config files. The previous way of specifying URLs in the config
file doesn't work anymore.
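A config file could then look something like this (made-up values, following
the syntax described above):
  url = "http://www.example.com/"
  user-agent = "foobar and something"
  -o saved.html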
Daniel (15 November 2000)
- Using certain characters in usernames or passwords for HTTP authentication
failed. This was due to the mprintf() that had a silly check for letters,
and if they weren't isprint() they weren't output "as-is". This caused
passwords and usernames using '§' (for example) to fail.
Version 7.4.2
Daniel (15 November 2000)
- 'tests/runtests.pl' now sorts the test cases properly when 'all' is used.
Daniel (14 November 2000)
- I fell over the draft-ietf-ftpext-mlst-12.txt Internet Draft titled
"Extensions to FTP" that contains a defined way how the ftp command SIZE
could be assumed to work.
- Laurent Papier posted a bug report about using "-C -" and FTP uploading a
file that isn't present on the server. The server might then return a 550 and
curl will fail. Should it instead, as Laurent Papier suggests, start
uploading from the beginning as a normal upload?
Daniel (13 November 2000)
- Fixed a crash with the followlocation counter.
- While writing test cases for the test suite, I discovered an old limitation
that prevented -o and -T to be used at the same time. I removed this
immediately as this has no relevance in the current libcurl.
- Chris Faherty fixed a free-twice problem in lib/file.c
- I fixed the perl http server problem in the test suite.
Version 7.4.2 pre4
Daniel (10 November 2000)
- I've (finally) started working on the curl test suite. It is in the new
tests/ directory. It requires sh and perl. There's a TCP server in perl and
most of the other stuff runs from a pretty simple shell script.
I've only made four test cases so far, but it proves the system can work.
- Laurent Papier noticed that curl didn't set TYPE when doing --head checks
for sizes on FTP servers. Some servers seem to return different sizes
depending on whether ASCII or BINARY is used!
- Laurent Papier detected that if you appended a FTP upload and everything was
already uploaded, curl would hang.
- Angus Mackay's getpass_r() in lib/getpass.c is now compliant with the
getpass_r() function it seems some systems actually have.
- Venkataramana Mokkapati detected a bug in the cookie parser and corrected
it. If the cookie was set for the full host name (domain=full.host.com),
the cookie was never sent back because of a faulty length comparison between
the set domain length and the current host name.
Daniel (9 November 2000)
- Added a configure check for gethostbyname in -lsocket (OS/2 seems to need
it). Added a check for RSAglue/rsaref for the cases where libcrypto is found
but libssl isn't. I haven't verified this fix yet though, as I have no
system that requires those libs to build.
Version 7.4.2 pre3
Daniel (7 November 2000)
- Removed perror() outputs from getpass.c. Angus Mackay also agreed to a
slightly modified license of the getpass.c file as the prototype was changed.
Daniel (6 November 2000)
- Added possibility to set a password callback to use instead of the built-in.
They're controlled with curl_easy_setopt() of course, the tags are
CURLOPT_PASSWDFUNCTION and CURLOPT_PASSWDDATA.
- Used T. Bharath's thinking and fixed the timers that showed terribly wrong
times when location: headers were followed.
- Emmanuel Tychon discovered that curl didn't really like user names only in
the URL. I corrected this and I also fixed the long-standing problem
with URL encoded user names and passwords in the URLs. They should work now.
Daniel (2 November 2000)
- When I added --interface, the new error code that was added with it was
inserted in the wrong place and thus all error codes from 35 and upwards got
increased one step. This is now corrected, we're back at the previous
numbers. All new exit codes should be added at the end.
Daniel (1 November 2000)
- Added a check for signal() in the configure script so that if sigaction()
isn't present, we can use signal() instead.
- I'm having a license discussion going on privately. The issue is yet again
GPL-licensed programs that have problems with MPL. I am leaning towards
making a kind of dual-license that will solve this once and for all...
Daniel (31 October 2000)
- Added the packages/ directory. I intend to let this contain some docs and
templates on how to generate custom-format packages for various platforms.
I've now removed the RPM related curl.spec files from the archive root.
Daniel (30 October 2000)
- T. Bharath brought a set of patches that bring new functionality to
curl_easy_getinfo() and curl_easy_setopt(). Now you can request peer
certificate verification with the *setopt() CURLOPT_SSL_VERIFYPEER option
and then use the CURLOPT_CAINFO to set the certificate to verify the remote
peer against. After such an operation with a verification request, the
*_getinfo() option CURLINFO_SSL_VERIFYRESULT will return information about
whether the verification succeeded or not.
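Put together, it could be used roughly like this (sketch only; the CA bundle
file name is made up, error checking skipped):
  #include <stdio.h>
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      long verifyresult;
      curl_easy_setopt(curl, CURLOPT_URL, "https://www.example.com/");
      curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);      /* request verification */
      curl_easy_setopt(curl, CURLOPT_CAINFO, "ca-bundle.pem"); /* certificate(s) to verify against */
      curl_easy_perform(curl);
      if(curl_easy_getinfo(curl, CURLINFO_SSL_VERIFYRESULT, &verifyresult) == CURLE_OK)
        printf("verify result: %ld\n", verifyresult);
      curl_easy_cleanup(curl);
    }
    return 0;
  }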
Daniel (27 October 2000)
- Georg Horn brought us a splendid patch that solves the long-standing
annoying problem with timeouts that made curl exit with silly exit codes
(which has been commented out lately). This solution is sigaction() based and
of course then only works for unixes (and only those unixes that actually
have the sigaction() function).
Daniel (26 October 2000)
- Björn Stenberg supplied a patch that fixed the flaw mentioned by Kevin Roth
that made the password get echoed when prompted for interactively. The
getpass() function (now known as my_getpass()) was also fixed to not use any
static buffers. This also means we cannot use the "standard" getpass()
function even for those systems that have it, since it isn't thread-safe.
- Kevin Roth found out that if you'd write a config file with '-v url', the
url would not be used as "default URL" as documented, although if you wrote
it 'url -v' it worked! This has been corrected now.
- Kevin Roth's idea of using multiple -d options on the same command line was
just brilliant, and I couldn't really think of any reason why we shouldn't
support it! The append function always appends '&' and then the new -d
chunk. This enables constructs like the following:
curl -d name=daniel -d age=unknown foobarsite.com
Daniel (24 October 2000)
- I fixed the lib/memdebug.c source so that it compiles on Linux and other
systems. It will be useful one day when someone else but me wants to run the
memory debugging system.
Daniel (23 October 2000)
- I modified the maketgz and configure scripts, so that the configure script
will fetch the version number from the include/curl/curl.h header files, and
then the maketgz doesn't have to rebuild the configure script when I build
release-archives.
- Björn Stenberg and Linus Nielsen correctly pointed out that curl was silly
enough to not allow @-letters in passwords when they were specified with the
-u or -U flags (CURLOPT_USERPWD and CURLOPT_PROXYUSERPWD). This also
suggests that curl probably should url-decode the password piece of an URL
so that you could pass an encoded @-letter there...
Daniel (20 October 2000)
- Yet another http server barfed on curl's requests that always include the
port number in the Host: header. I now only include the port number if it
isn't the default (80 for HTTP, 443 for HTTPS). www.perl.com turned out to
run one of those nasty servers.
- The PHP4 module for curl had problems with referer that seems to have been
corrected just yesterday. (Sterling Hughes of the PHP team confirmed this)
Daniel (17 October 2000)
- Vladimir Oblomov reported that the -Y and -y options didn't work. They
didn't work for me either. This once again proves we should have that test
suite...
- I finally changed the error message libcurl returns if you try a https://
URL when the library wasn't built with SSL enabled. It will now return this
error:
"libcurl was built with SSL disabled, https: not supported!"
I really hope it will make it a bit clearer to users where the actual
problem lies.
Version 7.4.1
Daniel (16 October 2000)
- I forgot to remove some of the malloc debug defines from the makefiles in
the release archive (of course).
Version 7.4
Daniel (16 October 2000)
- The buffer overflow mentioned below was posted to bugtraq on Friday 13th.
Daniel (12 October 2000)
- Colin Robert Phipps elegantly corrected a buffer overflow. It could be used
by an evil ftp server to crash curl. I took the opportunity of replacing a
few other sprintf()s into snprintf()s as well.
Daniel (11 October 2000)
- Found some more memory leaks. This new simple memory debugger has turned out
really useful!
Version 7.4 pre6
Daniel (9 October 2000)
- Florian Koenig pointed out that the bool typedef in the curl/curl.h include
file was breaking PHP 4.0.3 compiling. The bool typedef is not used in the
public interface and was wrongly inserted in that header file.
- Jörg Hartroth corrected a minor memory leak in the src/urlglob.c stuff. It
didn't harm anyone since the memory is free()ed on exit anyway.
- Corrected the src/main.c. We use the _MPRINTF_REPLACE #define to use our
libcurl-printf() functions. This gives us snprintf() et al on all
platforms. I converted the allocated useragent string to one that uses a
local buffer.
- I've set an #if 0 section around the Content-Transfer-Encoding header
generated in lib/formdata.c. This will hopefully make curl do more
PHP-friendly multi-part posts.
Version 7.4 pre5
Daniel (9 October 2000)
- Nico Baggus found out that curl's ability to force an ASCII download when
using FTP was no longer working! I corrected this. This problem was probably
introduced when I redesigned libcurl for version 7.
- Georg Horn provided a source example that proved a memory leak in libcurl.
I added simple memory debugging facilities and now we can make libcurl log
all memory fiddling functions. An additional perl script is used to analyze
the output logfile and to match malloc()s with free()s etc. The memory leak
Georg found turned out to be the main cookie struct that cookie_cleanup()
didn't free! The perl script is named memanalyze.pl and it is available in
the CVS repository, not in the release archive.
Daniel (8 October 2000)
- Georg Horn found a GetHost() problem. It turned out it never assigned the
pointer in the third argument properly! This could cause a crash, or at best
a memory leak!
Version 7.4 pre4
Daniel (6 October 2000)
- Is the -F post following the RFC 1867 spec? We had this discussion on the
mailing list since it appears curl can't post -F form posts to a PHP
receiver... I've been in touch with the PHP developers about this.
- Domenico Andreoli found out that the long option '--proxy' wasn't working
anymore! The option parser got confused when I added the --proxytunnel for
7.3. This was indeed a very old flaw that hasn't turned up until now...
- Jörn Hartroth provided patches, updated makefiles and two new files for DLL
stuff on win32. He also pointed out that lib source files were compiled with
-I../src which isn't only wrong but plain stupid!
- Troels Walsted Hansen fixed a problem with HTTP resume. Curl previously used
a local variable badly, which could lead to crashes.
Version 7.4 pre3
Daniel (4 October 2000)
- More docs written. The curl_easy_getinfo.3 man page is now pretty accurate,
as is the -w section in curl.1. I added two options to enable the user to
get information about the received headers' size and the size of the HTTP
request. T. Bharath requested them.
Daniel (3 October 2000)
- Corrected a severe free() before use in the new add_buffer_send()! ;-)
Version 7.4 pre2
Daniel (3 October 2000)
- Jason S. Priebe sent me patches that changed the way curl issues HTTP
requests. The entire request is now issued in one single shot. It didn't do
this previously, and it has turned out that since the common browsers do it
this way, some sites have turned out to work with browsers but not with
curl! Although this is not a client-side problem, we want to be able to
fully emulate browsers, and thus we have now adjusted the networking layer
to appear slightly more like a browser. I adjusted Jason's patch, the faults
are probably mine.
Daniel (2 October 2000)
- Anyone who ever uploaded data with curl on a slow link has noticed that the
progress meter is updated very infrequently. That is due to the large buffer
size curl is using. It reads 50Kb and sends it, updates the progress meter
and loops. 50Kb is very much on a slow link, although it is pretty neat to
use on a fast one.
I've now made an adjustment that makes curl use a 2Kb buffer for uploads to
start with. If curl's average upload speed is faster than buffer size bytes
per second, curl will increase the used buffer size up to max 50Kb. It
should make the progress meter work better.
Version 7.4 pre1
Daniel (29 September 2000)
- Ripped out the -w stuff from the library and put in the curl tool. It gets
all the relevant info from the library using the new curl_easy_getinfo()
function.
- brad at openbsd.org mailed me a patch that corrected my kerberos mistake and
removed a compiler warning from hostip.c that OpenBSD people get.
Daniel (28 September 2000)
- Of course (I should probably get punished somehow) I didn't properly correct
the #include lines for the base64 stuff in the kerberos sources in the just
released 7.3 package. They still include the *_krb.h files! Now, the error
is sooo very easy to spot and fix so I won't bother with a quick bug fix
release. I'll post a patch whenever one is needed instead. It'll be
available in the CVS in a few minutes anyway.
Version 7.3
Daniel (28 September 2000)
- Removed the base64_krb.[ch] files. They've now replaced the former
base64.[ch] files.
Daniel (26 September 2000)
- Updated some docs.
- I changed the OpenSSL fix to work with older versions as well. The posted
patch was only working with 0.9.6 and no older ones.
Version 7.3-pre8
Daniel (25 September 2000)
- Erdmut Pfeifer informed us that curl didn't build with OpenSSL 0.9.6 and
showed us what needed to get patched in order to make it build properly
again.
- Dirk Kruschewski found a bug in the cookie parser. I made an alternative
approach to the solution Dirk himself suggested. The bug made a cookie
header that didn't end with a trailing semicolon not get parsed.
- I've marked -c and -t deprecated now. If you use any of them, curl will tell
you to use "-C -" or "-T -" instead. I don't think occupying two letters for
nearly identical functions is good use. Also, -T - kind of follows the curl
tradition of using - for stdin where a file name is expected.
Daniel (23 September 2000)
- Martin Hedenfalk provided the patch that finally made the krb4 ftp upload
work!
Daniel (21 September 2000)
- The kerberos code is not quite thread-safe yet. There are a few more globals
that need to be taken care of. Let's get the upload working first!
Daniel (20 September 2000)
- Richard Prescott solved another name lookup buffer size problem. I took this
opportunity to rewrite the GetHost() function. With these large buffer
sizes, I think keeping them as local arrays quickly turns ugly. I now use
malloc() to get the buffer memory. Thanks to this, I can now realloc() to a
larger buffer on demand (errno == ERANGE) should a solution like that
become necessary. I still want to avoid that kind of nastiness.
- Tried to compile and run curl on Linux for alpha and FreeBSD for alpha. Went
as smoothly as it could.
- Added a docs/examples directory with two tiny example sources that show how
to use libcurl. I hope users will supply me with more useful examples
further on.
- Applied a patch by Jörn Hartroth to no longer use the word 'interface' in the
config struct in the src/main.c file since certain compilers have that word
"reserved". I figure that is some kind of C++ disease.
- Updated the curl.1 man page with --interface and --krb4.
- Modified the base64Encode() function to work like the kerberos one, so that
I could remove the use of that. There is no need for *two* base64 encoding
functions! ;-)
Version 7.3pre5
Daniel (19 September 2000)
- The kerberos4-layer source code that is much "influenced" by the original
krb4 source code, through yafc into curl, was using quite a lot of global
variables. libcurl can't work properly with globals like that, which is why I
had to
clean up almost every function in the new security.c to make them use
connection specific variables instead of the globals. I just hope I didn't
destroy anything now... :-) configure updated, version string now reflects
krb4 built-in. It almost works now. Only uploads are still being naughty.
Version 7.3pre3
Daniel (18 September 2000)
- Martin Hedenfalk supplied a major patch that introduces krb4-ftp support to
curl. Martin is the primary author of the ftp client named yafc and he did
not hesitate to help us implement this when I asked him. Many and sincere
thanks to a splendid effort. It didn't even take many hours!
- Stephen Kick supplied a big patch that introduces the --interface flag to
the curl tool and CURLOPT_INTERFACE for libcurl. It allows you to specify an
outgoing interface to use for your request. This may not work on all
platforms. This needs testing.
- Richard Prescott noticed that curl on Tru64 unix could core dump if the
name didn't resolve properly. This was due to the GetHost() function not
returning an error even though it failed on some platforms!
Daniel (15 September 2000)
- Updated all sorts of documents in regards to the new proxytunnel support.
Version 7.3pre2
Daniel (15 September 2000)
- Kai-Uwe Rommel pointed out a problem in the httpproxytunnel stuff for ftp.
Adjusted it. Added better info message when setting up the tunnel and the
pasv message when doing the second connect.
Version 7.3pre1
Daniel (15 September 2000)
- libcurl now allows "httpproxytunnel" to an arbitrary host and port name. The
second connection on ftp needed that.
- TheArtOfHTTPScripting was corrected all over. I both type and spell really
bad at times!
Daniel (14 September 2000)
- -p/--proxytunnel was added to 'curl'. It uses the new
CURLOPT_HTTPPROXYTUNNEL libcurl option that allows "any" protocol to tunnel
through the specified http proxy. At the moment, this should work with ftp.
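From a program, the equivalent could look roughly like this (made-up proxy
and URL; no error checking):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "ftp://ftp.example.com/file.txt");
      curl_easy_setopt(curl, CURLOPT_PROXY, "proxy.example.com:8080");
      curl_easy_setopt(curl, CURLOPT_HTTPPROXYTUNNEL, 1L); /* tunnel through the proxy */
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }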
Daniel (13 September 2000)
- Jochen Schaeuble found that file:// didn't work as expected. Corrected this
and mailed the patch to the mailing list.
Daniel (7 September 2000)
- I changed the #define T() in curl.h since it turned out it wasn't really
a good symbol to use (when you compiled PHP with curl as a module, that
define collided with some IMAP define or something). This was posted to the
PHP bug tracker.
- I added extern "C" stuff in two header files to better allow libcurl usage
in C++ sources. Discussions on the libcurl list with Danny Horswell led to
this.
Version 7.2.1
Daniel (31 August 2000)
- Albert Chin-A-Young fixed the configure script *again* and now it seems to
detect Linux name resolving properly! (heard that before?)
- Troels Walsted Hansen pointed out that downloading a file containing the
letter '+' from an ftp server didn't work. It did work from HTTP though and
the reason was my lame URL decoder.
- I happened to notice that -I didn't at all work on ftp anymore. I corrected
that.
Version 7.2
Daniel (30 August 2000)
- Understanding AIX is a hard task. I believe I'll never figure out why they
solve things so differently from the other unixes. Now, I'm left with the
AIX 4.3 run-time warnings about duplicate symbols that according to this
article (http://www.geocrawler.com/archives/3/405/1999/9/0/2593428/) is a
libtool flaw. I tried the mentioned patch, although that stops the linking
completely.
So, if I select to ignore the ld warnings there are compiler warnings that
fill the screen pretty bad when curl compiles. It turns out that if I want
to '#include <arpa/inet.h>', I can get rid of the warnings by including the
following three include files before that one:
#include <net/if_dl.h>
#include <sys/mbuf.h>
#include <netinet/if_ether.h>
Now, is it really sane to add those include files before arpa/inet.h in all
the source files that include it?
Thanks to Albert Chin-A-Young at thewrittenword.com who gave me the AIX
login to try everything on.
Daniel (24 August 2000)
- Jan Schmidt supplied us a new VC6 makefile for Windows as the previous one
was not up to date and lacked several object files.
- More work on the naming.
- Albert Chin-A-Young provided a configure-check for large file support, as
some systems seem to need that for them to work. Had to change the position
for the config.h include file in every .c file in the libcurl dir...
- As suggested on the mailing list (by Troy Engel), I did use a --data-binary
option instead of the messy way I've left described below. It seems to
work. The libcurl fix remained the same as yesterday.
Daniel (23 August 2000)
- Back on the -d stripping newlines thing. The 'plain post' thing was added
when I had no thought of that one could actually post binary data with
it. Now, I have to add this functionality in a graceful manner and I think
I've managed to come up with a way: '-d @file;binary' will thus post the
file binary, exactly as its contents are. It is implemented with a new
*setopt() option (CURLOPT_POSTFIELDSIZE) to set the postfield size, since
libcurl can't strlen() the data in these cases.
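On the library side, the idea is roughly this (a sketch with made-up data and
URL; the point is that the size is given explicitly):
  #include <curl/curl.h>

  int main(void)
  {
    static const char data[] = "line one\nline two\nraw bytes kept exactly as they are";
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/accept-post");
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS, data);
      /* give the exact size since libcurl can't strlen() arbitrary binary data */
      curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)(sizeof(data) - 1));
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }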
- Albert Chin-A-Young made some very serious efforts and all the name
resolving problems seem to have been sorted out now on all the platforms
that previously showed them. I'll make another release any day now because of
this.
- The FAQ was much enhanced when it comes to the licensing issues thanks to
Bjorn Reese.
Daniel (21 August 2000)
- Rick Welykochy pointed out a problem when you use -d to post and you want to
keep the newlines, as curl strips them off as a bonus before posting...
This needs to be addressed.
Version 7.1.1
Daniel (21 August 2000)
- Got more people involved in the gethostbyname_r() mess. Caolan McNamara sent
me configure-code that turned out to be very similar to my existing tests
which only make me more sure I'm on the right path. I changed the order of
the tests slightly, as it seems that some compilers don't yell error if a
function is used with too many parameters. Thus, the first tested function
will seem ok... Let's hope more compilers think of too-few parameters as bad
manners, as we're now trying the functions in that order; fewer first. I
should also add that Lars Hecking mailed me and volunteered to run tests on
a few odd systems. Caolan is keeping his work over at
http://www.csn.ul.ie/~caolan/publink/gethostbyname_r/. Might be handy in the
future as well.
Daniel (18 August 2000)
- I noticed I hadn't increased the name lookup buffer in lib/ftp.c. I don't
think this is the reason for the continued trouble though.
Daniel (17 August 2000)
- Fred Noz corrected my stupid mistakes in the gethostbyname_r() fluff. It
should affect some AIX, Digital Unix and HPUX 10 systems.
Daniel (15 August 2000)
- Mathieu Legare compiled and built 7.1 without errors on both AIX 4.2 as well
as AIX 4.3. Now why did problems occur before?
- Fred Noz reported a -w/--write-out bug that caused it to malfunction when
used combined with multiple URL retrievals. All but the first display got
screwed up!
Daniel (11 August 2000)
- Jason Priebe and an anonymous friend found some host names the Linux version
of curl could not resolve. It turned out the buffer used to retrieve that
information was too small. Fixed. One could argue about the usefulness of
not having the slightest trace of a man page for gethostbyname_r() on my
Linux Redhat installation...
Daniel (10 August 2000)
- Balaji S Rao was first in line to note the missing possibility to replace
the Content-Type: and Content-Length: headers when doing -d posts. I added
the possibility just now. It seems some people want to do standard posts
using custom Content-Types.
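For instance, a post with a custom type could now be made like this (made-up
data and site):
  curl -d "<doc>data</doc>" -H "Content-Type: text/xml" http://www.example.com/receiver.cgi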
Daniel (8 August 2000)
- Mike Dowell correctly discovered that curl did not approve of URLs with no
user name but password. As in 'http://:foo@haxx.se'. I corrected this.
Version 7.1
Daniel (7 August 2000)
- My AIX 4 fix does not work. I need help from a AIX 4 hacker.
- I added my new document in the docs directory. It is aimed to become a sort
of tutorial on how to do HTTP scripting with curl.
Daniel (4 August 2000)
- Working with Rich Gray on compiling curl for lots of different platforms.
My fix for AIX 3.2 was not good enough and was slightly changed, I had to
move an include file before another, as is now described in the source.
AIX 4.2 (4.X?) has different gethostbyname_r() and gethostbyaddr_r()
functions that the configure script didn't check for and thus the compile
broke with an error. I have now changed the gethostbyname_r() check in the
configure file to support all three versions of both these functions. My
implementation that uses the AIX style is not yet verified though, and I may
have problems fixing it if it turns out to be buggy, since I don't have access
to any system using that.
For problems like that, I made the configure script allow --disable-thread
to completely switch off the check for threadsafe versions of a few
functions and thus go with the "good old versions" that tend to work
although will break thread-safeness for libcurl. Most people won't use
libcurl for other things than curl though, and curl doesn't need a
thread-safe lib.
- Working on my big tutorial about HTTP scripting with curl.
Daniel (1 August 2000)
- Rich Gray spotted a problem in src/setup.h caused by a #define strequal()
that was just a left-over from past times. The strequal() is now a true
function supplied by libcurl for a portable case insensitive string
comparison. I added the prototypes in include/curl.h and removed the
now obsolete #define.
- Igor Khristophorov made a fix to allow resumed download from Sun's
JavaWebServer/1.1.1. It seems that their server sends bad Content-Range
headers.
- The makefiles forced a static library build, which is bad since we now use
libtool and thus have excellent shared library support! Albert Chin-A-Young
found out.
Version 7.0.11beta
Daniel (1 August 2000)
- Albert Chin-A-Young pointed out that 'make install' did not properly create
the header include directory, which is why it failed to install the header files as
it should. Automake isn't really equipped to deal with subdirectories
without Makefiles in any nice way. I had to run ahead and add Makefiles in
both include and include/curl before I managed to create a top-level
makefile that succeeds in installing everything properly!
- Ok, no more "features" added now. Let's just verify that there are no major
flaws added now.
Daniel (31 July 2000)
- Both Jeff Schasny and Ketil Froyn asked me how to tell curl not to send one
of those internally generated headers. They didn't settle for the blank
ones you could tell curl to use. I rewrote the header-replace stuff a
little. Now, if you replace an internal header with your own and that new
one is a blank header you will only remove the internal one and not get any
blank. I couldn't figure out any case when you want that blank header.
Daniel (29 July 2000)
- It struck me that the lib used localtime() which is not thread-safe, so now
I use localtime_r() in the systems that has it.
- I went through this entire document and removed all email addresses and left
names only. I've really made an effort to always note who brought me bug
reports or fixes, but more and more people ask me to remove the email
addresses since they become victims for spams this way. Gordon Beaton got me
working on this.
Daniel (27 July 2000)
- Jörn Hartroth found out that when you specified a HTTP proxy in an
environment variable and used -L, curl failed in the second fetch. I
corrected this problem and posted a patch to the list. No need for an extra
beta release just for this.
Version 7.0.10beta
Daniel (27 July 2000)
- So, libtool replaced two of my files with symbolic links and I forgot to add
the two new libtool files to the release archive (and they were added as
symlinks as well!) This of course led to the configure script failing
on 7.0.9...
Version 7.0.9beta
Daniel (25 July 2000)
- Kristian Köhntopp <kris at koehntopp.de> brought a fix that makes libcurl
libtoolified, just as we've wanted for a while now. He also made the
recently added man pages get installed properly on 'make install' and some
other nice cleanups.
- In a discussion with Eetu Ojanen it struck me that if we use curl to get a
page using a password, and that page then sends a Location: to another
server that curl follows, curl will send the user name and password to that
server as well.
Now, I'll never be able to make curl do Location: following all that perfectly
and you're all sooner or later required to write a script to do several
fetches when you're doing advanced stuff, but now I've modified curl to at
least *only* send the user name and password to the original server. Which
means that if you get a page from server A with a password, that forwards curl
to server B, curl won't use the password there. If server B then forwards
curl back to server A again, the password will be used again.
This is not a perfect implementation, as in a browser case it would only use
the password if the left-prefix of the first path is the same. I just think
that this fix prevents a somewhat lurky "security hole".
As a side-note in this subject: HTTP passwords are sent in cleartext and
will never be considered to be safe or secure. Use HTTPS for that.
- As discussed on the mailing list, I converted the FTP response reading
function into using select() which then allows timeouts (even under win32!)
if the command-reply session gets too slow or dies completely. I made a
default timeout of 3600 seconds unless anything else is specified, since I
don't think anyone wants to wait more than that for a single character to
get received...
- Torsten Foertsch <torsten.foertsch at gmx.net> brought a set of fixes for
the rfc1867 form posts. He introduced 'name=<file' which brings a means to
supply very large text chunks read from the given file name. It differs from
'name=@file' in the way that this latter thing is marked in the uploaded
contents as a file upload, while the first is just text (as in an input or
textarea field). Torsten also corrected a bug that would happen if you used
%s or similar in a -F file name.
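For example (made-up field and file names), the first field below gets sent
as plain text read from a file while the second is marked as a file upload:
  curl -F "story=<story.txt" -F "picture=@photo.jpg" http://www.example.com/upload.cgi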
- As discovered by Nico Baggus <Nico.Baggus at mail.ing.nl>, when transferring
files to/from FTP using type ASCII curl should not expect the transfer to be
the exact size reported by the server as the file size. Since ASCII may very
well mean that the content is translated while transferred, the final size
may very well differ. Therefore, curl now ignores the file size when doing
ASCII transfers in FTP.
Daniel (24 July 2000)
- Added CURLOPT_PROXYPORT to the curl_easy_setopt() call to allow the proxy
port number to be set separately from the proxy host name.
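  A minimal usage sketch (the proxy host name is made up):
      #include <curl/curl.h>
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/");
        curl_easy_setopt(curl, CURLOPT_PROXY, "proxy.example.com");
        curl_easy_setopt(curl, CURLOPT_PROXYPORT, 8080L); /* port set separately */
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }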
- Andrew <andrew at ugh.net.au> pointed out a netrc manual bug.
- The FTP transfer code now accepts a 250-code as well as the previously
accepted 226, after a successful file transfer. Mohan <mnair at
evergreen-funds.com> pointed this out.
- The check for *both* nsl and socket was never added in the v7 configure.in
when I moved the main branch. I re-added that check to configure.in. This was
discovered by Rich Gray.
- Howard, Blaise <Blaise.Howard at factiva.com> pointed out a missing free() in
curl_disconnect() which of course meant libcurl ate memory.
- Brian E. Gallew noted that the HTTP 'Host:' header curl sent did not
properly include the port number if non-default ports were used. This should
now have been fixed.
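  In essence something like this (a sketch, not the actual source): append
  ":port" whenever the port isn't the protocol default:
      #include <stdio.h>
      /* sketch: build the Host: header, adding the port for non-default ports */
      static void make_host_header(char *buf, size_t len,
                                   const char *host, int port, int defport)
      {
        if(port == defport)
          snprintf(buf, len, "Host: %s\r\n", host);
        else
          snprintf(buf, len, "Host: %s:%d\r\n", host, port);
      }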
- HTTP connect errors now return errors earlier. This was most notably causing
problems when the HTTPS certificate had problems and later caused a crash.
Many thanks to Gregory Nicholls <gnicholls at level8.com> for discovering
and suggesting a fix...
Daniel (21 June 2000)
- After a "bug report" I received where the user was using both -F and -I in a
  HTTP request (it severely confused the library, I should add), I added some
checks to src/main.c that prevents setting more than one HTTP request
command, no matter what the user wants! ;-)
Version 7.0.8beta
Daniel (20 June 2000)
- I did a major replace in many files to use the new curl domain haxx.se
instead of the previous one.
- As Eetu Ojanen suggested, I finally took the step and now libcurl no longer
makes a POST after it has followed a location. When the initial POST has
  been done, it'll be turned into a GET for the further requests. This is only
interesting when using -L/--location *and* doing a POST at the same time.
While messing with this, I added another weird feature I call 'auto
referer'. If you append ';auto' to the right of a given referer string (or
only use that string as referer), libcurl will automatically set the
  previous URL as referer when it follows a Location: and gets a succeeding
document.
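  In libcurl terms the combination looks roughly like this sketch (using the
  present-day option names as an assumption):
      #include <curl/curl.h>
      CURL *curl = curl_easy_init();
      if(curl) {
        curl_easy_setopt(curl, CURLOPT_URL, "http://example.com/form");
        curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=daniel");
        curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L); /* follow Location: */
        curl_easy_setopt(curl, CURLOPT_AUTOREFERER, 1L);    /* set Referer: on redirects */
        /* the initial request is a POST; the followed request becomes a GET */
        curl_easy_perform(curl);
        curl_easy_cleanup(curl);
      }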
- My hero Rich Gray found the very obscure FTP bug that happened to him only
when passing through a particular firewall and using the PORT command. It
turned out that PORT was the only command in the lib/ftp.c source that
  didn't send a proper \r\n sequence but instead used the faulty \n which, as
  it seems, is supported by most major ftp servers... :-O
Version 7.0.7beta
Daniel (16 June 2000)
- I had avoided this long enough now, so I moved the alternative progress bar
stuff from the lib and added it to the client code. This is now using the
recently added progress callback and it seems to work pretty much like
  before. Since there is only one progress bar and you can download and upload
  at the same time, this bar shows the combined progress of both directions.
  This code was just ported from the old place to this, Lars is still our
  saviour! ;-) This also made the documentation more accurate since I never
  removed this function from any docs! I did, however, remove the
  CURLOPT_PROGRESSMODE option from the library since the lib has only one
  internal progress meter and it will never get another. It is likely, though,
  that the internal one will also be moved to the client code in the future
  (when I have other means of getting the writeout data and move that too to
  the client).
- I took the opportunity to verify that standard progress meter works and I
found out it didn't get inited properly. Grrr. I corrected that as well.
Daniel (15 June 2000)
- I thought I'd better verify that the -F option still works in v7 and of
course it didn't... :-/ Anyway, I had the problems I could discover
corrected. About one month of beta testing and not a single person has used
this feature with v7?
- Björn correctly pointed out that the --progress-bar still doesn't work in
v7. Hm.
Daniel (14 June 2000)
- Tim Tassonis discovered that curl 7 didn't handle normal http POST as it
should. I corrected this.
Version 7.0.6beta
Daniel (14 June 2000)
- Björn Stenberg pointed out several problems (related to win32 compiling):
lib/strequal.c had a bad #ifdef for one of the string comparisons (win32)
src/main.c had several minor problems
lib/makefile.m32 had getpass.[co] twice
src/config-win32.h lacked the HAVE_FCNTL_H define
both config-win32.h files now only set the HAVE_UNISTD_H define if the
define MINGW32 is set, and I modified src/makefile.m32 and lib/makefile.m32
to set it.
Version 7.0.5beta
Daniel (14 June 2000)
- Applied Luong Dinh Dung's comments about a few win32 compile problems.
- Applied Björn Stenberg's suggested fix that turns the win32 stdout to
  binary. It won't do it if the -B / --use-ascii option is used. That option
  is now an extended version of the previous -B / --ftp-ascii. The flag was
  already in use by the ldap code as well so the new name fits pretty well.
  The libcurl option CURLOPT_TRANSFERTEXT was also introduced as an alias to
  the now obsolete CURLOPT_FTPASCII. Can't verify this fix myself as I have no win32
compiler around.
Daniel (13 June 2000)
- Luong Dinh Dung <dung at sch.bme.hu> found a problem in curl_easy_cleanup()
since it free()ed the main curl struct *twice*. This is now corrected.
Daniel (9 June 2000)
- Updated the RESOURCES file, added a README.win32 file.
Daniel (8 June 2000)
- So I finally added the progress callback to the *setopt() options and it
should work now. I don't have the energy to write any test program for it
right now.
- Made the callback function typedefs public in curl/curl.h for comfort. Just
  in case anyone wants to fiddle with such pointers.
- Updated the curl_easy_setopt() man page accordingly.
Version 7.0.4beta
Daniel (2 June 2000)
- I noticed that when doing Location: following, we lost custom headers in all
but the first request.
- Removed the 'HttpPost' struct and moved the header stuff to the more generic
curl_slist.
- Added some better slist-cleanups in src/main.c
Version 7.0.3beta
Daniel (31 May 2000)
- So I discovered that I released the 7.0.2beta without it being able to
  compile under Linux. gethostbyname_r() and gethostbyaddr_r() turned out to
  take a different number of arguments on different systems so I had to add
a configure check for this and adjust the code slightly.
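  The kind of difference the check has to detect, as a sketch (the HAVE_*
  macro names here are only assumed): glibc's gethostbyname_r() takes six
  arguments and returns an int, while the Solaris flavour takes five and
  returns the hostent pointer itself:
      #include <netdb.h>
      static struct hostent *resolve(const char *name, struct hostent *he,
                                     char *buf, int buflen)
      {
        struct hostent *res = NULL;
        int err = 0;
      #if defined(HAVE_GETHOSTBYNAME_R_6)   /* e.g. Linux glibc */
        if(gethostbyname_r(name, he, buf, buflen, &res, &err) != 0)
          res = NULL;
      #elif defined(HAVE_GETHOSTBYNAME_R_5) /* e.g. Solaris */
        res = gethostbyname_r(name, he, buf, buflen, &err);
      #endif
        return res;
      }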
Version 7.0.2beta
Daniel (29 May 2000)
- Corrected the bits.* assignments when using CURLOPT options that only
  toggle one of those bits.
- Applied the huge patches from David LeBlanc <dleblanc at qnx.com> that add
usage of the gethostbyname_r() and similar functions in case they're around,
  since that makes libcurl much more thread-safe on many systems (such as
solaris). I added the checks for these functions to the configure script.
I can't explain why, but the inet_ntoa_r() function did not appear in my
Solaris include files, I had to add my own include file for this for now.
Daniel (22 May 2000)
- Jörn Hartroth brought me fixes to make the win32 version compile properly as
well as a rename of the 'interface' field in the urldata struct, as it seems
to be reserved in some gcc versions!
- Rich Gray struck back with yet some portability reports. Data General DG/UX
needed a little fix in lib/ldap.c since it doesn't have RTLD_GLOBAL defined.
  More fixes are expected as a result of Rich's very helpful work.
Version 7.0.1beta
Daniel (21 May 2000)
- Updated lots of #defines, enums and variable type names in the library. No
more weird URG or URLTAG prefixes. All types and names should be curl-
prefixed to avoid name space clashes. The FLAGS-parameter to the former
curl_urlget() has been converted into a bunch of flags to use in separate
setopt calls. I'm still focusing on the easy-interface, as the curl tool is
now using that.
- Bjorn Reese has provided me with an asynchronous name resolver that I plan
to use in upcoming versions of curl to be able to gracefully timeout name
lookups.
Version 7.0beta
Daniel (18 May 2000)
- Introduced LIBCURL_VERSION_NUM to the curl.h include file to better allow
source codes to be dependent on the lib version. This define is now set to
  a hexadecimal number, with 8 bits each for major number, minor number and
patch number. In other words, version 1.2.3 would make it 0x010203. It also
makes a larger number a newer version.
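  As an example of the encoding (the helper macro name is made up):
      #include <curl/curl.h> /* defines LIBCURL_VERSION_NUM */
      /* 8 bits each for major, minor and patch: 1.2.3 -> 0x010203 */
      #define MY_VERSION_NUM(major, minor, patch) \
        (((major) << 16) | ((minor) << 8) | (patch))
      #if LIBCURL_VERSION_NUM >= MY_VERSION_NUM(7, 0, 0)
        /* code that needs libcurl 7.0 or newer goes here */
      #endif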
Daniel (17 May 2000)
- Martin Kammerhofer correctly pointed out several flaws in the FTP range
option. I corrected them.
- Moved the win32 winsock init crap from the lib to the src/main.c file
in the application instead. They can't be in the lib, especially not for
multithreaded purposes.
Daniel (16 May 2000)
- Rewrote the src/main.c source to use the new easy-interface to libcurl 7.
There is still more work to do, but the first step is now taken.
<curl/easy.h> is the include file to use.
Daniel (14 May 2000)
- FTP URLs are now treated slightly different, more according to RFC 1738.
- FTP sessions are now performed differently, with CWD commands to change
directory instead of RETR/STOR/LIST with the full path. Discussions with
Rich Gray made me notice these problems.
- Janne Johansson discovered and corrected a buffer overflow in the
  src/urlglob.c file.
- I had to add a lib/strequal.c file for doing case insensitive string
compares on all platforms.
Daniel (8 May 2000):
- Been working lots on the new lib.
- Together with Rich Gray, I've tried to adjust the configure script to work
better on the NCR MP-RAS Unix.
Daniel (2 May 2000):
- Albert Chin-A-Young pointed out that I had a few too many instructions in
configure.in that didn't do any good.
Daniel (24 April 2000):
- Added a new paragraph to the FAQ about what to do when configure can't
  find OpenSSL even though it is installed. Supplied by Bob Allison.
Daniel (12 April 2000):
- Started messing around big-time to convert the old library interface to a
better one...
Daniel (8 April 2000):
- Made the progress bar look better for file sizes between 9999 kilobytes
and 100 megabytes. They're now displayed XX.XM.
- I also noticed that ftp fetches through HTTP proxies didn't add the user
agent string. It does now.
- Habibie <habibie at MailandNews.com> supplied a pretty good way to build RPMs
  on a Linux machine. It still a) requires me to be root to do it, b) leaves
  the rpm packages lying at some odd place on my disk, c) doesn't work to
build the ssl version of curl since I didn't install openssl from an rpm
package so now the rpm crap thinks I don't have openssl and refuses to build
a package that depends on ssl... Did I mention I don't get along with RPM?
- Once again I received a bug report about autoconf not setting -L prior to -l
on the command line when checking for libs. In this case it made the native
cc compiler on Solaris 7 to fail the OpenSSL check. This has previously been
reported to cause problems on HP-UX and is a known flaw in autoconf 2.13. It
is a pity there's no newer release around...
Daniel (4 April 2000):
- Marco G. Salvagno supplied me with two fixes that
  apparently makes the OS/2 port work better with multiple URLs.
Daniel (2 April 2000):
- Another Location: fix. This time, when curl connected to a port and then
followed a location with an absolute URL to another port, it misbehaved.
Daniel (27 March 2000):
- H. Daphne Luong pointed out that curl was wrongly
messing up the proxy string when fetching a document through a http proxy,
which screwed up multiple fetches such as in location: followings.
Daniel (23 March 2000):
- Marco G. Salvagno corrected my badly applied patch he
actually already told me about!
- H. Daphne Luong brought me a fix that now makes curl
ignore select() errors in the download if errno is EINTR, which turns out to
happen every now and then when using libcurl multi-threaded...
Daniel (22 March 2000):
- Wham Bang supplied a couple of win32 fixes. HAVE_UNAME
was accidentally #defined in config-win32.h, which it shouldn't have been.
The HAVE_UNISTD_H is not defined when compiling with the Makefile.vc6
makefile for MS VC++.
Daniel (21 March 2000):
- I removed the AC_PROG_INSTALL macro from configure.in, since it appears that
one of the AM_* macros searches for a BSD compatible install already. Janne
Johansson made me aware of this.
Version 6.5.2
Daniel (21 March 2000):
- Paul Harrington quickly pointed out to me that 6.5.1
crashes hard. I upload 6.5.2 now as quickly as possible! The problem was
the -D adjustments in src/main.c.
Version 6.5.1
Daniel (20 March 2000):
- An anonymous post on sourceforge correctly pointed out a possible buffer
overflow in the curl_unescape() function for URL conversions. The main
problem with this bug is that the ftp download uses that function and this
  single-byte overflow could lead to very odd bugs (as one reported by Janne
Johansson).
Daniel (19 March 2000):
- Marco G. Salvagno supplied me with a series of patches
that now allows curl to get compiled on OS/2. It even includes a section in
the INSTALL file. Very nice job!
Daniel (17 March 2000):
- Wham Bang supplied a patch for the lib/Makefile.vc6
file. We still need some fixes for the config-win32.h since it appears that
VC++ and mingw32 have different opinions about (at least) unistd.h's
existence.
Daniel (15 March 2000):
- I modified the -D/--dump-header workings so that it doesn't write anything
to the file until it needs to. This way, you can actually use -b and -D
on the same file if you want repeated invokes to store and read the cookies
in that one single file.
- Poked around in lots of texts. Added the BUGS file for bug reporting stuff.
Added the classic HTTP POST question to the FAQ, removed some #ifdef WIN32
stuff from the sources (they're covered by the config-win32.h now).
- Pascal Gaudette fixed a missing ldap.c problem in the
Makefile.vc6 file. He also addressed a problem in src/config-win32.h.
Daniel (14 March 2000):
- Paul Harrington pointed out that the 'http_code' variable in the -w output
was never written. I fixed it now.
- Janne Johansson reported the complaints that OpenBSD does
when getdate.c #includes malloc.h. It claims stdlib.h should be included
instead. I added #ifdef HAVE_MALLOC_H code in getdate.y and two checks in
the configure.in for malloc.h and stdlib.h.
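  Presumably along these lines (a sketch of the include logic described):
      #ifdef HAVE_MALLOC_H
      #include <malloc.h>
      #else
      #include <stdlib.h> /* OpenBSD wants malloc() declared via stdlib.h */
      #endif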
Version 6.5
Daniel (13 March 2000):
- <curl at spam.wolvesbane.net> pointed out that the way curl sent cookies in a
single line wasn't enjoyed by IIS4.0 servers. In my view, that is not what
the standards say, but I added a white space between the name/value pairs to
perhaps make them work better.
- Added the perl check back in the configure.in again since the mkhelp.pl
script needs it!
- Made some beautifications in the curl man page.
Daniel (3 March 2000):
- Jörn helped me update the config-win32.h files with HAVE_SETVBUF and
HAVE_STRDUP.
Daniel (3 March 2000):
- Uploaded the 6.5pre2 package.
Daniel (2 March 2000):
- Removed the perl-programs from the distribution, they never made many people
happy and I'll still keep them available on the web.
- Added the -w and -N stuff to the man page. Documented the new progress meter
display in README.curl.
- Jörn Hartroth, Chris <cbayliss at csc.come> and Ulf
Möller from the openssl development team helped bringing me the details for
fixing an OpenSSL usage flaw. It became apparent when they released openssl
  0.9.5, since that barfed on curl's bad behavior of not seeding the random
  number generator properly.
- Yet another option: -N/--no-buffer disables buffering in the output stream.
Probably most useful for very slow transfers when you really want to get
every byte curl receives within some preferred time. Andrew <tmr at gci.net>
suggested this.
- Damien Adant mailed me his fixes for making curl compile on Ultrix.
Daniel (24 February 2000):
- Applied Jörn Hartroth's fixes for config-win32.h and lib/Makefile.w32.
I should also make a note here, if nothing else to myself, that when using
the %-syntax for variables in DOS command prompts, you must use two %-
letters for each one since that is an escape letter there! Maybe I should
use another letter instead!
- Added more variables to -w:
'http_code'
'time_namelookup'
'time_connect'
'time_pretransfer'
'url_effective'
- Made -w@filename read the syntax from a file and -w@- reads the syntax from
stdin in the good old "standard" curl way.
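  For programs using libcurl directly, the same figures can be read out with
  curl_easy_getinfo(); this sketch assumes today's CURLINFO names and an
  already-performed easy handle named 'curl':
      #include <curl/curl.h>
      long http_code = 0;
      double namelookup = 0.0, connect_time = 0.0, pretransfer = 0.0;
      char *effective_url = NULL;
      curl_easy_getinfo(curl, CURLINFO_HTTP_CODE, &http_code);          /* http_code */
      curl_easy_getinfo(curl, CURLINFO_NAMELOOKUP_TIME, &namelookup);   /* time_namelookup */
      curl_easy_getinfo(curl, CURLINFO_CONNECT_TIME, &connect_time);    /* time_connect */
      curl_easy_getinfo(curl, CURLINFO_PRETRANSFER_TIME, &pretransfer); /* time_pretransfer */
      curl_easy_getinfo(curl, CURLINFO_EFFECTIVE_URL, &effective_url);  /* url_effective */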
Daniel (22 February 2000):
- Released a 6.5pre1 version to get some test and user feedback.
Daniel (21 February 2000):
- I added the -w/--write-out flag and some variables to go with it. -w is a
single string, whatever you enter there will be written out when curl has
completed a successful request. There are some variable substitutions and
they are specified as '%{variable}' (without the quotes). Variables that
exist as of this moment are:
total_time - total transfer time in seconds (with 2 decimals)
size_download - total downloaded amount of bytes
size_upload - total uploaded amount of bytes
speed_download - the average speed of the entire download
speed_upload - the average speed of the entire upload
I will of course add more variables, but I need input on these and others.
- It struck me that the -# progress bar will be hard to just apply on the new
progress bar concept. I need some feedback on this before that'll get re-
introduced! :-/
Daniel (16 February 2000):
- Jörn Hartroth brought me some fixes for the progress meter and I continued
  working on it. It seems to work for http download, http post, ftp download
  and ftp upload. That should be a pretty good sign that it works well in general.
- Still need to add the -# progress bar into the new style progress interface.
- Gonna have a go at my new output option parameter next.
Daniel (15 February 2000):
- The progress meter stuff is slowly taking place. There's more left before it
  is working ok and everything is tested, but we're getting there. Slowly!
Daniel (11 February 2000):
- Paul Marquis fixed the config file parsing of curl to
deal with any-length lines, removing the previous limit of 4K.
- Eetu Ojanen's suggestion of supporting the @-style for -b
is implemented. Now -b@<filename> works as well as the old style. -b@- also
similarly reads the cookies from stdin.
- Reminder: -D should not write to the file until it needs to, in the same way
-o does. That would enable curl to use -b and -D on the same file...
- Ellis Pritchard made getdate.y work for MacOS X.
- Paul Harrington helped me out finding the crash in the
cookie parser. He also pointed out curl's habit of sending empty cookies to
the server.
Daniel (8 February 2000):
- Ron Zapp corrected a problem in src/urlglob.c that
prevented curl from getting compiled on sunos 4. The problem had to do
with the difference in sprintf() return code types.
- Transfer() should now be able to download and upload simultaneously. Let's
do some progress meter fixes later this week.
Daniel (31 January 2000):
- Paul Harrington found another core dump in the cookie
parser. Curl doesn't properly recognize the 'version' keyword and I think
that is what caused this. I need to refresh some specs on cookies and see
what else curl lacks to improve this a bit more once and for all.
RFC 2109 clearly specifies how cookies should be dealt with when they are
compliant with that spec. I don't think many servers are though...
- Mark W. Eichin found that while curl is uploading a form
  to a web site, it doesn't read incoming data, which is why it'll hang after
  a while since the socket "pipe" becomes full.
It took me two hours to rewrite Download() and Upload() into the new
single function Transfer(). It even seems to work! More testing is required
of course... I should get the header-sending together in a kind of queue
and let them get "uploaded" in Transfer() as well.
- Zhibiao Wu pointed out a curl bug in the location: area,
  although I did not get a reproducible way to trigger it, which is why I have
  to wait with fixing anything.
- Bob Schader suggested I should implement resume
support for the HTTP PUT operation, and as I think it is a valid suggestion
I'll work on it.
Daniel (25 January 2000):
- M Travis Obenhaus pointed out a manual mixup with -y and -Y that was
corrected.
- Jens Schleusener pointed out a problem to compile
curl on AIX 4.1.4 and gave me a solution. This problem was already fixed
by Jörn's recent #include modifications!
Daniel (19 January 2000):
- Oskar Liljeblad pointed out and corrected a problem
  in the Location: following system that made curl fail when following a
  location: to a different protocol.
  On January 31st I re-considered this fix and the surrounding source code. I
  could not really see that the patch made any difference, which is why I
  removed it again for further research and debugging. (It disabled location:
  following on servers not running on default ports.)
- Jörn Hartroth brought a fix that once again
made it possible to select progress bar.
- Jörn also fixed a few include problems.
Version 6.4
Daniel (17 January 2000):
- Based on suggestions from Björn Stenberg, I made the
progress deal better with larger files and added a "Time" field which shows
the time spent on the download so far.
- I'm now using the CVS repository on sourceforge.net, which also allows web
browsing. See http://curl.haxx.nu.
Daniel (10 January 2000):
- Renumbered some enums in curl/curl.h since tag number 35 was used twice!
- Added "postquote" support to the ftp section that enables post-ftp-transfer
quote commands.
- Now made the -Q/--quote parameter recognize '-' as a prefix, which means
that command will be issued AFTER a successful ftp transfer. This can of
course be used to delete or rename a file after it has been uploaded or
downloaded. Use your imagination! ;-)
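  In libcurl terms, a sketch (file names made up, 'curl' assumed to be a
  prepared easy handle): commands in the CURLOPT_POSTQUOTE list run after the
  transfer, so a freshly uploaded file can for instance be renamed:
      #include <curl/curl.h>
      struct curl_slist *post_cmds = NULL;
      /* rename the uploaded file once the transfer completed successfully */
      post_cmds = curl_slist_append(post_cmds, "RNFR upload.tmp");
      post_cmds = curl_slist_append(post_cmds, "RNTO upload.dat");
      curl_easy_setopt(curl, CURLOPT_POSTQUOTE, post_cmds);
      curl_easy_perform(curl);
      curl_slist_free_all(post_cmds);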
- Since I do the main development on solaris 2.6 now, I had to download and
install GNU groff to generate the hugehelp.c file. The solaris nroff cores
on the man page! So, in order to make the solaris configure script find a
better result I made gnroff get checked prior to the regular nroff.
- Added all the curl exit codes to the man page.
- Jim Gallagher properly tracked down a bug in autoconf
2.13. The AC_CHECK_LIB() macro wrongfully uses the -l flag before the -L
flag to 'ld' which causes the HP-UX 10.20 flavour to fail on all libchecks
and therefore you can't make the configure script find the openssl libs!