                                  _   _ ____  _
                              ___| | | |  _ \| |
                             / __| | | | |_) | |
                            | (__| |_| |  _ <| |___
                             \___|\___/|_| \_\_____|

                                History of Changes
Daniel (21 November 2000)
- Numerous fixes the test suite has brought into the daylight:
* curl_unescape() could return a too long string
* on ftp transfer failures, there could be memory leaks
* ftp CWD could use bad directory names
* memdebug now uses the mprintf() routines for better portability
* free(NULL) removed when doing resumed transfers
- Added a bunch of test cases for FTP.
- General cleanups to make less warnings with gcc -Wall -pedantic.
- I made the tests/ftpserver.pl work with the most commonly used ftp
operations. PORT, PASV, RETR, STOR, LIST, SIZE, USER, PASS all work now. Now
all I have to do is integrate the ftp server doings in the runtests.pl
script so that ftp tests can be run the same way http tests already run.
Daniel (20 November 2000)
- Made libcurl capable of dealing with any-length URLs. The former limit of
4096 bytes was a bit annoying when people wanted to use curl to really make
life tough on a web server. Now, the command line limit is the most annoying
but that can be circumvented by using a config file.
NOTE: there is still a 4096-byte limit on URLs extracted from Location:
headers.
- Corrected the spelling of 'resolve' in two error messages.
- Alexander Kourakos posted a bug report and a patch that corrected it! It
turned out that lynx and wget support lowercase environment variable names
where curl only looked for the uppercase versions. Now curl will use the
lowercase versions if they exist, but if they don't, it'll use the uppercase
versions.
Daniel (17 November 2000)
- curl_formfree() was added. How come no one missed that one before? I ran the
test suite with the malloc debug enabled and got lots of "nice" warnings on
memory leaks. The most serious one was this. There were also leaks in the
cookie handling, and a few errors when curl failed to connect and similar
things. More test cases were added to cover up and to verify that these
problems have been removed.
- Mucho updated config file parser (I'm dead tired of all the bug reports and
weird behaviour I get on the former one). It works slightly differently now,
although I doubt many people will notice the differences. The main
difference being that if you use options that require parameters, they must
both be specified on the same line. With this new parser, you can also
specify long options without '--' and you may separate options and
parameters with : or =. A config file line could look like:
user-agent = "foobar and something"
Parameters within quotes may contain spaces. Without quotes, they're
expected to be a single non-space word.
Had to patch the command line argument parser a little to make this work.
- Added --url as an option to allow the URL to be specified this way. It makes
way nicer config files. The previous way of specifying URLs in the config
file doesn't work anymore.
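For illustration only, a config file using the new syntax could look something
like this (the URL and user-agent string below are made-up examples):
  url = "http://www.example.com/"
  user-agent = "foobar and something"
  verbose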
Daniel (15 November 2000)
- Using certain characters in usernames or passwords for HTTP authentication
failed. This was due to the mprintf() that had a silly check for letters,
and if they weren't isprint() they weren't output "as-is". This caused
passwords and usernames containing such characters (a non-ASCII letter, for
example) to fail.
Version 7.4.2
Daniel (15 November 2000)
- 'tests/runtests.pl' now sorts the test cases properly when 'all' is used.
Daniel (14 November 2000)
- I fell over the draft-ietf-ftpext-mlst-12.txt Internet Draft titled
"Extensions to FTP" that contains a defined way how the ftp command SIZE
could be assumed to work.
- Laurent Papier posted a bug report about using "-C -" and FTP uploading a
file that isn't present on the server. The server might then return a 550 and
curl will fail. Should it instead as Laurent Papier suggests, start
uploading from the beginning as a normal upload?
Daniel (13 November 2000)
- Fixed a crash with the followlocation counter.
- While writing test cases for the test suite, I discovered an old limitation
that prevented -o and -T from being used at the same time. I removed this
immediately as this has no relevance in the current libcurl.
- Chris Faherty fixed a free-twice problem in lib/file.c
- I fixed the perl http server problem in the test suite.
Version 7.4.2 pre4
Daniel (10 November 2000)
- I've (finally) started working on the curl test suite. It is in the new
tests/ directory. It requires sh and perl. There's a TCP server in perl and
most of the other stuff running a pretty simple shell script.
I've only made four test cases so far, but it proves the system can work.
- Laurent Papier noticed that curl didn't set TYPE when doing --head checks
for sizes on FTP servers. Some servers seem to return different sizes
depending on whether ASCII or BINARY is used!
- Laurent Papier detected that if you appended a FTP upload and everything was
already uploaded, curl would hang.
- Angus Mackay's getpass_r() in lib/getpass.c is now compliant with the
getpass_r() function it seems some systems actually have.
- Venkataramana Mokkapati detected a bug in the cookie parser and corrected
it. If the cookie was set for the full host name (domain=full.host.com),
the cookie was never sent back because of a faulty length comparison between
the set domain length and the current host name.
Daniel (9 November 2000)
- Added a configure check for gethostbyname in -lsocket (OS/2 seems to need
it). Added a check for RSAglue/rsaref for the cases where libcrypto is found
but libssl isn't. I haven't verified this fix yet though, as I have no
system that requires those libs to build.
Version 7.4.2 pre3
Daniel (7 November 2000)
- Removed perror() outputs from getpass.c. Angus Mackay also agreed to a
slightly modified license of the getpass.c file as the prototype was changed.
Daniel (6 November 2000)
- Added possibility to set a password callback to use instead of the built-in.
They're controlled with curl_easy_setopt() of course, the tags are
CURLOPT_PASSWDFUNCTION and CURLOPT_PASSWDDATA.
- Used T. Bharath's thinking and fixed the timers that showed terribly wrong
times when location: headers were followed.
- Emmanuel Tychon discovered that curl didn't really like user names only in
the URL. I corrected this and I also fixed the long-standing problem
with URL encoded user names and passwords in the URLs. They should work now.
Daniel (2 November 2000)
- When I added --interface, the new error code that was added with it was
inserted in the wrong place and thus all error codes from 35 and upwards got
increased one step. This is now corrected, we're back at the previous
numbers. All new exit codes should be added at the end.
Daniel (1 November 2000)
- Added a check for signal() in the configure script so that if sigaction()
isn't present, we can use signal() instead.
- I'm having a license discussion going on privately. The issue is yet again
GPL-licensed programs that have problems with MPL. I am leaning towards
making a kind of dual-license that will solve this once and for all...
Daniel (31 October 2000)
- Added the packages/ directory. I intend to let this contain some docs and
templates on how to generate custom-format packages for various platforms.
I've now removed the RPM related curl.spec files from the archive root.
Daniel (30 October 2000)
- T. Bharath brought a set of patches that bring new functionality to
curl_easy_getinfo() and curl_easy_setopt(). Now you can request peer
certificate verification with the *setopt() CURLOPT_SSL_VERIFYPEER option
and then use the CURLOPT_CAINFO to set the certificate to verify the remote
peer against. After such an operation with a verification request, the
*_getinfo() option CURLINFO_SSL_VERIFYRESULT will return information about
whether the verification succeeded or not.
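For illustration, using this from a program should look roughly like the
sketch below (the URL and the CA bundle path are only placeholders):
  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    long verifyresult = 0;

    curl_easy_setopt(curl, CURLOPT_URL, "https://www.example.com/");
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);
    curl_easy_setopt(curl, CURLOPT_CAINFO, "/path/to/ca-bundle.crt");

    /* after the transfer, ask how the peer verification went */
    if(curl_easy_perform(curl) == CURLE_OK)
      curl_easy_getinfo(curl, CURLINFO_SSL_VERIFYRESULT, &verifyresult);

    curl_easy_cleanup(curl);
    return 0;
  }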
Daniel (27 October 2000)
- Georg Horn brought us a splendid patch that solves the long-standing
annoying problem with timeouts that made curl exit with silly exit codes
(which has been commented out lately). This solution is sigaction() based and
of course then only works for unixes (and only those unixes that actually
have the sigaction() function).
Daniel (26 October 2000)
- Björn Stenberg supplied a patch that fixed the flaw mentioned by Kevin Roth
that made the password get echoed when prompted for interactively. The
getpass() function (now known as my_getpass()) was also fixed to not use any
static buffers. This also means we cannot use the "standard" getpass()
function even for those systems that have it, since it isn't thread-safe.
- Kevin Roth found out that if you'd write a config file with '-v url', the
url would not be used as "default URL" as documented, although if you wrote
it 'url -v' it worked! This has been corrected now.
- Kevin Roth's idea of using multiple -d options on the same command line was
just brilliant, and I couldn't really think of any reason why we shouldn't
support it! The append function always appends '&' and then the new -d
chunk. This enables constructs like the following:
curl -d name=daniel -d age=unknown foobarsite.com
Daniel (24 October 2000)
- I fixed the lib/memdebug.c source so that it compiles on Linux and other
systems. It will be useful one day when someone else but me wants to run the
memory debugging system.
Daniel (23 October 2000)
- I modified the maketgz and configure scripts, so that the configure script
will fetch the version number from the include/curl/curl.h header files, and
then the maketgz doesn't have to rebuild the configure script when I build
release-archives.
- Björn Stenberg and Linus Nielsen correctly pointed out that curl was silly
enough to not allow @-letters in passwords when they were specified with the
-u or -U flags (CURLOPT_USERPWD and CURLOPT_PROXYUSERPWD). This also
suggests that curl probably should url-decode the password piece of an URL
so that you could pass an encoded @-letter there...
Daniel (20 October 2000)
- Yet another http server barfed on curl's requests that always include the
port number in the Host: header. I now only include the port number if it
isn't the default (80 for HTTP, 443 for HTTPS). www.perl.com turned out to
run one of those nasty servers.
- The PHP4 module for curl had problems with referer that seems to have been
corrected just yesterday. (Sterling Hughes of the PHP team confirmed this)
Daniel (17 October 2000)
- Vladimir Oblomov reported that the -Y and -y options didn't work. They
didn't work for me either. This once again proves we should have that test
suite...
- I finally changed the error message libcurl returns if you try a https://
URL when the library wasn't build with SSL enabled. It will now return this
error:
"libcurl was built with SSL disabled, https: not supported!"
I really hope it will make it a bit clearer to users where the actual
problem lies.
Version 7.4.1
Daniel (16 October 2000)
- I forgot to remove some of the malloc debug defines from the makefiles in
the release archive (of course).
Version 7.4
Daniel (16 October 2000)
- The buffer overflow mentioned below was posted to bugtraq on Friday 13th.
Daniel (12 October 2000)
- Colin Robert Phipps elegantly corrected a buffer overflow. It could be used
by an evil ftp server to crash curl. I took the opportunity to replace a few
other sprintf()s with snprintf()s as well.
Daniel (11 October 2000)
- Found some more memory leaks. This new simple memory debugger has turned out
really useful!
Version 7.4 pre6
Daniel (9 October 2000)
- Florian Koenig pointed out that the bool typedef in the curl/curl.h include
file was breaking PHP 4.0.3 compiling. The bool typedef is not used in the
public interface and was wrongly inserted in that header file.
- Jörg Hartroth corrected a minor memory leak in the src/urlglob.c stuff. It
didn't harm anyone since the memory is free()ed on exit anyway.
- Corrected the src/main.c. We use the _MPRINTF_REPLACE #define to use our
libcurl-printf() functions. This gives us snprintf() et al on all
platforms. I converted the allocated useragent string to one that uses a
local buffer.
- I've set an #if 0 section around the Content-Transfer-Encoding header
generated in lib/formdata.c. This will hopefully make curl do more
PHP-friendly multi-part posts.
Version 7.4 pre5
Daniel (9 October 2000)
- Nico Baggus found out that curl's ability to force an ASCII download when
using FTP was no longer working! I corrected this. This problem was probably
introduced when I redesigned libcurl for version 7.
- Georg Horn provided a source example that proved a memory leak in libcurl.
I added simple memory debugging facilities and now we can make libcurl log
all memory fiddling functions. An additional perl script is used to analyze
the output logfile and to match malloc()s with free()s etc. The memory leak
Georg found turned out to be the main cookie struct that cookie_cleanup()
didn't free! The perl script is named memanalyze.pl and it is available in
the CVS repository, not in the release archive.
Daniel (8 October 2000)
- Georg Horn found a GetHost() problem. It turned out it never assigned the
pointer in the third argument properly! This could cause a crash, or at best
a memory leak!
Version 7.4 pre4
Daniel (6 October 2000)
- Is the -F post following the RFC 1867 spec? We had this discussion on the
mailing list since it appears curl can't post -F form posts to a PHP
receiver... I've been in touch with the PHP developers about this.
- Domenico Andreoli found out that the long option '--proxy' wasn't working
anymore! The option parser got confused when I added the --proxytunnel for
7.3. This was indeed a very old flaw that hasn't turned up until now...
- Jörn Hartroth provided patches, updated makefiles and two new files for DLL
stuff on win32. He also pointed out that lib source files were compiled with
-I../src which isn't only wrong but plain stupid!
- Troels Walsted Hansen fixed a problem with HTTP resume. Curl previously used
a local variable badly, that could lead to crashes.
Version 7.4 pre3
Daniel (4 October 2000)
- More docs written. The curl_easy_getinfo.3 man page is now pretty accurate,
as is the -w section in curl.1. I added two options to enable the user to
get information about the received headers' size and the size of the HTTP
request. T. Bharath requested them.
Daniel (3 October 2000)
- Corrected a severe free()-before-use in the new add_buffer_send()! ;-)
Version 7.4 pre2
Daniel (3 October 2000)
- Jason S. Priebe sent me patches that changed the way curl issues HTTP
requests. The entire request is now issued in one single shot. It didn't do
this previously, and it has turned out that since the common browsers do it
this way, some sites have turned out to work with browsers but not with
curl! Although this is not a client-side problem, we want to be able to
fully emulate browsers, and thus we have now adjusted the networking layer
to appear slightly more like a browser. I adjusted Jason's patch, the faults
are probably mine.
Daniel (2 October 2000)
- Anyone who ever uploaded data with curl on a slow link has noticed that the
progress meter is updated very infrequently. That is due to the large buffer
size curl is using. It reads 50Kb and sends it, updates the progress meter
and loops. 50Kb is very much on a slow link, although it is pretty neat to
use on a fast one.
I've now made an adjustment that makes curl use a 2Kb buffer for uploads to
start with. If curl's average upload speed is faster than buffer size bytes
per second, curl will increase the used buffer size up to max 50Kb. It
should make the progress meter work better.
Version 7.4 pre1
Daniel (29 September 2000)
- Ripped out the -w stuff from the library and put in the curl tool. It gets
all the relevant info from the library using the new curl_easy_getinfo()
function.
- brad at openbsd.org mailed me a patch that corrected my kerberos mistake and
removed a compiler warning from hostip.c that OpenBSD people get.
Daniel (28 September 2000)
- Of course (I should probably get punished somehow) I didn't properly correct
the #include lines for the base64 stuff in the kerberos sources in the just
released 7.3 package. They still include the *_krb.h files! Now, the error
is sooo very easy to spot and fix so I won't bother with a quick bug fix
release. I'll post a patch whenever one is needed instead. It'll be
available in the CVS in a few minutes anyway.
Version 7.3
Daniel (28 September 2000)
- Removed the base64_krb.[ch] files. They've now replaced the former
base64.[ch] files.
Daniel (26 September 2000)
- Updated some docs.
- I changed the OpenSSL fix to work with older versions as well. The posted
patch was only working with 0.9.6 and no older ones.
Version 7.3-pre8
Daniel (25 September 2000)
- Erdmut Pfeifer informed us that curl didn't build with OpenSSL 0.9.6 and
showed us what needed to get patched in order to make it build properly
again.
- Dirk Kruschewski found a bug in the cookie parser. I made an alternative
approach to the solution Dirk himself suggested. The bug caused a cookie
header that didn't end with a trailing semicolon to not get parsed.
- I've marked -c and -t deprecated now. If you use any of them, curl will tell
you to use "-C -" or "-T -" instead. I don't think occupying two letters for
nearly identical functions is good use. Also, -T - kind of follows the curl
tradition of using - for stdin where a file name is expected.
Daniel (23 September 2000)
- Martin Hedenfalk provided the patch that finally made the krb4 ftp upload
work!
Daniel (21 September 2000)
- The kerberos code is not quite thread-safe yet. There are a few more globals
that need to be taken care of. Let's get the upload working first!
Daniel (20 September 2000)
- Richard Prescott solved another name lookup buffer size problem. I took this
opportunity to rewrite the GetHost() function. With these large buffer
sizes, I think keeping them as local arrays quickly turns ugly. I now use
malloc() to get the buffer memory. Thanks to this, I can now realloc() to a
larger buffer on demand (errno == ERANGE) in case a solution like that
would become necessary. I still want to avoid that kind of nastiness.
- Tried to compile and run curl on Linux for alpha and FreeBSD for alpha. Went
as smooth as it could.
- Added a docs/examples directory with two tiny example sources that show how
to use libcurl. I hope users will supply me with more useful examples
further on.
- Applied a patch by Jörn Hartroth to no longer use the word 'interface' in the
config struct in the src/main.c file since certain compilers have that word
"reserved". I figure that is some kind of C++ disease.
- Updated the curl.1 man page with --interface and --krb4.
- Modified the base64Encode() function to work like the kerberos one, so that
I could remove the use of that. There is no need for *two* base64 encoding
functions! ;-)
Version 7.3pre5
Daniel (19 September 2000)
- The kerberos4-layer source code that is much "influenced" by the original
krb4 source code, through yafc into curl, was using quite a lot of global
variables. libcurl can't work properly with globals like that, which is why I
had to
clean up almost every function in the new security.c to make them use
connection specific variables instead of the globals. I just hope I didn't
destroy anything now... :-) configure updated, version string now reflects
krb4 built-in. It almost works now. Only uploads are still being naughty.
Version 7.3pre3
Daniel (18 September 2000)
- Martin Hedenfalk supplied a major patch that introduces krb4-ftp support to
curl. Martin is the primary author of the ftp client named yafc and he did
not hesitate to help us implement this when I asked him. Many and sincere
thanks to a splendid effort. It didn't even take many hours!
- Stephen Kick supplied a big patch that introduces the --interface flag to
the curl tool and CURLOPT_INTERFACE for libcurl. It allows you to specify an
outgoing interface to use for your request. This may not work on all
platforms. This needs testing.
- Richard Prescott noticed that curl on Tru64 unix could core dump if the
name didn't resolve properly. This was due to the GetHost() function not
returning an error even though it failed on some platforms!
Daniel (15 September 2000)
- Updated all sorts of documents in regards to the new proxytunnel support.
Version 7.3pre2
Daniel (15 September 2000)
- Kai-Uwe Rommel pointed out a problem in the httpproxytunnel stuff for ftp.
Adjusted it. Added better info message when setting up the tunnel and the
pasv message when doing the second connect.
Version 7.3pre1
Daniel (15 September 2000)
- libcurl now allows "httpproxytunnel" to an arbitrary host and port name. The
second connection on ftp needed that.
- TheArtOfHTTPScripting was corrected all over. I both type and spell really
bad at times!
Daniel (14 September 2000)
- -p/--proxytunnel was added to 'curl'. It uses the new
CURLOPT_HTTPPROXYTUNNEL libcurl option that allows "any" protocol to tunnel
through the specified http proxy. At the moment, this should work with ftp.
Daniel (13 September 2000)
- Jochen Schaeuble found that file:// didn't work as expected. Corrected this
and mailed the patch to the mailing list.
Daniel (7 September 2000)
- I changed the #define T() in curl.h since it turned out it wasn't really
a good symbol to use (when you compiled PHP with curl as a module, that
define collided with some IMAP define or something). This was posted to the
PHP bug tracker.
- I added extern "C" stuff in two header files to better allow libcurl usage
in C++ sources. Discussions on the libcurl list with Danny Horswell led to
this.
Version 7.2.1
Daniel (31 August 2000)
- Albert Chin-A-Young fixed the configure script *again* and now it seems to
detect Linux name resolving properly! (heard that before?)
- Troels Walsted Hansen pointed out that downloading a file containing the
letter '+' from an ftp server didn't work. It did work from HTTP though and
the reason was my lame URL decoder.
- I happened to notice that -I didn't at all work on ftp anymore. I corrected
that.
Version 7.2
Daniel (30 August 2000)
- Understanding AIX is a hard task. I believe I'll never figure out why they
solve things so differently from the other unixes. Now, I'm left with the
AIX 4.3 run-time warnings about duplicate symbols that according to this
article (http://www.geocrawler.com/archives/3/405/1999/9/0/2593428/) is a
libtool flaw. I tried the mentioned patch, although that stops the linking
completely.
So, if I select to ignore the ld warnings there are compiler warnings that
fill the screen pretty bad when curl compiles. It turns out that if I want
to '#include <arpa/inet.h>', I can get rid of the warnings by including the
following three include files before that one:
#include <net/if_dl.h>
#include <sys/mbuf.h>
#include <netinet/if_ether.h>
Now, is it really sane to add those include files before arpa/inet.h in all
the source files that include it?
Thanks to Albert Chin-A-Young at thewrittenword.com who gave me the AIX
login to try everything on.
Daniel (24 August 2000)
- Jan Schmidt supplied us a new VC6 makefile for Windows as the previous one
was not up to date but lacked several object files.
- More work on the naming.
- Albert Chin-A-Young provided a configure-check for large file support, as
some systems seem to need that for them to work. Had to change the position
for the config.h include file in every .c file in the libcurl dir...
- As suggested on the mailing list (by Troy Engel), I did use a --data-binary
option instead of the messy way I've left described below. It seems to
work. The libcurl fix remained the same as yesterday.
Daniel (23 August 2000)
- Back on the -d stripping newlines thing. The 'plain post' thing was added
when I had no thought of that one could actually post binary data with
it. Now, I have to add this functionality in a graceful manner and I think
I've managed to come up with a way: '-d @file;binary' will thus post the
file binary, exactly as its contents are. It is implemented with a new
*setopt() option (CURLOPT_POSTFIELDSIZE) to set the postfield size, since
libcurl can't strlen() the data in these cases.
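From a program, the new option is meant to be used along the lines of this
sketch (the URL and the data buffer are made-up illustrations):
  #include <curl/curl.h>

  int main(void)
  {
    /* pretend this buffer holds binary data with embedded newlines */
    static const char data[] = "field=value\r\nmore binary stuff";
    CURL *curl = curl_easy_init();

    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/accept-post");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, data);
    /* tell libcurl the exact size, since it can't strlen() binary data */
    curl_easy_setopt(curl, CURLOPT_POSTFIELDSIZE, (long)(sizeof(data) - 1));

    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }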
- Albert Chin-A-Young made some very serious efforts and all the name
resolving problems seem to have been sorted out now on all the platforms
that previously showed them. I'll make another release any day now because of
this.
- The FAQ was much enhanced when it comes to the licensing issues thanks to
Bjorn Reese.
Daniel (21 August 2000)
- Rick Welykochy pointed out a problem when you use -d to post and you want to
keep the newlines, as curl strips them off as a bonus before posting...
This needs to be addressed.
Version 7.1.1
Daniel (21 August 2000)
- Got more people involved in the gethostbyname_r() mess. Caolan McNamara sent
me configure-code that turned out to be very similar to my existing tests
which only make me more sure I'm on the right path. I changed the order of
the tests slightly, as it seems that some compilers don't yell error if a
function is used with too many parameters. Thus, the first tested function
will seem ok... Let's hope more compilers think of too-few parameters as bad
manners, as we're now trying the functions in that order; fewer first. I
should also add that Lars Hecking mailed me and volunteered to run tests on
a few odd systems. Caolan is keeping his work over at
http://www.csn.ul.ie/~caolan/publink/gethostbyname_r/. Might be handy in the
future as well.
Daniel (18 August 2000)
- I noticed I hadn't increased the name lookup buffer in lib/ftp.c. I don't
think this is the reason for the continued trouble though.
Daniel (17 August 2000)
- Fred Noz corrected my stupid mistakes in the gethostbyname_r() fluff. It
should affect some AIX, Digital Unix and HPUX 10 systems.
Daniel (15 August 2000)
- Mathieu Legare compiled and built 7.1 without errors on both AIX 4.2 as well
as AIX 4.3. Now why did problems occur before?
- Fred Noz reported a -w/--write-out bug that caused it to malfunction when
used combined with multiple URL retrievals. All but the first display got
screwed up!
Daniel (11 August 2000)
- Jason Priebe and an anonymous friend found some host names the Linux version
of curl could not resolve. It turned out the buffer used to retrieve that
information was too small. Fixed. One could argue about the usefulness of
not having the slightest trace of a man page for gethostbyname_r() on my
Linux Redhat installation...
Daniel (10 August 2000)
- Balaji S Rao was first in line to note the missing possibility to replace
the Content-Type: and Content-Length: headers when doing -d posts. I added
the possibility just now. It seems some people want to do standard posts
using custom Content-Types.
Daniel (8 August 2000)
- Mike Dowell correctly discovered that curl did not approve of URLs with no
user name but password. As in 'http://:foo@haxx.se'. I corrected this.
Version 7.1
Daniel (7 August 2000)
- My AIX 4 fix does not work. I need help from an AIX 4 hacker.
- I added my new document in the docs directory. It is aimed to become a sort
of tutorial on how to do HTTP scripting with curl.
Daniel (4 August 2000)
- Working with Rich Gray on compiling curl for lots of different platforms.
My fix for AIX 3.2 was not good enough and was slightly changed, I had to
move an include file before another, as is now described in the source.
AIX 4.2 (4.X?) has different gethostbyname_r() and gethostbyaddr_r()
functions that the configure script didn't check for and thus the compile
broke with an error. I have now changed the gethostbyname_r() check in the
configure file to support all three versions of both these functions. My
implementation that uses the AIX style is not yet verified though, and I may
have problems fixing it if it turns out to be buggy, since I don't have access
to any system using that.
For problems like that, I made the configure script allow --disable-thread
to completely switch off the check for threadsafe versions of a few
functions and thus go with the "good old versions" that tend to work
although will break thread-safeness for libcurl. Most people won't use
libcurl for other things than curl though, and curl doesn't need a
thread-safe lib.
- Working on my big tutorial about HTTP scripting with curl.
Daniel (1 August 2000)
- Rich Gray spotted a problem in src/setup.h caused by a #define strequal()
that was just a left-over from past times. The strequal() is now a true
function supplied by libcurl for a portable case insensitive string
comparison. I added the prototypes in include/curl.h and removed the
now obsolete #define.
- Igor Khristophorov made a fix to allow resumed download from Sun's
JavaWebServer/1.1.1. It seems that their server sends bad Content-Range
headers.
- The makefiles forced a static library build, which is bad since we now use
libtool and thus have excellent shared library support! Albert Chin-A-Young
found out.
Version 7.0.11beta
Daniel (1 August 2000)
- Albert Chin-A-Young pointed out that 'make install' did not properly create
the header include directory, which is why it failed to install the header
files as it should. Automake isn't really equipped to deal with subdirectories
without Makefiles in any nice way. I had to run ahead and add Makefiles in
both include and include/curl before I managed to create a top-level
makefile that succeeds in installing everything properly!
- Ok, no more "features" added now. Let's just verify that there's no major
flaws added now.
Daniel (31 July 2000)
- Both Jeff Schasny and Ketil Froyn asked me how to tell curl not to send one
of those internally generated headers. They didn't settle with the blank
ones you could tell curl to use. I rewrote the header-replace stuff a
little. Now, if you replace an internal header with your own and that new
one is a blank header you will only remove the internal one and not get any
blank. I couldn't figure out any case when you want that blank header.
Daniel (29 July 2000)
- It struck me that the lib used localtime() which is not thread-safe, so now
I use localtime_r() on the systems that have it.
- I went through this entire document and removed all email addresses and left
names only. I've really made an effort to always note who brought me bug
reports or fixes, but more and more people ask me to remove the email
addresses since they become victims of spam this way. Gordon Beaton got me
working on this.
Daniel (27 July 2000)
- Jörn Hartroth found out that when you specified a HTTP proxy in an
environment variable and used -L, curl failed in the second fetch. I
corrected this problem and posted a patch to the list. No need for an extra
beta release just for this.
Version 7.0.10beta
Daniel (27 July 2000)
- So, libtool replaced two of my files with symbolic links and I forgot to add
the two new libtool files to the release archive (and they were added as
symlinks as well!) This of course led to the configure script failing
on 7.0.9...
Version 7.0.9beta
Daniel (25 July 2000)
- Kristian Köhntopp <kris at koehntopp.de> brought a fix that makes libcurl
libtoolified, just as we've wanted for a while now. He also made the
recently added man pages get installed properly on 'make install' and some
other nice cleanups.
- In a discussion with Eetu Ojanen it struck me that if we use curl to get a
page using a password, and that page then sends a Location: to another
server that curl follows, curl will send the user name and password to that
server as well.
Now, I'll never be able to make curl do Location: following all that perfectly
and you're all sooner or later required to write a script to do several
fetches when you're doing advanced stuff, but now I've modified curl to at
least *only* send the user name and password to the original server. Which
means that if you get a page from server A with a password, that forwards curl
to server B, curl won't use the password there. If server B then forwards
curl back to server A again, the password will be used again.
This is not a perfect implementation, as in a browser case it would only use
the password if the left-prefix of the first path is the same. I just think
that this fix prevents a somewhat lurky "security hole".
As a side-note in this subject: HTTP passwords are sent in cleartext and
will never be considered to be safe or secure. Use HTTPS for that.
- As discussed on the mailing list, I converted the FTP response reading
function into using select() which then allows timeouts (even under win32!)
if the command-reply session gets too slow or dies completely. I made a
default timeout of 3600 seconds unless anything else is specified, since I
don't think anyone wants to wait more than that for a single character to
get received...
- Torsten Foertsch <torsten.foertsch at gmx.net> brought a set of fixes for
the rfc1867 form posts. He introduced 'name=<file' which brings a means to
supply very large text chunks read from the given file name. It differs from
'name=@file' in the way that this latter thing is marked in the uploaded
contents as a file upload, while the first is just text (as in an input or
textarea field). Torsten also corrected a bug that would happen if you used
%s or similar in a -F file name.
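For example, the two variants would be used something like this (the field
names, file names and URL below are made up):
  curl -F "text=<largechunk.txt" -F "upload=@photo.jpg" http://www.example.com/post.cgi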
- As discovered by Nico Baggus <Nico.Baggus at mail.ing.nl>, when transferring
files to/from FTP using type ASCII curl should not expect the transfer to be
the exact size reported by the server as the file size. Since ASCII may very
well mean that the content is translated while transferred, the final size
may very well differ. Therefore, curl now ignores the file size when doing
ASCII transfers in FTP.
Daniel (24 July 2000)
- Added CURLOPT_PROXYPORT to the curl_easy_setopt() call to allow the proxy
port number to be set separately from the proxy host name.
- Andrew <andrew at ugh.net.au> pointed out a netrc manual bug.
- The FTP transfer code now accepts a 250-code as well as the previously
accepted 226, after a successful file transfer. Mohan <mnair at
evergreen-funds.com> pointed this out.
- The check for *both* nsl and socket was never added in the v7 configure.in
when I moved the main branch. I re-added that check to configure.in. This was
discovered by Rich Gray.
- Howard, Blaise <Blaise.Howard at factiva.com> pointed out a missing free() in
curl_disconnect() which of course meant libcurl ate memory.
- Brian E. Gallew noted that the HTTP 'Host:' header curl sent did not
properly include the port number if non-default ports were used. This should
now have been fixed.
- HTTP connect errors now return errors earlier. This was most notably causing
problems when the HTTPS certificate had problems and later caused a crash.
Many thanks to Gregory Nicholls <gnicholls at level8.com> for discovering
and suggesting a fix...
Daniel (21 June 2000)
- After a "bug report" I received where the user was using both -F and -I in a
HTTP request (it severely confused the library, I should add), I added some
checks to src/main.c that prevents setting more than one HTTP request
command, no matter what the user wants! ;-)
Version 7.0.8beta
Daniel (20 June 2000)
- I did a major replace in many files to use the new curl domain haxx.se
instead of the previous one.
- As Eetu Ojanen suggested, I finally took the step and now libcurl no longer
makes a POST after it has followed a location. When the initial POST has
been done, it'll be turned into a GET for the further requests. This is only
interesting when using -L/--location *and* doing a POST at the same time.
While messing with this, I added another weird feature I call 'auto
referer'. If you append ';auto' to the right of a given referer string (or
only use that string as referer), libcurl will automatically set the
previous URL as referer when it follows a Location: and gets a succeeding
document.
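A command line such as this one (placeholder URL) should show the effect:
  curl -L -e ";auto" http://www.example.com/redirecting-page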
- My hero Rich Gray found the very obscure FTP bug that happened to him only
when passing through a particular firewall and using the PORT command. It
turned out that PORT was the only command in the lib/ftp.c source that
didn't send a proper \r\n sequence but instead used the faulty \n which as
it seemed is supported by most major ftp servers... :-O
Version 7.0.7beta
Daniel (16 June 2000)
- I had avoided this long enough now, so I moved the alternative progress bar
stuff from the lib and added it to the client code. This is now using the
recently added progress callback and it seems to work pretty much like
before. Since it is only one progress bar and you can download and upload at
the same time, this bar shows the combined progress of both directions. This
code was just ported from the old place to this, Lars is still our saviour!
;-) This also made the documentation more accurate since I never removed
this function from any docs! Although I now removed the CURLOPT_PROGRESSMODE
from the library since the lib has only one internal progress meter and it
will never get another. It is likely, though, that the internal one also
will be moved to the client code in the future (when I have other means of
getting the writeout data and move that too to the client).
- I took the opportunity to verify that standard progress meter works and I
found out it didn't get inited properly. Grrr. I corrected that as well.
Daniel (15 June 2000)
- I thought I'd better verify that the -F option still works in v7 and of
course it didn't... :-/ Anyway, I had the problems I could discover
corrected. About one month of beta testing and not a single person has used
this feature with v7?
- Björn correctly pointed out that the --progress-bar still doesn't work in
v7. Hm.
Daniel (14 June 2000)
- Tim Tassonis discovered that curl 7 didn't handle normal http POST as it
should. I corrected this.
Version 7.0.6beta
Daniel (14 June 2000)
- Björn Stenberg pointed out several problems (related to win32 compiling):
lib/strequal.c had a bad #ifdef for one of the string comparisons (win32)
src/main.c had several minor problems
lib/makefile.m32 had getpass.[co] twice
src/config-win32.h lacked the HAVE_FCNTL_H define
both config-win32.h files now only set the HAVE_UNISTD_H define if the
define MINGW32 is set, and I modified src/makefile.m32 and lib/makefile.m32
to set it.
Version 7.0.5beta
Daniel (14 June 2000)
- Applied Luong Dinh Dung's comments about a few win32 compile problems.
- Applied Björn Stenberg's suggested fix that turns the win32 stdout to
binary. It won't do it if the -B / --use-ascii option is used. That option
is now an extended version of the previous -B/--ftp-ascii. The flag was
already in use by the ldap code as well so the new name fits pretty well. The
libcurl CURLOPT_TRANSFERTEXT was also introduced as an alias to the now
obsolete CURLOPT_FTPASCII. Can't verify this fix myself as I have no win32
compiler around.
Daniel (13 June 2000)
- Luong Dinh Dung <dung at sch.bme.hu> found a problem in curl_easy_cleanup()
since it free()ed the main curl struct *twice*. This is now corrected.
Daniel (9 June 2000)
- Updated the RESOURCES file, added a README.win32 file.
Daniel (8 June 2000)
- So I finally added the progress callback to the *setopt() options and it
should work now. I don't have the energy to write any test program for it
right now.
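Still, a minimal use of it should look roughly like this sketch (I'm assuming
the CURLOPT_PROGRESSFUNCTION tag and the callback prototype from curl/curl.h;
the URL and the callback body are made-up illustrations):
  #include <stdio.h>
  #include <curl/curl.h>

  /* sketch of a progress callback; return non-zero to abort the transfer */
  static int my_progress(void *clientp, double dltotal, double dlnow,
                         double ultotal, double ulnow)
  {
    (void)clientp; (void)ultotal; (void)ulnow;
    fprintf(stderr, "%.0f of %.0f bytes\r", dlnow, dltotal);
    return 0;
  }

  int main(void)
  {
    CURL *curl = curl_easy_init();

    curl_easy_setopt(curl, CURLOPT_URL, "http://www.example.com/");
    curl_easy_setopt(curl, CURLOPT_NOPROGRESS, 0L); /* enable progress calls */
    curl_easy_setopt(curl, CURLOPT_PROGRESSFUNCTION, my_progress);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }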
- Made the callback function typedefs public in curl/curl.h for comfort. Just
in case anyone wanna fiddle with such pointers.
- Updated the curl_easy_setopt() man page accordingly.
Version 7.0.4beta
Daniel (2 June 2000)
- I noticed that when doing Location: following, we lost custom headers in all
but the first request.
- Removed the 'HttpPost' struct and moved the header stuff to the more generic
curl_slist.
- Added some better slist-cleanups in src/main.c
Version 7.0.3beta
Daniel (31 May 2000)
- So I discovered that I released the 7.0.2beta without it being able to
compile under Linux. gethostbyname_r() and gethostbyaddr_r() turned out to
feature a different number of arguments on different systems so I had to add
a configure check for this and adjust the code slightly.
Version 7.0.2beta
Daniel (29 May 2000)
- Corrected the bits.* assignments when using CURLOPT options that only
toggle one of those bits.
- Applied the huge patches from David LeBlanc <dleblanc at qnx.com> that add
usage of the gethostbyname_r() and similar functions in case they're around,
since that makes libcurl much more thread-safe on many systems (such as
solaris). I added the checks for these functions to the configure script.
I can't explain why, but the inet_ntoa_r() function did not appear in my
Solaris include files, I had to add my own include file for this for now.
Daniel (22 May 2000)
- Jörn Hartroth brought me fixes to make the win32 version compile properly as
well as a rename of the 'interface' field in the urldata struct, as it seems
to be reserved in some gcc versions!
- Rich Gray struck back with yet more portability reports. Data General DG/UX
needed a little fix in lib/ldap.c since it doesn't have RTLD_GLOBAL defined.
More fixes are expected as a result of Rich's very helpful work.
Version 7.0.1beta
Daniel (21 May 2000)
- Updated lots of #defines, enums and variable type names in the library. No
more weird URG or URLTAG prefixes. All types and names should be curl-
prefixed to avoid name space clashes. The FLAGS-parameter to the former
curl_urlget() has been converted into a bunch of flags to use in separate
setopt calls. I'm still focusing on the easy-interface, as the curl tool is
now using that.
- Bjorn Reese has provided me with an asynchronous name resolver that I plan
to use in upcoming versions of curl to be able to gracefully timeout name
lookups.
Version 7.0beta
Daniel (18 May 2000)
- Introduced LIBCURL_VERSION_NUM to the curl.h include file to better allow
source codes to be dependent on the lib version. This define is now set to
a hexadecimal number, with 8 bits each for major number, minor number and
patch number. In other words, version 1.2.3 would make it 0x010203. It also
makes a larger number a newer version.
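This allows compile-time checks along these lines (the version number used in
the comparison is just an example):
  #include <curl/curl.h>

  #if LIBCURL_VERSION_NUM >= 0x070001
    /* this code is only compiled against libcurl 7.0.1 or later */
  #endif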
Daniel (17 May 2000)
- Martin Kammerhofer correctly pointed out several flaws in the FTP range
option. I corrected them.
- Moved the win32 winsock init crap from the lib to the src/main.c file
in the application instead. They can't be in the lib, especially not for
multithreaded purposes.
Daniel (16 May 2000)
- Rewrote the src/main.c source to use the new easy-interface to libcurl 7.
There is still more work to do, but the first step is now taken.
<curl/easy.h> is the include file to use.
Daniel (14 May 2000)
- FTP URLs are now treated slightly different, more according to RFC 1738.
- FTP sessions are now performed differently, with CWD commands to change
directory instead of RETR/STOR/LIST with the full path. Discussions with
Rich Gray made me notice these problems.
- Janne Johansson discovered and corrected a buffer overflow in the
src/urlglob.c file.
- I had to add a lib/strequal.c file for doing case insensitive string
compares on all platforms.
Daniel (8 May 2000):
- Been working lots on the new lib.
- Together with Rich Gray, I've tried to adjust the configure script to work
better on the NCR MP-RAS Unix.
Daniel (2 May 2000):
- Albert Chin-A-Young pointed out that I had a few too many instructions in
configure.in that didn't do any good.
Daniel (24 April 2000):
- Added a new paragraph to the FAQ about what to do when configure can't
find OpenSSL even though it is installed. Supplied by Bob Allison
Daniel (12 April 2000):
- Started messing around big-time to convert the old library interface to a
better one...
Daniel (8 April 2000):
- Made the progress bar look better for file sizes between 9999 kilobytes
and 100 megabytes. They're now displayed XX.XM.
- I also noticed that ftp fetches through HTTP proxies didn't add the user
agent string. It does now.
- Habibie <habibie at MailandNews.com> supplied a pretty good way to build RPMs
on a Linux machine. It still a) requires me to be root to do it, b) leaves
the rpm packages lying at some odd place on my disk, c) doesn't work to
build the ssl version of curl since I didn't install openssl from an rpm
package so now the rpm crap thinks I don't have openssl and refuses to build
a package that depends on ssl... Did I mention I don't get along with RPM?
- Once again I received a bug report about autoconf not setting -L prior to -l
on the command line when checking for libs. In this case it made the native
cc compiler on Solaris 7 to fail the OpenSSL check. This has previously been
reported to cause problems on HP-UX and is a known flaw in autoconf 2.13. It
is a pity there's no newer release around...
Daniel (4 April 2000):
- Marco G. Salvagno supplied me with two fixes that apparently make the OS/2
port work better with multiple URLs.
Daniel (2 April 2000):
- Another Location: fix. This time, when curl connected to a port and then
followed a location with an absolute URL to another port, it misbehaved.
Daniel (27 March 2000):
- H. Daphne Luong pointed out that curl was wrongly
messing up the proxy string when fetching a document through a http proxy,
which screwed up multiple fetches such as in location: followings.
Daniel (23 March 2000):
- Marco G. Salvagno corrected my badly applied patch he
actually already told me about!
- H. Daphne Luong brought me a fix that now makes curl
ignore select() errors in the download if errno is EINTR, which turns out to
happen every now and then when using libcurl multi-threaded...
Daniel (22 March 2000):
- Wham Bang supplied a couple of win32 fixes. HAVE_UNAME
was accidentally #defined in config-win32.h, which it shouldn't have been.
The HAVE_UNISTD_H is not defined when compiling with the Makefile.vc6
makefile for MS VC++.
Daniel (21 March 2000):
- I removed the AC_PROG_INSTALL macro from configure.in, since it appears that
one of the AM_* macros searches for a BSD compatible install already. Janne
Johansson made me aware of this.
Version 6.5.2
Daniel (21 March 2000):
- Paul Harrington quickly pointed out to me that 6.5.1
crashes hard. I upload 6.5.2 now as quickly as possible! The problem was
the -D adjustments in src/main.c.
Version 6.5.1
Daniel (20 March 2000):
- An anonymous post on sourceforge correctly pointed out a possible buffer
overflow in the curl_unescape() function for URL conversions. The main
problem with this bug is that the ftp download uses that function and this
single-byte overflow could lead to very odd bugs (as one reported by Janne
Johansson).
Daniel (19 March 2000):
- Marco G. Salvagno supplied me with a series of patches
that now allows curl to get compiled on OS/2. It even includes a section in
the INSTALL file. Very nice job!
Daniel (17 March 2000):
- Wham Bang supplied a patch for the lib/Makefile.vc6
file. We still need some fixes for the config-win32.h since it appears that
VC++ and mingw32 have different opinions about (at least) unistd.h's
existence.
Daniel (15 March 2000):
- I modified the -D/--dump-header workings so that it doesn't write anything
to the file until it needs to. This way, you can actually use -b and -D
on the same file if you want repeated invokes to store and read the cookies
in that one single file.
- Poked around in lots of texts. Added the BUGS file for bug reporting stuff.
Added the classic HTTP POST question to the FAQ, removed some #ifdef WIN32
stuff from the sources (they're covered by the config-win32.h now).
- Pascal Gaudette fixed a missing ldap.c problem in the
Makefile.vc6 file. He also addressed a problem in src/config-win32.h.
Daniel (14 March 2000):
- Paul Harrington pointed out that the 'http_code' variable in the -w output
was never written. I fixed it now.
- Janne Johansson reported the complaints that OpenBSD does
when getdate.c #includes malloc.h. It claims stdlib.h should be included
instead. I added #ifdef HAVE_MALLOC_H code in getdate.y and two checks in
the configure.in for malloc.h and stdlib.h.
Version 6.5
Daniel (13 March 2000):
- <curl at spam.wolvesbane.net> pointed out that the way curl sent cookies in a
single line wasn't enjoyed by IIS4.0 servers. In my view, that is not what
the standards say, but I added a white space between the name/value pairs to
perhaps make them work better.
- Added the perl check back in the configure.in again since the mkhelp.pl
script needs it!
- Made some beautifications in the curl man page.
Daniel (3 March 2000):
- Jörn helped me update the config-win32.h files with HAVE_SETVBUF and
HAVE_STRDUP.
Daniel (3 March 2000):
- Uploaded the 6.5pre2 package.
Daniel (2 March 2000):
- Removed the perl-programs from the distribution, they never made many people
happy and I'll still keep them available on the web.
- Added the -w and -N stuff to the man page. Documented the new progress meter
display in README.curl.
- Jörn Hartroth, Chris <cbayliss at csc.come> and Ulf
Möller from the openssl development team helped bring me the details for
fixing an OpenSSL usage flaw. It became apparent when they released openssl
0.9.5 since that barfed on curl's bad behavior (not seeding a random number
thing).
- Yet another option: -N/--no-buffer disables buffering in the output stream.
Probably most useful for very slow transfers when you really want to get
every byte curl receives within some preferred time. Andrew <tmr at gci.net>
suggested this.
- Damien Adant mailed me his fixes for making curl compile on Ultrix.
Daniel (24 February 2000):
- Applied Jörn Hartroth's fixes for config-win32.h and lib/Makefile.w32.
I should also make a note here, if nothing else to myself, that when using
the %-syntax for variables in DOS command prompts, you must use two %-
letters for each one since that is an escape letter there! Maybe I should
use another letter instead!
- Added more variables to -w:
'http_code'
'time_namelookup'
'time_connect'
'time_pretransfer'
'url_effective'
- Made -w@filename read the syntax from a file and -w@- reads the syntax from
stdin in the good old "standard" curl way.
Daniel (22 February 2000):
- Released a 6.5pre1 version to get some test and user feedback.
Daniel (21 February 2000):
- I added the -w/--write-out flag and some variables to go with it. -w is a
single string, whatever you enter there will be written out when curl has
completed a successful request. There are some variable substitutions and
they are specified as '%{variable}' (without the quotes). Variables that
exist as of this moment are:
total_time - total transfer time in seconds (with 2 decimals)
size_download - total downloaded amount of bytes
size_upload - total uploaded amount of bytes
speed_download - the average speed of the entire download
speed_upload - the average speed of the entire upload
I will of course add more variables, but I need input on these and others.
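As an example of the intended use (the URL and output file name below are only
placeholders):
  curl -o saved.html -w "got %{size_download} bytes in %{total_time} seconds\n" http://www.example.com/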
- It struck me that the -# progress bar will be hard to just apply on the new
progress bar concept. I need some feedback on this before that'll get re-
introduced! :-/
Daniel (16 February 2000):
- J<>rn Hartroth brought me some fixes for the progress meter and I continued
working on it. It seems to work for http download, http post, ftp download
and ftp upload. That should be a pretty good indication that it works
generally well.
- Still need to add the -# progress bar into the new style progress interface.
- Gonna have a go at my new output option parameter next.
Daniel (15 February 2000):
- The progress meter stuff is slowly taking place. There's more left before it
is working ok and everything is tested, but we're reaching there. Slowly!
Daniel (11 February 2000):
- Paul Marquis fixed the config file parsing of curl to
deal with any-length lines, removing the previous limit of 4K.
- Eetu Ojanen's suggestion of supporting the @-style for -b
is implemented. Now -b@<filename> works as well as the old style. -b@- also
similarly reads the cookies from stdin.
- Reminder: -D should not write to the file until it needs to, in the same way
-o does. That would enable curl to use -b and -D on the same file...
- Ellis Pritchard made getdate.y work for MacOS X.
- Paul Harrington helped me out finding the crash in the
cookie parser. He also pointed out curl's habit of sending empty cookies to
the server.
Daniel (8 February 2000):
- Ron Zapp corrected a problem in src/urlglob.c that
prevented curl from getting compiled on sunos 4. The problem had to do
with the difference in sprintf() return code types.
- Transfer() should now be able to download and upload simultaneously. Let's
do some progress meter fixes later this week.
Daniel (31 January 2000):
- Paul Harrington found another core dump in the cookie
parser. Curl doesn't properly recognize the 'version' keyword and I think
that is what caused this. I need to refresh some specs on cookies and see
what else curl lacks to improve this a bit more once and for all.
RFC 2109 clearly specifies how cookies should be dealt with when they are
compliant with that spec. I don't think many servers are though...
- Mark W. Eichin found that while curl is uploading a form
to a web site, it doesn't read incoming data, which is why it'll hang after a
while
since the socket "pipe" becomes full.
It took me two hours to rewrite Download() and Upload() into the new
single function Transfer(). It even seems to work! More testing is required
of course... I should get the header-sending together in a kind of queue
and let them get "uploaded" in Transfer() as well.
- Zhibiao Wu pointed out a curl bug in the location: area,
although I did not get a reproducible way to do this, so I have to wait
with fixing anything.
- Bob Schader suggested I should implement resume
2000-01-31 17:21:55 -05:00
support for the HTTP PUT operation, and as I think it is a valid suggestion
I'll work on it.
Daniel (25 January 2000):
- M Travis Obenhaus pointed out a manual mixup with -y and -Y that was
corrected.
- Jens Schleusener pointed out a problem compiling
curl on AIX 4.1.4 and gave me a solution. This problem was already fixed
by Jörn's recent #include modifications!
Daniel (19 January 2000):
- Oskar Liljeblad pointed out and corrected a problem
in the Location: following system that made curl fail when following a
location: to a different protocol.
At January 31st I re-considered this fix and the surrounding source code. I
could not really see that the patch made any difference, so I removed it
again for further research and debugging. (It disabled location: following
on servers not running on default ports.)
- Jörn Hartroth brought a fix that once again
made it possible to select the progress bar.
- Jörn also fixed a few include problems.
Version 6.4
Daniel (17 January 2000):
- Based on suggestions from Björn Stenberg, I made the
progress meter deal better with larger files and added a "Time" field which
shows the time spent on the download so far.
- I'm now using the CVS repository on sourceforge.net, which also allows web
browsing. See http://curl.haxx.nu.
Daniel (10 January 2000):
- Renumbered some enums in curl/curl.h since tag number 35 was used twice!
- Added "postquote" support to the ftp section that enables post-ftp-transfer
quote commands.
- Now made the -Q/--quote parameter recognize '-' as a prefix, which means
that command will be issued AFTER a successful ftp transfer. This can of
course be used to delete or rename a file after it has been uploaded or
downloaded. Use your imagination! ;-)
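For illustration (the host and file names are just placeholders), a
post-transfer quote command could remove the remote file once it has been
downloaded:
curl -O ftp://ftp.site.com/file.txt -Q '-DELE file.txt'
The leading '-' on the quoted command is what defers it until after the
transfer.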
- Since I do the main development on solaris 2.6 now, I had to download and
install GNU groff to generate the hugehelp.c file. The solaris nroff cores
on the man page! So, in order to make the solaris configure script find a
better result I made gnroff get checked prior to the regular nroff.
- Added all the curl exit codes to the man page.
- Jim Gallagher properly tracked down a bug in autoconf
2.13. The AC_CHECK_LIB() macro wrongfully uses the -l flag before the -L
flag to 'ld' which causes the HP-UX 10.20 flavour to fail on all libchecks
and therefore you can't make the configure script find the openssl libs!
Daniel (28 December 1999):
- Tim Verhoeven correctly identified that curl
doesn't support URL formatted file names when getting ftp. Now, there's a
problem with getting very weird file names off FTP servers. RFC 959 defines
that the file name syntax to use should be the same as in the native OS of
the server. Since we don't know the peer server system we currently just
translate the URL syntax into plain letters. It is still better and with
the solaris 2.6-supplied ftp server it works with spaces in the file names.
Daniel (27 December 1999):
- When curl parsed cookies straight off a remote site, it corrupted the input
data, which, if the downloaded headers were stored, left very odd characters
in the saved data. Correctly identified and reported by Paul Harrington.
Daniel (13 December 1999):
- General cleanups in the library interface. There had been some bad kludges
added during times of stress and I did my best to clean them off. It was
both regarding the lib API as well as include file confusions.
Daniel (3 December 1999):
- A small --stderr bug was reported by Eetu Ojanen...
- who also brought the suggestion of extending the -X flag to ftp list as
well. So, now it is and the long option is now --request instead. It is
only for ftp list for now (and the former http stuff too of course).
Lars J. Aas (24 November 1999):
- Patched curl to compile and build under BeOS. Doesn't work yet though!
- Corrected the Makefile.am files to allow putting object files in
different directories than the sources.
Version 6.3.1
Daniel (23 November 1999):
- I've had this major disk crash. My good old trust-worthy source disk died
along with the machine that hosted it. Thank goodness most of all the
things I've done are either backed up elsewhere or stored in this CVS
server!
- Michael S. Steuer pointed out a bug in the -F handling
that made curl hang if you posted an empty variable such as '-F name='. It
was one of those old bugs that never worked properly...
- Jason Baietto pointed out a general flaw in the HTTP
download. Curl didn't complain if it was prematurely aborted before the
entire download was completed. It does now.
Daniel (19 November 1999):
- Chris Maltby very accurately criticized the lack of
return code checks on the fwrite() calls. I did a thorough check for all
occurrences and corrected this.
Daniel (17 November 1999):
- Paul Harrington pointed out that the -m/--max-time option
doesn't work for the slow system calls like gethostbyname()... I don't have
any good fix yet, just a slightly less bad one that makes curl exit hard
when the timeout is reached.
- Bjorn Reese helped me point out a possible problem that might be the reason
why Thomas Hurst experiences problems in his Amiga version.
Daniel (12 November 1999):
- I found a crash in the new cookie file parser. It crashed when you gave
a plain http header file as input...
Version 6.3
Daniel (10 November 1999):
- I kind of found out that the HTTP time-conditional GETs (-z) aren't always
respected by the web server and the document is therefore sent in whole
again, even though it doesn't match the requested condition. After reading
section 13.3.4 of RFC 2616, I think I'm doing the right thing now when I do
my own check as well. If curl thinks the condition isn't met, the transfer
is aborted prematurely (after all the headers have been received).
- After comments from Robert Linden I also rewrote some parts of the man page
to better describe how the -F works.
- Michael Anti put up a new curl download mirror in
China: http://www.pshowing.com/curl/
- I added the list of download mirrors to the README file
- I did add more explanations to the man page
Daniel (8 November 1999):
- I made the -b/--cookie option capable of reading netscape formatted cookie
files as well as normal http-header files. It should be able to
transparently figure out what kind of file it got as input.
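For example (placeholder file and host names), both a header dump saved from
an earlier fetch and a netscape cookie file should now work with the same
option:
curl -b headers.txt http://www.site.com/
curl -b cookies.txt http://www.site.com/
curl figures out which format it is reading on its own.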
Daniel (29 October 1999):
- Another one of Sebastiaan van Erk's ideas (that has been requested before
but I seem to have forgotten who it was), is to add support for ranges in
FTP downloads. As usual, one request is just a request, when they're two
it is a demand. I've added simple support for X-Y style fetches. X has to
be the lower number, though you may omit one of the numbers. Use the -r/
--range switch (previously HTTP-only).
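A hypothetical example (placeholder host): fetch only the first kilobyte of a
remote file, or everything from a given offset by leaving out one of the
numbers:
curl -r 0-1023 ftp://ftp.site.com/file.bin
curl -r 1024- ftp://ftp.site.com/file.bin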
- Sebastiaan van Erk suggested that curl should be
able to show the file size of a specified file. I think this is a splendid
idea and the -I flag is now working for FTP. It displays the file size in
this manner:
Content-Length: XXXX
As it resembles normal headers, it leaves us the opportunity to add more
info in that display if we can come up with more in the future! It also
makes sense since if you access ftp through a HTTP proxy, you'd get the
file size the same way. (See the example below.)
I changed the order of the QUOTE command executions. They're now executed
just after the login and before any other command. I made this to enable
quote commands to run before the -I stuff is done too.
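A hypothetical -I probe of a remote file (placeholder names):
curl -I ftp://ftp.site.com/archive.tar.gz
should print a Content-Length: line holding the file's size, in the
header-like format described above.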
- I found out that -D/--dump-header and -V/--version weren't documented in
the man page.
- Many HTTP/1.1 servers do not support ranges. Don't ask me why. I did add
some text about this in the man page for the range option. The thread in
the mailing list that started this was initiated by Michael Anti.
- I get reports about nroff crashes on solaris 2.6+ when displaying the curl
man page. Switch to gnroff instead, it is reported to work(!). Adam Barclay
reported and brought the suggestion.
- In a dialogue with Johannes G. Kristinsson we came
up with the idea to let -H/--header specified headers replace the
internally generated headers, if you happened to select to add a header
that curl normally uses by itself. The advantage with this is not entirely
obvious, but in Johannes' case it means that he can use another Host: than
the one curl would set.
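For example (placeholder names), to override the Host: header curl would
normally generate from the URL:
curl -H "Host: virtual.site.com" http://10.0.0.1/
curl then sends the given Host: line instead of its own.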
Daniel (27 October 1999):
- Jongki Suwandi brought a nice patch for (yet another) crash when following
a location:. This time you had to follow a https:// server's redirect to
get the core.
Version 6.2
Daniel (21 October 1999):
- I think I managed to remove the suspicious (nil) that has been seen just
before the "Host:" in HTTP requests when -v was used.
- I found out that if you followed a location: when using a proxy, without
having specified http:// in the URL, the protocol part was added once again
when moving to the next URL! (The protocol part has to be added to the
URL when going through a proxy since it has no protocol-guessing system
such as curl has.)
- Benjamin Ritcey reported a core dump under solaris 2.6
with OpenSSL 0.9.4. It turned out this was due to a bad free() in main.c
that occurred after the download was done and completed.
- Benjamin found ftp downloads to show the first line of the download meter
to get written twice, and I removed that problem. It was introduced with
the multiple URL support.
- Dan Zitter correctly pointed out that curl 6.1 and earlier versions didn't
honor RFC 2616 chapter 4 section 2, "Message Headers": "...Field names are
case-insensitive..." HTTP header parsing assumed a certain casing. Dan
also provided me with a patch that corrected this, which I took the liberty
of editing slightly.
- Dan Zitter also provided a nice patch for config.guess to better recognize
Mac OS X.
- Dan also corrected a minor problem in the lib/Makefile that caused linking
to fail on OS X.
Daniel (19 October 1999):
- Len Marinaccio came up with some problems with curl. Since Windows has a
crippled shell, it can't redirect stderr and that causes trouble. I added
--stderr today which allows the user to redirect the stderr stream to a
file or stdout.
Daniel (18 October 1999):
- The configure script now understands the '--without-ssl' flag, which
totally disables SSL/https support. Previously it wasn't possible to force
the configure script to leave SSL alone. The previous functionality has
been retained. Troy Engel helped test this new one.
Version 6.1
Daniel (17 October 1999):
- I ifdef'ed or commented all the zlib stuff in the sources and configure
script. It turned out we needed to mock more with zlib than I initially
thought, to make it capable of downloading compressed HTTP documents and
uncompress them on the fly. I didn't mean the zlib parts of curl to become
more than minor so this means I halt the zlib expedition for now and wait
until someone either writes the code or zlib gets updated and better
adjusted for this kind of usage. I won't get into details here, but a
short summary is suitable:
- zlib can't automatically detect whether to use zlib or gzip
decompression methods.
- zlib is very neat for reading gzipped files from a file descriptor,
although not as nice for reading buffer-based data such as we would
want it.
- there are still some problems with the win32 version when reading from
a file descriptor if that is a socket
Daniel (14 October 1999):
- Moved the (external) include files for libcurl into a subdirectory named
curl and adjusted all #include lines to use <curl/XXXX> to maintain a
better name space and control of the headers. This has been requested.
Daniel (12 October 1999):
- I modified the 'maketgz' script to perform a 'make' too before a release
archive is put together in an attempt to make the time stamps better and
hopefully avoid the double configure-running that use to occur.
Daniel (11 October 1999):
- Applied Jörn's patches that fix zlib for mingw32 compiles as well as
some other missing zlib #ifdefs and more text on the multiple URL docs in
the man page.
Version 6.1beta
Daniel (6 October 1999):
- Douglas E. Wegscheid sent me a patch that did the exact same thing I had
just made: the -d switch is now capable of reading post data from a named
file or stdin. Use it similarly to the -F. To read the post data from a
given file:
curl -d @path/to/filename www.postsite.com
or let curl read it out from stdin:
curl -d @- www.postit.com
Jörn Hartroth (3 October 1999):
- Brought some more patches for multiple URL functionality. The MIME
separation ideas are almost scrapped now, and a custom separator is being
used instead. This is still compile-time "flagged".
Daniel
- Updated curl.1 with multiple URL info.
Daniel (30 September 1999):
- Felix von Leitner brought openssl-check fixes for configure.in to work
out-of-the-box when the openssl files are installed in the system default
dirs.
Daniel (28 September 1999)
- Added libz functionality. This should enable decompressing gzip, compress
or deflate encoded HTTP documents. It also makes curl send a header saying
that it accepts that kind of encoding. Compressed contents usually shorten
the download time. I *need* someone to tell me a site that uses compressed
HTTP documents so that I can test this out properly.
- As a result of the adding of zlib awareness, I changed the version string
a little. I plan to add openldap version reporting in there too.
Daniel (17 September 1999)
- Made the -F option allow stdin when specifying files. By using '-' instead
of file name, the data will be read from stdin.
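Presumably (going by the @file syntax used for form files; the names here are
placeholders) that means something like:
cat photo.jpg | curl -F "upload=@-" http://www.site.com/submit.cgi
to have the posted file contents read from stdin.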
Version 6.0
Daniel (13 September 1999)
- Added -X/--http-request <request> to enable any HTTP command to be sent.
Do note that your server has to support the exact string you enter. This
should possibly be a string like DELETE or TRACE.
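For instance (placeholder URL), to send a TRACE request instead of a GET:
curl -X TRACE http://www.site.com/
Whether it succeeds naturally depends on the server supporting that method.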
- Applied Douglas' mingw32-fixes for the makefiles.
Daniel (10 September 1999)
- Douglas E. Wegscheid pointed out a problem. Curl didn't check the FTP
server's return code properly after the --quote commands were issued. It
took anything non-200 as an error, when all 2XX codes should be accepted as
OK.
- Sending cookies to the same site in multiple lines like curl used to do
turned out to be bad and breaking the cookie specs. Curl now sends all
cookies on a single Cookie: line. Curl is not yet RFC 2109 compliant, but I
doubt that many servers do use that syntax (yet).
Daniel (8 September 1999)
- Jörn helped me make sure it still compiles nicely with mingw32 under win32.
Daniel (7 September 1999)
- FTP upload through proxy is now turned into a HTTP PUT. Requested by
Stefan Kanthak.
- Added the ldap files to the .m32 makefile.
Daniel (3 September 1999)
- Made cookie matching work while using HTTP proxy.
Bjorn Reese (31 August 1999)
- Passed his ldap:// patch. Note that this requires the openldap shared
library to be installed and that LD_LIBRARY_PATH points to the
directory where the lib will be found when curl is run with a
ldap:// URL.
Jörn Hartroth (31 August 1999)
- Made the Mingw32 makefiles into single files.
- Made file:// work for Win32. The same code is now used for unix as well for
performance reasons.
Douglas E. Wegscheid (30 August 1999)
- Patched the Mingw32 makefiles for SSL builds.
Matthew Clarke (30 August 1999)
- Made a cool patch for configure.in to allow --with-ssl to specify the
root dir of the openssl installation, as in
./configure --with-ssl=/usr/ssl_here
- Corrected the 'reconf' script to work better with some shells.
Jörn Hartroth (26 August 1999)
- Fixed the Mingw32 makefiles in lib/ and corrected the file.c for win32
compiles.
Version 5.11
Daniel (25 August 1999)
- John Weismiller pointed out a bug in the header-line
realloc() system in download.c.
- I added lib/file.[ch] to offer a first, simple, file:// support. It
probably won't do much good on win32 system at this point, but I see it
as a start.
- Made the release archives get a Makefile in the root dir, which can be
used to start the compiling/building process easier. I haven't really
changed any INSTALL text yet, I wanted to get some feed-back on this
first.
Daniel (17 August 1999)
- Another Location: bug. Curl didn't do proper relative locations if the
original URL had cgi-parameters that contained a slash. Nusu's page
again.
- Corrected the NO_PROXY usage. It is a list of substrings; if one of them
matches the tail of the host name curl is about to connect to, curl should
not use a proxy for that connection. Pointed out to me by Douglas
E. Wegscheid. I also changed the README text a little regarding this.
Daniel (16 August 1999)
- Fixed a memory bug with http-servers that sent Location: to a Location:
page. Nusu's page showed this too.
- Made cookies work a lot better. Setting the same cookie name several times
used to add more cookies instead of replacing the former one which it
should've. Nusu <nus at intergorj.ro> brought me an URL that made this
painfully visible...
Troy (15 August 1999)
- Brought new .spec files as well as a patch for configure.in that lets the
configure script find the openssl files better, even when the include
files are in /usr/include/openssl
Version 5.10
Daniel (13 August 1999)
- SSL_CTX_set_default_passwd_cb() has been modified in the 0.9.4 version of
OpenSSL. Now why couldn't they simply add a *new* function instead of
modifying the parameters of an already existing function? This way, we get
a compiler warning if compiling with 0.9.4 but not with earlier. So, I had
to come up with a #if construction that deals with this...
- Made curl output the SSL version number properly with 0.9.4.
Troy (12 August 1999)
- Added MingW32 (GCC-2.95) support under Win32. The INSTALL file was also
a bit rearranged.
Daniel (12 August 1999)
- I had to copy a good <arpa/telnet.h> include file into the curl source
tree to enable the silly win32 systems to compile. The distribution rights
allow us to do that as long as the file remains unmodified.
- I corrected a few minor things that made the compiler complain when
-Wall -pedantic was used.
- I'm moving the official curl web page to http://curl.haxx.nu. I think it
will make it easier to remember as it is a lot shorter and less cryptic.
The old one still works and shows the same info.
Daniel (11 August 1999)
- Albert Chin-A-Young mailed me another correction for NROFF in the
configure.in that is supposed to be better for IRIX users.
Daniel (10 August 1999)
- Albert Chin-A-Young helped me with some stupid Makefile things, as well as
some fiddling with the getdate.c stuff that he had problems with under
HP-UX v10. getdate.y will now be compiled into getdate.c if the appropriate
yacc or bison is found by the configure script. Since this is slightly new,
we need to test the output getdate.c with win32 systems to make sure it
still compiles there.
Daniel (5 August 1999)
- I've just setup a new mailing list with the intention to keep discussions
around libcurl development in it. I mainly expect it to be for thoughts and
brainstorming around a "next generation" library, rather than nitpicking
about the current implementation or details in the current libcurl.
To join our happy bunch of future-looking geeks, enter 'subscribe
<address>' in the body of a mail and send it to
libcurl-request@listserv.fts.frontec.se. Curl bug reports, the usual curl
talk and everything else should still be kept in this mailing list. I've
started to archive this mailing list and have put the libcurl web page at
www.fts.frontec.se/~dast/libcurl/.
- Stefan Kanthak contacted me regarding a few problems in the configure
script which he discovered when trying to make curl compile and build under
Siemens SINIX-Z V5.42B2004!
2000-07-29 18:21:10 -04:00
- Marcus Klein very accurately informed me that src/version.h was not present
in the CVS repository. Oh, how silly...
1999-12-29 09:20:26 -05:00
2000-07-29 18:21:10 -04:00
- Linus Nielsen rewrote the telnet:// part and now curl offers limited telnet
support. If you run curl like 'curl telnet://host' you'll get all output on
the screen and curl will read input from stdin. You'll be able to login and
run commands etc, but since the output is buffered, expect to get a little
weird output.
1999-12-29 09:20:26 -05:00
This is still in its infancy and it might get changed. We need your
feed-back and input in how this is best done.
WIN32 NOTE: I bet we'll get problems when trying to compile the current
lib/telnet.c on win32, but I think we can sort them out in time.
2000-07-29 18:21:10 -04:00
- David Sanderson reported that FORCE_ALLOCA_H or HAVE_ALLOCA_H must be
defined for getdate.c to compile properly on HP-UX 11.0. I updated the
configure script to check for alloca.h which should make it.
1999-12-29 09:20:26 -05:00
Daniel (4 August 1999)
- I finally got to understand Marcus Klein's ftp download resume problem,
which turns out to be due to different outputs from different ftp
servers. It makes ftp download resuming a little trickier, but I've made
some modifications I really believe will work for most ftp servers and I do
hope you report if you have problems with this!
- Added text about file transfer resuming to README.curl.
Daniel (2 August 1999)
- Applied a progress-bar patch from Lars J. Aas. It offers
a new styled progress bar enabled with -#/--progress-bar.
T. Yamada <tai at imasy.or.jp> (30 July 1999)
- It breaks with segfault when 1) curl is using .netrc to obtain
username/password (option '-n'), and 2) is automatically redirected to
another location (option '-L').
There is a small bug in lib/url.c (block starting from line 641), which
tries to take out username/password from user-supplied command-line
argument ('-u' option). This block is never executed on first attempt since
CONF_USERPWD bit isn't set at first, but curl later turns it on when it
checks for CONF_NETRC bit. So when curl tries to redo everything due to
redirection, it segfaults trying to access *data->userpwd.
Version 5.9.1
Daniel (30 July 1999)
- Steve Walch pointed out that there is a memory leak in the formdata
functions. I added a FormFree() function that is now used and supposed to
correct this flaw.
- Mark Wotton reported:
'curl -L https://www.cwa.com.au/' core dumps. I managed to cure this by
correcting the cleanup procedure. The bug seems to be gone with my OpenSSL
0.9.2b, although still occurs when I run the ~100 years old SSLeay 0.8.0. I
don't know whether it is curl or SSLeay that is to blame for that.
- Marcus Klein:
Reported an FTP upload resume bug that I really can't repeat nor understand.
I leave it here so that it won't be forgotten.
Daniel (29 July 1999)
- Costya Shulyupin suggested support for longer URLs when following Location:
and I could only agree and fix it!
- Leigh Purdie found a problem in the upload/POST department. It turned out
that http.c accidentally cleared the pointer instead of the byte counter
when it was supposed to.
- Costya Shulyupin pointed out a problem with port numbers and Location:. If
you had a server at a non-standard port that redirected to an URL using a
standard port number, curl still used that first port number.
- Ralph Beckmann pointed out a problem when using both CONF_FOLLOWLOCATION
and CONF_FAILONERROR simultaneously. Since the CONF_FAILONERROR exits on
the 302-code that the follow location header outputs it will never show any
html on location: pages. I have now made it look for >=400 codes if
CONF_FOLLOWLOCATION is set.
- 'struct slist' is now renamed to 'struct curl_slist' (as suggested by Ralph
Beckmann).
- Joshua Swink and Rick Welykochy were the first to point out to me that the
latest OpenSSL package now has moved the standard include path. It is now
in /usr/local/ssl/include/openssl and I have now modified the --enable-ssl
option for the configure script to use that as the primary path, and I
leave the former path too to work with older packages of OpenSSL too.
Daniel (9 June 1999)
- I finally understood the IRIX problem and now it seems to compile on it!
I am gonna remove those #define strcasecmp() things once and for all now.
Daniel (4 June 1999)
- I adjusted the FTP reply 227 parser to make the PASV command work better
with more ftp servers. Apparently the Roxen Challenger server replied
something curl 5.9 couldn't deal with! :-( Reported by Ashley Reid-Montanaro
and Mark Butler brought a solution for it.
Daniel (26 May 1999)
- Rearranged. README is new, the old one is now README.curl and I added a
README.libcurl with text I got from Ralph Beckmann.
- I also updated the INSTALL text.
Daniel (25 May 1999)
- David Jonathan Lowsky correctly pointed out that curl didn't properly deal
with form posting where the variable shouldn't have any content, as in curl
-F "form=" www.site.com. It was now fixed.
Version 5.9
Daniel (22 May 1999)
- I've got a bug report from Aaron Scarisbrick in which he states he has some
problems with -L under FreeBSD 3.0. I have previously got another bug
report from Stefan Grether which points at an error with similar symptoms
when using win32. I made the allocation of the new url string a bit faster
and different, don't know if it actually improves anything though...
Daniel (20 May 1999)
- Made the cookie parser deal with CRLF newlines too.
Daniel (19 May 1999)
- Download() didn't properly deal with failing return codes from the sread()
function. Adam Coyne found the problem in the win32 version, and Troy Engel
helped me out isolating it.
Daniel (16 May 1999)
- Richard Adams pointed out a bug I introduced in 5.8. --dump-header doesn't
work anymore! :-/ I fixed it now.
- After a suggestion by Joshua Swink I added -S / --show-error to force curl
to display the error message in case of an error, even if -s/--silent was
used.
Daniel (10 May 1999)
- I moved the stuff concerning HTTP, DICT and TELNET into their own source
files now. It is a beginning on my clean-up of the sources to make them
layer all those protocols better to enable more to be added easier in the
future!
- Leon Breedt sent me some files I've not put into the main curl
archive. They're for creating the Debian package thingie. He also sent me a
debian package that I've made available for download at the web page.
Daniel (9 May 1999)
- Made it compile on cygwin too.
Troy Engel (7 May 1999)
- Brought a series of patches to allow curl to compile smoothly on MSVC++ 6
again!
Daniel (6 May 1999)
- I changed the #ifdef HAVE_STRFTIME placement for the -z code so that it
will be easier to discover systems that don't have that function and thus
can't use -z successfully. Made the strftime() get used if WIN32 is defined
too.
Version 5.8
Daniel (5 May 1999)
- I've had it with this autoconf/automake mess. It seems to work allright
for most people who don't have automake installed, but for those who have
there are problems all over.
I've got like five different bug reports on this only the last
week... Claudio Neves and Federico Bianchi and root <duggerj001 at
hawaii.rr.com> are some of them reporting this.
Currently, I have no really good fix since I want to use automake myself to
generate the Makefile.in files. I've found out that the @SHELL@-problems
can often be fixed by manually invoking 'automake' in the archive root
before you run ./configure... I've hacked my maketgz script now to fiddle
a bit with this and my tests seem to work better than before at least!
Daniel (4 May 1999)
- mkhelp.pl has been doing badly lately. I corrected a case problem in
the regexes.
- I've now remade the -o option to not touch the file unless it needs to.
I had to do this to make -z option really fine, since now you can make a
curl fetch and use a local copy's time when downloading to that file, as
in:
curl -z dump -o dump remote.site.com/file.html
This will only get the file if the remote one is newer than the local.
I'm aware that this alters previous behaviour a little. Some scripts out
there may depend on that the file is always touched...
- Corrected a bug in the SSLv2/v3 selection.
- Felix von Leitner requested that curl should be able to send
"If-Modified-Since" headers, which indeed is a fair idea. I implemented it
right away! Try -z <expression> where expression is a full GNU date
expression or a file name to get the date from!
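Two hypothetical uses (placeholder URLs): only fetch the document if it
changed since a given date, or since a local file was last modified:
curl -z "1 Jan 1999" http://www.site.com/news.html
curl -z local.html http://www.site.com/remote.html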
Stephan Lagerholm (30 Apr 1999)
- Pointed out a problem with the src/Makefile for FreeBSD. The RM variable
isn't set and causes the make to fail.
Daniel (26 April 1999)
- Am I silly or what? Irving Wolfe pointed out to me that the curl version
number was not set properly. Hasn't been since 5.6. This was due to a bug
in my maketgz script!
David Eriksson (25 Apr 1999)
- Found a bug in cookies.c that made it crash at times.
Version 5.7.1
Doug Kaufman (23 Apr 1999)
- Brought two sunos 4 fixes. One of them being the hostip.c fix mentioned
below and the other one a correction in include/stdcheaders.h
- Added a paragraph about compiling with the US-version of openssl to the
INSTALL file.
Daniel
- New mailing list address. Info updated on the web page as well as in the
README file
Greg Onufer (20 Apr 1999)
- hostip.c didn't compile properly on SunOS 5.5.1.
It needs an #include <sys/types.h>
Version 5.7
Daniel (Apr 20 1999)
- Decided to upload a non-beta version right now!
- Made curl support any-length HTTP headers. The destination buffer is now
simply enlarged every time it turns out to be too small!
- Added the FAQ file to the archive. Still a bit smallish, but it is a
start.
Eric Thelin (15 Apr 1999)
- Made -D accept '-' instead of filename to write to stdout.
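For example (placeholder URL), to dump the headers to the terminal while
saving the body:
curl -D - -o page.html http://www.site.com/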
Version 5.6.3beta
Daniel (Apr 12 1999)
- Changed two #ifdef WIN32 to better #ifdef <errorcode> when connect()ing
in url.c and ftp.c. Makes cygwin32 deal with them better too. We should
try to get some decent win32-replacement there. Anyone?
- The old -3/--crlf option is now ONLY --crlf!
- I changed the "SSL fix" to a more lame one, but that doesn't remove as
much functionality. Now I've enabled the lib to select what SSL version it
should try first. Apparently some older SSL-servers don't like when you
talk v3 with them so you need to be able to force curl to talk v2 from the
start. The fix dated April 6 and posted on the mailing list forced curl to
use v2 at all times using a modern OpenSSL version, but we don't really
want such a crippled solution.
- Marc Boucher sent me a patch that corrected a math error for the
"Curr.Speed" progress meter.
- Eric Thelin sent me a patch that enables '-K -' to read a config file from
stdin.
- I found out we didn't close the file properly before so I added it!
Daniel (Apr 9 1999)
- Yu Xin pointed out a problem with ftp download resume. It didn't work at
all! ;-O
Daniel (Apr 6 1999)
- Corrected the version string part generated for the SSL version.
- I found a way to make some other SSL page work with openssl 0.9.1+ that
previously didn't (ssleay 0.8.0 works with it though!). Trying to get
some real info from the OpenSSL guys to see how I should do to behave the
best way. SSLeay 0.8.0 shouldn't be that much in use anyway these days!
Version 5.6.2beta
Daniel (Apr 4 1999)
- Finally have curl more cookie "aware". Now read carefully. This is how
it works.
To make curl read cookies from an already existing file, in plain header-
format (like from the headers of a previous fetch) invoke curl with the
-b flag like:
curl -b file http://site/foo.html
Curl will then use all cookies it finds matching. The old style that sets
a single cookie with -b is still supported and is used if the string
following -b includes a '=' letter, as in "-b name=daniel".
To make curl read the cookies sent in combination with a location: (which
sites often do) point curl to read a non-existing file at first (i.e
to start with no existing cookies), like:
curl -b nowhere http://site/setcookieandrelocate.html
- Added a paragraph in the TODO file about the SSL problems recently
reported. Evidently, some kind of SSL-problem curl may need to address.
- Better "Location:" following.
Douglas E. Wegscheid (Tue, 30 Mar 1999)
- A subsecond display patch.
Daniel (Mar 14 1999)
- I've separated the version number of libcurl and curl now. To make
things a little easier, I decided to start the curl numbering from
5.6 and the former version number known as "curl" is now the one
set for libcurl.
- Removed the 'enable-no-pass' from configure, I doubt anyone wanted
that.
- Made lots of tiny adjustments to compile smoothly with cygwin under
win32. It's a killer for porting this to win32, bye bye VC++! ;-)
Compiles and builds out-of-the-box now. See the new wordings in
INSTALL for details.
- Beginning experiments with downloading multiple documents from a http
server while remaining connected.
Version 5.6beta
Daniel (Mar 13 1999)
- Since I've changed so much, I thought I'd just go ahead and implement the
suggestion from Douglas E. Wegscheid. -D or --dump-header is now storing
HTTP headers separately in the specified file.
- Added new text to INSTALL on what to do to build this on win32 now.
- Aaargh. I had to take a step back and prefix the shared #include files
in the sources with "../include/" to please VC++...
Daniel (Mar 12 1999)
- Split the url.c source into many tiny sources for better readability
and smaller size.
Daniel (Mar 11 1999)
- Started to change stuff for a move to make libcurl and a more separate
curl application that uses the libcurl. Made the libcurl sources into
the new lib directory while the curl application will remain in src as
before. New makefiles, adjusted configure script and so.
libcurl.a built quickly and easily. I better make a better interface to
the lib functions though.
The new root dir include/ is supposed to contain the public information
about the new libcurl. It is a little ugly so far :-)
Daniel (Mar 1 1999)
- Todd Kaufmann sent me a good link to Netscape's cookie spec as well as the
info that RFC 2109 specifies how to use them. The link is now in the
README and the RFC in the RESOURCES.
Daniel (Feb 23 1999)
- Finally made configure accept --with-ssl to look for SSL libs and includes
in the "standard" place /usr/local/ssl...
Daniel (Feb 22 1999)
- Verified that curl linked fine with OpenSSL 0.9.1c which seems to be
the most recent.
Henri Gomez (Fri Feb 5 1999)
- Sent in an updated curl-ssl.spec. I still miss the script that builds an
RPM automatically...
Version 5.5.1
Mark Butler (27 Jan 1999)
- Corrected problems in Download().
Daniel Stenberg (25 Jan 1999)
- Jeremie Petit pointed out a few flaws in the source that prevented it from
compiling warning free with the native compiler under Digital Unix v4.0d.
Version 5.5
Daniel Stenberg (15 Jan 1999)
- Added Bjorns small text to the README about the DICT protocol.
Daniel Stenberg (11 Jan 1999)
- <jswink at softcom.net> reported about the win32 version: "Doesn't use
ALL_PROXY environment variable". Turned out to be because of the static-
buffer nature of the win32 environment variable calls!
Bjorn Reese (10 Jan 1999)
- I have attached a simple addition for the DICT protocol (RFC 2229).
It performs dictionary lookups. The output still needs to be better
formatted.
To test it try (the exact format, and more examples are described in
the RFC)
dict://dict.org/m:hello
dict://dict.org/m:hello::soundex
Vicente Garcia (10 Jan 1999)
- Corrected the progress meter for files larger than 20MB.
Daniel Stenberg (7 Jan 1999)
- Corrected the -t and -T help texts. They claimed to be FTP only.
Version 5.4
Daniel Stenberg
(7 Jan 1999)
- Irving Wolfe reported that curl -s didn't always suppress the progress
reporting. It was the form post that automatically always switched it on
again. This is now corrected!
(4 Jan 1999)
- Andreas Kostyrka suggested I'd add PUT and he helped me out to test it. If
you use -t or -T now on a http or https server, PUT will be used for file
upload.
I removed the former use of -T with HTTP. I doubt anyone ever really used
that.
(4 Jan 1999)
- Erik Jacobsen found a width bug in the mprintf() function. I corrected it
now.
(4 Jan 1999)
- As John V. Chow pointed out to me, curl accepted very limited URL sizes. It
should now accept path parts that are up to at least 4096 bytes.
- Somehow I screwed up when applying the AIX fix from Gilbert Ramirez, so
I redid that now.
Version 5.3a (win32 only)
Troy Engel
- Corrected a win32 bug in the environment variable part.
Version 5.3
Gilbert Ramirez Jr. (21 Dec 1998)
- I have implemented the "quote" function of FTP clients. It allows you to
send arbitrary commands to the remote FTP server. I chose the -Q/--quote
command-line arguments.
You can have more than one quoted string, and curl will apply them in
order. This is what I use for my MVS upload:
curl -B --crlf -Q "site lrecl=80" -Q "site blk=8000" -T file ftp://os390/test
Curl will send the two quoted "site" commands in the proper order.
- Made it compile smoothly on AIX.
Gilbert Ramirez Jr. (18 Dec 1998)
- Brought an MVS patch: -3/--mvs, for ftp upload to the MVS ftp server.
Troy Engel (17 Dec 1998)
- Brought a correction that fixes the win32 curl bug.
Daniel Stenberg
- A bug, pointed out to me by Dr H. T. Leung, caused curl to crash on the -A
flag on certain systems. Actually, all systems should've!
- Added a few defines to make directories/file names get built nicer (with _
instead of . and \ instead of / in win32).
- steve <fisk at polar.bowdoin.edu> reported a weird bug that occurred if the
ftp server response line had a parenthesis on the line before the (size)
info. I hope it works better now!
Version 5.2.1
Steven G. Johnson (Dec 14, 1998)
- Brought a fix that corrected a crash in 5.2 due to bad treatment of the
environment variables.
Version 5.2
Daniel Stenberg (Dec 14, 1998)
- Rewrote the mkhelp script and now, the mkhelp.pl script generates the
hugehelp.c file from the README *and* the man page file curl.1. By using
both files, I no longer need to have double information in both the man
page and the README as well. So, win32-users will only have the hugehelp.c
file for all info, but then, they download the plain binary most times
anyway.
- gcc 2.8.1 with the -Wall flag complains a lot about "subscript has type
`char'" if I don't explicitly typecast the argument to isdigit() or
isspace() to int. So I did, to compile warning free with that too.
- Added checks for 'long double' and 'long long' in the configure script. I
need those for the mprintf.c source to compile well on non long long
conforming systems!
Version 5.1 (not publicly released)
Daniel Stenberg (Dec 10, 1998)
- I got a request for a pre-compiled NT Alpha version. Anyone?
- Added Lynx/CERN www lib proxy environment variable support. That means curl
now reads and understands the following environment variables:
HTTP_PROXY, HTTPS_PROXY, FTP_PROXY, GOPHER_PROXY
They should be set for protocol-specific proxies. General proxy should be
set with
ALL_PROXY
And a comma-separated list of host names that shouldn't go through any
proxy is set in (only an asterisk, '*' matches all hosts).
NO_PROXY
The usage of the -x/--proxy flag overrides the environment variables.
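A hypothetical Bourne shell setup (placeholder host names):
HTTP_PROXY=http://proxy.site.com:8080/
NO_PROXY=localhost
export HTTP_PROXY NO_PROXY
curl http://www.site.com/
curl should then use the proxy for http URLs unless -x says otherwise.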
- Proxy can now be specified with a protocol:// prefix.
- Wrote the curl.1 man page.
- Introduced a whole new dynamic buffer system for all sprintf()s. It is
based on the *printf() package by yours truly and Bjorn Reese. Hopefully,
there aren't that many buffer overflow risks left now.
- Ah, I should mention I've compiled and built curl successfully under
solaris 2.6 with gcc now, gcc 2.7.2 won't work but 2.8.1 did ok.
Oren Tirosh (Dec 3, 1998)
- Brought two .spec files, to use when creating (Linux) Redhat style RPM
packages. They're named curl.spec and curl-ssl.spec.
Troy Engel
- Supplied the src/Makefile.vc6 for easy compiling with VC++ under Win32.
Version 5.0
Daniel Stenberg (Dec 1, 1998)
- Not a single bug report in ages.
- Corrected getpass.c and main.c to compile warning and error free with the
Win32 VC++ crap.
Version 5.0 beta 24
Daniel Stenberg (Nov 20, 1998)
HOW TO BUILD A RELEASE ARCHIVE:
* Pre-requisite software:
What               To build what             Reads data from
====               =============             ===============
GNU automake       Makefile.in, aclocal.m4   configure.in
GNU make(1)        - " -
GNU gcc(1)         - " -
GNU autoconf       configure                 configure.in
GNU autoheader(2)  config.h.in               configure.in, acconfig.h
* Make sure all files that should be part of the archive are put in FILES.
* Run './maketgz' and enter version number of the new to become archive.
maketgz does:
- Enters the newly created version number in url.h.
- (If you don't have automake, this script will warn about that, but unless
you have changed the Makefile.am files, that is nothing to care about.)
If you have it, it'll run it.
- If you have autoconf, the configure.in will be edited to get the newly
created version number and autoconf will be run.
- Creates a new directory named curl-<version>. (Actually, it uses the base
name of the current directory up to the first '-'.)
- Copies all files mentioned in FILES to the new directory. Saving
permissions and directory structure.
- Uses tar to create an archive of it all, named curl-<version>.tar.gz
- gzips the archive
- Removes the new directory and all its contents.
* When done, you have an archive stored in your directory named
curl-<version>.tar.gz.
Done!
(1) They're required to make automake run properly.
(2) It is distributed as a part of the GNU autoconf archive.
Daniel Stenberg (Nov 18, 1998)
- I changed the TAG-system. If you ever used urlget() from this package in
another product, you need to recompile with the new headers. I did this
new stuff to better deal with different compilers and system with different
variable sizes. I think it makes it a little more portable. This proves
to compile warning free with the problematic IRIX compiler!
- Win32 compiled with a silly error. Corrected now.
- Brian Chaplin reported yet another problem in
multiline FTP responses. I've tried to correct it. I mailed him a new
version and I hope he gets back soon with positive feedback!
- Improved the 'maketgz' to create a temporary directory tree which it makes
an archive from instead of the previous renaming of the current one.
- Mailing list opened (see README).
- Made -v more verbose on the PASV section of ftp transfers. Now it tells
host name and IP of the new host (and port number). I also added a section
about PORT vs PASV in the README.
Version 5.0 beta 21
Angus Mackay (Nov 15, 1998)
- Introduced automake stuff.
Daniel Stenberg (Nov 13, 1998)
- Just made a successful GET of a document from an SSL-server using my own
private certificate for authentication! The certificate has to be in PEM
format. You do that the easiest way (although not *that* easy) by
downloading the SSLeay PKCS#12-patch by Dr Stephen N. Henson from his site
at: http://www.drh-consultancy.demon.co.uk/. Using his tool, you can
convert any modern Netscape or (even) MSIE certificate to PEM-format. Use
it with 'curl -E <certificate:password> https://site.com'. If this isn't a
cool feature, then I don't know what cool features look like! ;-)
- Working slowly on telnet connections. #define TRY_TELNET to try it out.
(curl -u user:passwd "telnet://host.com/cat .login" is one example) I do
have problem to define how it should work. The prime purpose for this must
be to get (8bit clean) files via telnet, and it really isn't that easy to
get files this way. Still having problems with \n being converted to \r\n.
Angus Mackay (Nov 12, 1998)
- Corrected another bug in the long parameter name parser.
- Modified getpass.c (NOTE: see the special licensing in the top of that
source file).
Daniel Stenberg (Nov 12, 1998)
- We may have removed the silly warnings from url.c when compiled under IRIX.
Thanks again to Bjorn Reese and Martin Staael.
- Wrote formfind.pl which is a new perl script intended to help you find out
how a FORM submission should be done. This needs a little more work to get
really good.
Daniel Stenberg (Nov 11, 1998)
- Made the HTTP header-checker accept white spaces before the HTTP/1.? line.
Apparently some proxies/sites add such at times (my test proxy did when I
downloaded a gopher page with it)!
- Moved the former -h to -M and made -h show the short help text instead. I
had to enable a forced help text option. Now an even shorter help text will
be presented when an unknown option and similar, is used.
- stdcheaders.h didn't work with IRIX 6.4 native cc compiler. I hope my
changes don't make other versions go nuts instead.
Daniel Stenberg (Nov 10, 1998)
- Added a weird check in the configure script to check for the silly AIX
warnings about my #define strcasecmp() stuff. I do that define to prevent
me and other contributors from accidentally using that function name instead
of strequal()...
- I bugfixed Angus's getpass.c very little.
- Fixed the verbose flag names to getopt-style, i.e 'curl --loc' will be
sufficient instead of --location as "loc" is a unique prefix. Also, anything
after a '--' is treated as an URL. So if you do have a host with a weeeird
name you can do 'curl -- -host.com'.
- Another getopt-adjust; curl now accepts flags after the URL on the command
line. 'curl www.foo.com -O' is perfectly valid.
- Corrected the .curlrc parser so that strtok() is no longer used and I
believe it works better. Even URLs can be specified in it now.
Angus Mackay (Nov 9, 1998)
- Replaced getpass.c with a newly written one, not under GPL license
- Changed OS to a #define in config.h instead of compiler flag
- Makefile now uses -DHAVE_CONFIG_H
Daniel Stenberg (Nov 9, 1998)
- Ok, I expanded the tgz-target to update the version string on each occasion
I build a release archive!
- I reacted on Angus Mackay's initiative and remade the parameter parser to
be more getopt compliant. Curl now supports "merged" flags as in
curl -lsv ftp.site.com
Do note that I had to move three short-names of the options. Parameters
that needs an additional string such as -x must be stand-alone or the
last in a merged sequence:
curl -lsx my-proxy ftp.site.com
is ok, but using the flags in a different order like '-lxs' would cause
unexpected results (as the 's' option would be skipped).
- I've changed the headers in all files that are subject to the MozPL
license, as they are supposed to look like when conforming.
- Made the configure script make the config.h. The former config.h is now
setup.h.
- The RESOURCES and TODO files have been added to the archive.
Angus Mackay (Nov 5, 1998)
- Fixed getpass.c and various configure stuff
Daniel Stenberg (Nov 3, 1998)
- Use -H/--header for custom HTTP-headers. Lets you pass on your own
specified headers to the remote server. I wouldn't recommend trying to use
a header with a defined usage according to standards. Use this flag once
for every custom header you want to add.
- Use -B/--ftp-ascii to force ftp to use ASCII mode when transferring files.
- Corrected the 'getlinks.pl' script, I accidentally left my silly proxy
usage in there! Since the introduction of the .curlrc file, it is easier to
write scripts that use curl since proxies and stuff should be in the
.curlrc file anyway.
- Introducing the new -F flag for HTTP POST. It supports multipart/form-data
which means it is gonna be possible to upload files etc through HTTP POST.
Shiraz Kanga asked for the feature and my brother,
Björn Stenberg helped me design the user
interface for this beast. This feature requires quite some docs,
since it has turned out not only quite capable, but also complicated! :-)
- A note here, since I've received mail about it. SSLeay versions prior to
0.8 will *not* work with curl!
- Wil Langford reported a bug that occurred since curl
did not properly use CRLF when issuing ftp commands. I fixed it.
- Rearranged the order config files are read. .curlrc is now *always* read
first and before the command line flags. -K config files then act as
additional config items.
- Use -q AS THE FIRST OPTION specified to prevent .curlrc from being read.
- You can now disable a proxy by using -x "". Useful if the .curlrc file
specifies a proxy and you wanna fetch something without going through
that.
- I'm thinking of dropping the -p support. It's really not useful since ports
could (and should?) be specified as :<port> appended on the host name
instead, both in URLs and to proxy host names.
- Martin Staael reports curl -L bugs under Windows NT
(test with URL http://come.to/scsde). This bug is not present in this
version anymore.
- Added support for the weird FTP URL type= thing. You can download a file
using ASCII transfer by appending ";type=A" to the right of it. Other
available types are type=D for dir-list (NLST) and type=I for binary
transfer. I can't say I've ever seen anyone use this kind of URL though!
:-)
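For instance (placeholder host), forcing an ASCII mode fetch of a text file:
curl "ftp://ftp.site.com/README;type=A"
The quotes keep the shell from treating the semicolon as a command separator.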
- Troy Engel pointed out a bug in my getenv("HOME")
usage for win32 systems. I introduce getenv.c to better cope with
this. Mr Engel helps me with the details around that...
- A little note to myself and others, I should make the win32-binary built
with SSL support...
- Ryan Nelson sent me comments about building curl
with SSL under FreeBSD. See the Makefile for details. Using the configure
script, it should work better and automatically now...
- Cleaned up in the port number mess in the source. No longer stores and uses
proxy port number separate from normal port number.
- 'configure' script working. Confirmed compiles on:
Host          SSL   Compiler
SunOS 5.5     no    gcc
SunOS 5.5.1   yes   gcc
SunOS 5.6     no    cc  (with gcc, it has the "gcc include files" problem)
SunOS 4.1.3   no    gcc (without ANSI C headers)
SunOS 4.1.2   no    gcc (native compiler failed)
Linux 2.0.18  no    gcc
Linux 2.0.32  yes   gcc
Linux 2.0.35  no    gcc (with glibc)
IRIX 6.2      no    gcc (cc compiles generate a few warnings)
IRIX 6.4      no    cc  (generated warnings though)
Win32         no    Borland
OSF4.0        no    ?
- Ooops. The 5beta (and 4.10) under win32 failed if the HOME variable wasn't
set.
- When using a proxy, curl now guesses and uses the protocol part in cases
like:
curl -x proxy:80 www.site.com
Proxies normally go nuts unless http:// is prepended to the host name, so
if curl is used like this, it guesses protocol and appends the protocol
string before passing it to the proxy. It already did this when used
without proxy.
- Better port usage with SSL through proxy now. If you specified a different
https-port when accessing through a proxy, it didn't use that number
correctly. I also rewrote the code that parses the stuff read from the
proxy when you wanna connect through it with SSL.
- Bjorn Reese helped me work around one of the compiler
warnings on IRIX native cc compiles.
Version 4.10 (Oct 26, 1998)
Daniel Stenberg
- John A. Bristor suggested a config file switch,
and since I've been having that idea kind of in the background for a long
time I rewrote the parameter parsing function a little and now I introduce
the -K/--config flag. I also made curl *always* (unless -K is used) try to
load the .curlrc file for command line parameters. The syntax for the
config file is the standard command line argument style. Details in 'curl
-h' or the README.
- I removed the -k option. Keep-alive isn't really anything anyone would
want to enable with curl anyway.
- Martin Staael helped me add the 'irix' target. Now
"make irix" should build curl successfully on non-gcc SGI machines.
- Single switches now toggle behaviours. I.e if you use -v -v the second
will switch off the verbose mode the first one enabled. This is so that
you can disable a default setting a .curlrc file enables etc.
Version 4.9 (Oct 7, 1998)
Daniel Stenberg
2000-07-29 18:21:10 -04:00
- Martin Staael suggested curl would support cookies.
I added -b/--cookie to enable free-text cookie data to be passed. There's
also a little blurb about general cookie stuff in the README/help text.
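  A rough example of the free-text style (hypothetical cookie contents and
  site):
     curl -b "name=daniel; tool=curl" http://www.example.com/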
- dmh <dmh at jet.es> suggested HTTP resume capabilities. Although you could
manually get curl to resume HTTP documents, I made the -c resume flag work
for HTTP too (unless -r is used too, which would be very odd anyway).
- Added checklinks.pl to the archive. It is a still experimental perl script
that checks all links of a web page by using curl.
- Rearranged the archive hierarchy a little. Build the executable in the
src/ dir from now on!
- Version 4.9 and hereafter, is no longer released under the GPL license.
I have now updated the LEGAL file etc and now this is released using the
Mozilla Public License to avoid the plague known as "the GPL virus". You
must make the source available if you decide to change and/or redistribute
curl, but if you decide to use curl within something else you do not need
to offer the world the source to that too.
- Curl did not like HTTP servers that sent no headers at all on a GET
  request. It is a violation of RFC2068 but apparently some servers do
  that anyway. Thanks to Gordon Beaton for the report!
- -L/--location was added after a suggestion from Martin Staael. This makes
curl ATTEMPT to follow the Location: redirect if one is present in the HTTP
headers. If -i or -I is used with this flag, you will see headers from all
sites the Location: points to. Do note that the first server can point to a
second that points to a third etc. It seems the Location: parameter (said
to be an AbsoluteURI in RFC2068) isn't always absolute.. :-/ Anyway, I've
made curl ATTEMPT to do the best it can to deal with the reality.
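  As an illustration (hypothetical URL), following a redirect and showing
  the headers from every visited site could look like:
     curl -i -L http://www.example.com/page-that-redirects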
- Added getlinks.pl to the archive. getlinks.pl selectively downloads
files that a web page links to.
Version 4.8.4
Daniel Stenberg
- As Julian Romero Nieto reported, curl reported wrong version number.
- As Teemu Yli-Elsila pointed out, the win32 version of 4.8 (and probably all
other versions for win32) didn't work with binary files since I'm too used
to the UNIX style fopen() where binary and text don't differ...
- Ralph Beckmann brought me some changes that lets curl compile error and
warning free with -Wall -pedantic with g++. I also took the opportunity to
clean off some unused variables and similar.
- Ralph Beckmann made me aware of a really odd bug now corrected. When curl
  read a set of headers from an HTTP server, divided into more than one read
  and the first read showed a full line *exactly* (i.e. ending with a
  newline), curl did not behave well.
Version 4.8.3
Daniel Stenberg
- I was too quick to release 4.8.2 with too little testing. One of the
changes is now reverted slightly to the 4.8.1 way since 4.8.2 couldn't
upload files. I still think both problems corrected in 4.8.2 remain
corrected. Reported by Julian Romero Nieto.
Version 4.8.2
Daniel Stenberg
- Bernhard Iselborn reported two FTP protocol errors curl did. They're now
corrected. Both appeared when getting files from a MS FTP server! :-)
Version 4.8.1
Daniel Stenberg
- Added a last update of the progress meter when the transfer is done. The
  final output on the screen didn't have to be the final size transferred,
  which sometimes made it look odd.
- Thanks to David Long I got rid of a silly bug that happened if an HTTP page
  had nothing but a header. Apparently Solaris deals with negative sizes in
  fwrite() calls a lot better than Linux does... =B-]
Version 4.8
Daniel Stenberg
- Continue FTP file transfer. -c is the switch. Note that you need to
specify a file name if you wanna resume a download (you can't resume a
download sent to stdout). Resuming upload may be limited by the server
since curl is then using the non-RFC959 command SIZE to get the size of
the target file before upload begins (to figure out which offset to
  use). Use -C to specify the offset yourself! -C is handy if you're doing
  the output to something other than a plain file or when you just want to
  get the end of a file.
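  Rough examples (hypothetical host and file names): resuming a download to
  a local file, and fetching from a self-picked offset:
     curl -c -o local.file ftp://ftp.example.com/remote.file
     curl -C 10000 -o tail.part ftp://ftp.example.com/remote.file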
- recursiveftpget.pl now features a maximum recursive level argument.
Version 4.7
Daniel Stenberg
- Added support to abort a download if the speed is below a certain amount
(speed-limit) bytes per second for a certain (speed-time) time.
- Wrote a perl script 'recursiveftpget.pl' to recursively use curl to get a
whole ftp directory tree. It is meant as an example of how curl can be
used. I agree it isn't the wisest thing to do to make a separate new
connection for each file and directory for this.
Version 4.6
Daniel Stenberg
- Added a first attempt to optionally parse the .netrc file for login user
and password. If used with http, it enables user authentication. -n is
the new switch.
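  A minimal .netrc entry (hypothetical host and credentials) could look like:
     machine ftp.example.com login myname password mysecret
  which would then be picked up by something like
     curl -n ftp://ftp.example.com/file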
- Removed the extra newlines on the default user-agent string.
- Corrected the missing ftp upload error messages when it failed without the
verbose flag set. Gary W. Swearingen found it.
- Now using alarm() to enable second-precision timeout even on the name
  resolving/connecting phase. The timeout is however reset after that first
  sequence. (This should be corrected.) Gary W. Swearingen reported.
- Now spells "Unknown" properly, as in "Unknown option 'z'"... :-)
- Added bug report email address in the README.
- Added a "current speed" field to the progress meter. It shows the average
speed the last 5 seconds. The other speed field shows the average speed of
the entire transfer so far.
Version 4.5.1
Linas Vepstas
- SSL through proxy fix
- Added -A to allow User-Agent: changes
Daniel Stenberg
- Made the -A work when SSL-through-proxy.
Version 4.5
Linas Vepstas
- More SSL corrections
- I've added a port to AIX.
- Running SSL through a proxy causes a chunk of code to be executed twice.
  One of those blocks needs to be deleted.
Daniel Stenberg
- Made -i and -I work again
Version 4.4
Linas Vepstas
- -x can now also specify proxyport when used as in 'proxyhost:proxyport'
- SSL fixes
Version 4.3
Daniel Stenberg
- Adjusted to compile under win32 (VisualC++ 5). The -P switch does not
support network interface names in win32. I couldn't figure out how!
Version 4.2
Linas Vepstas / Sampo Kellomaki
- Added SSL / SSLeay support (https://)
- Added the -T usage for HTTP POST.
Daniel Stenberg
- Bugfixed the SSL implementation.
- Made -P a lot better to use other IP addresses. It now accepts a following
parameter that can be either
interface - i.e "eth0" to specify which interface's IP address you
want to use
IP address - i.e "192.168.10.1" to specify exact IP number
host name - i.e "my.host.domain" to specify machine
"-" - (any single-letter string) to make it pick the machine's
default
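  For instance (assuming an interface named eth0 exists on the machine):
     curl -P eth0 ftp://ftp.example.com/file
     curl -P - ftp://ftp.example.com/file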
- The Makefile is now ready to compile for solaris, sunos4 and linux right
out of the box.
- Better generated version string seen with 'curl -V'
Version 4.1
Daniel Stenberg
- The IP number returned by the ftp server as a reply to PASV no longer has
  to resolve in DNS. In fact, no IP-number-only addresses have to anymore.
- Binds better to available port when -P is used.
- Now LISTs ./ instead of / when used as in ftp://ftp.funet.fi/. The reason
for this is that exactly that site, ftp.funet.fi, does not allow LIST /
while LIST ./ is fine. Any objections?
Version 4 (1998-03-20)
Daniel Stenberg
- I took another huge step and changed both version number and project name!
  The reason for the new name is that there are just one too many programs
  named urlget already and this program can already do a lot more than merely
  getting URLs, and the reason for the version number is that I did add the
pretty big change in -P and since I changed name I wanted to start with
something fresh!
- The --style flags are working better now.
- Listing directories with FTP often reported that the file transfer was
  incomplete. Wrong assumptions were too common for directories, so no size
  comparison will be attempted on them from now on.
- Implemented the -P flag that lets the ftp control issue a PORT command
  instead of the standard PASV.
- -a for appending FTP uploads works.
***************************************************************************
Version 3.12 (14 March 1998)
Daniel Stenberg
- End-of-header tracking still lacked support for \r\n or just \n at the
end of the last header line.
Sergio Barresi
- Added PROXY authentication.
Rafael Sagula
- Fixed some little bugs.
Version 3.11
Daniel Stenberg
- The header parsing was still not correct since the 3.2 modification...
Version 3.10
Daniel Stenberg
- 3.7 and 3.9 were simultaneously developed and merged into this version.
- FTP upload did not work correctly since 3.2.
Version 3.9
Rafael Sagula
- Added the "-e <url> / --referer <url>" option where we can specify
the referer page. Obviously, this is necessary only to fool the
server, but...
Version 3.7
Daniel Stenberg
- Now checks the last error code sent from the ftp server after a file has
been received or uploaded. Wasn't done previously.
- When 'urlget <host>' is used without a 'protocol://' first in the host part,
  it now checks for host names starting with ftp or gopher and, if so, uses
  that protocol by default instead of http.
Version 3.6
Daniel Stenberg
- Silly mistake made the POST bug. This has now also been tested to work with
proxy.
Version 3.5
Daniel Stenberg
- Highly inspired by Rafael Sagula's changes to the 3.1 that added an almost
functional POST, I applied his changes into this version and made them work.
(It seems POST requires the Content-Type and Content-Length headers.) It is
now usable with the -d switch.
Version 3.3 - 3.4
  Skipped to avoid confusion
Version 3.2
Daniel Stenberg
- Major rewrite of two crucial parts of this code: upload and download.
They are both now using a select() switch, that allows much better
progress meter and time control.
- alarm() usage removed completely
- FTP get can now list directory contents if the path ends with a slash '/'.
  Urlget on an ftp path that doesn't end with a slash means urlget will
  attempt to get it as a file name.
- FTP directory view supports -l for "list-only" which lists the file names
only.
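  For example (hypothetical host), a full listing versus file names only:
     urlget ftp://ftp.example.com/pub/
     urlget -l ftp://ftp.example.com/pub/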
- All operations support -m for max time usage in seconds allowed.
- FTP upload now allows the size of the uploaded file to be provided, and
thus it can better check it actually uploaded the whole file. It also
makes the progress meter for uploads much better!
- Made the parameter parsing fail in cases like 'urlget -r 900' which
previously tried to connect to the host named '900'.
Version 3.1
Kjell Ericson
- Pointed out how to correct the 3 warnings in win32-compiles.
Daniel Stenberg
- Removed all calls to exit().
- Made the short help text get written to stdout instead of stderr.
- Made this file instead of keeping these comments in the source.
- Made two callback hooks, that enable external programs to use urlget()
easier and to grab the output/offer the input easier.
- It is evident that Win32-compiles are painful. I watched the output from
the Borland C++ v5 and it was awful. Just ignore all those warnings.
Version 3.0
Daniel Stenberg
- Added FTP upload capabilities. The name urlget gets a bit silly now
when we can put too... =)
- Restructured the source quite a lot.
Changed the urlget() interface. This way, we will survive changes much
better. New features can come and old can be removed without us needing
to change the interface. I've written a small explanation in urlget.h
that explains it.
- New flags include -t, -T, -O and -h. The -h text is generated by the new
mkhelp script.
Version 2.9
Remco van Hooff
- Added a fix to make it compile smoothly on Amiga using the SAS/C
compiler.
Daniel Stenberg
- Believe it or not, but the STUPID Novell web server seems to require
  that the Host: keyword is used, so, well, I use it and I (re-)introduce the
  urlget User-Agent:. I still have to check that this Host: usage works with
  proxies... 'Host:' is required for HTTP/1.1 GET according to RFC2068.
Version 2.8
Rafael Sagula
- some little modifications
Version 2.7
Daniel Stenberg
- Removed the -l option and introduced the -f option instead. Now I'll
rewrite the former -l kludge in an external script that'll use urlget to
fetch multipart files like that.
- '-f' is introduced, it means Fail without output in case of HTTP server
errors (return code >=300).
- Added support for -r, ranges. Specify which part of a document you
want, and only that part is returned. Only with HTTP/1.1-servers.
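  For instance, fetching only the first 500 bytes of a document could look
  like this (hypothetical URL, assuming the usual byte-range syntax):
     urlget -r 0-499 http://www.example.com/doc.html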
- Split up the source in 3 parts. Now all pure URL functions are in
urlget.c and stuff that deals with the stand-alone program is in main.c.
- I took a few minutes and wrote an embryo of a README file to explain
a few things.
Version 2.6
Daniel Stenberg
- Made the -l (loop) thing use the new CONF_FAILONERROR which makes
urlget() return error code if non-successful. It also won't output anything
then. Now finally removed the HTTP 1.0 and error 404 dependencies.
- Added -I which uses the HEAD request to get the header only from a
http-server.
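  For example (hypothetical URL):
     urlget -I http://www.example.com/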
Version 2.5
Rafael Sagula
- Made the progress meter use HHH:MM:SS instead of only seconds.
Version 2.4
Daniel Stenberg
- Added progress meter. It appears when downloading > BUFFER SIZE and
mute is not selected. I found out that when downloading large files from
really really slow sites, it is desirable to know the status of the
  download. Do note that some downloads are done unaware of the size, which
makes the progress meter less thrilling ;) If the output is sent to a tty,
the progress meter is shut off.
- Increased buffer size used for reading.
- Added length checks in the user+passwd parsing.
- Made it grok user+passwd for HTTP fetches. The trick is to base64
encode the user+passwd and send an extra header line. Read chapter 11.1 in
RFC2068 for details. I added it to be used just like the ftp one. To get a
http document from a place that requires user and password, use an URL
like:
http://user:passwd@www.site.to.leach/doc.html
I also added the -u flag, since WHEN USING A PROXY YOU CAN'T SPECIFY THE
USER AND PASSWORD WITH HTTP LIKE THAT. The -u flag works for ftp too, but
not if used with proxy. To do the same as the above one, you can invoke:
urlget -u user:passwd http://www.site.to.leach/doc.html
Version 2.3
Rafael Sagula
- Added "-o" option (output file)
- Added URG_HTTP_NOT_FOUND return code.
(Daniel's note:)
Perhaps we should detect all kinds of errors and instead of writing that
custom string for the particular 404-error, use the error text we actually
get from the server. See further details in RFC2068 (HTTP 1.1
definition). The current way also relies on a HTTP/1.0 reply, which newer
servers might not do.
- Looping mode ("-l" option). It's easier to get various split files.
(Daniel's note:)
  Use it like 'urlget -l 1 http://from.this.site/file%d.html', which will
  make urlget attempt to fetch all files named file1.html, file2.html etc
  until no more files are found. This is only a modification of the
  STAND_ALONE part, nothing in the urlget() function was modified for this.
Daniel Stenberg
- Changed the -h to be -i instead. -h should be preserved for help use.
- Bjorn Reese indicated that Borland _might_ use '_WIN32' instead of the
VC++ WIN32 define and therefore I added a little fix for that.
Version 2.2
Johan Andersson
- The urlget function didn't set the path to url when using proxy.
- Fixed bug with IMC proxy. Now using (almost) complete GET command.
Daniel Stenberg
- Made it compile on Solaris. Had to reorganize the includes a bit.
(so Win32, Linux, SunOS 4 and Solaris 2 compile fine.)
- Made Johan's keepalive keyword optional with the -k flag (since it
makes a lot of urlgets take a lot longer time).
- Made a '-h' switch in case you want the HTTP-header in the output.
Version 2.1
Daniel Stenberg and Kjell Ericson
- Win32-compilable
- No more global variables
- Mute option (no output at all to stderr)
- Full range of return codes from urlget(), which is now written to be a
function for easy-to-use in [other] programs.
- Define STAND_ALONE to compile the stand alone urlget program
- Now compiles with gcc options -ansi -Wall -pedantic ;)
Version 2.0
- Introducing ftp GET support. The FTP URL type is recognized and used.
- Renamed the project to 'urlget'.
- Supports the user+passwd in the FTP URL (otherwise it tries anonymous
login with a weird email address as password).
Version 1.5
Daniel Stenberg
- The skip_header() crap messed it up big-time. By simply removing that
one we can all of a sudden download anything ;)
- No longer requires a trailing slash on the URLs.
- If the given URL isn't prefixed with 'http://', HTTP is assumed and
given a try!
- 'void main()' is history.
Version 1.4
Daniel Stenberg
- The gopher source used the ppath variable instead of path which could
lead to disaster.
Version 1.3
Daniel Stenberg
- Well, I added a lame text about the time it took to get the data. I also
fought against Johan to prevent his -f option (to specify a file name
that should be written instead of stdout)! =)
- Made it write 'connection refused' for that particular connect()
problem.
- Renumbered the version. Let's not make silly 1.0.X versions, this is
a plain 1.3 instead.
Version 1.2
Johan Andersson
- Discovered and fixed the problem with getting binary files. puts() is
now replaced with fwrite(). (Daniel's note: this also fixed the buffer
overwrite problem I found in the previous version.)
Rafael Sagula
- Let "-p" before "-x".
Daniel Stenberg
- Bugfixed the proxy usage. It should *NOT* use nor strip the port number
from the URL but simply pass that information to the proxy. This also
made the user/password fields possible to use in proxy [ftp-] URLs.
(like in ftp://user:password@ftp.my.site:8021/README)
Johan Andersson
- Implemented HTTP proxy support.
- Receive byte counter added.
Bjorn Reese
- Implemented URLs (and skipped the old syntax).
- Output is written to stdout, so to achieve the above example, do:
httpget http://143.54.10.6/info_logo.gif > test.gif
Version 1.1
Daniel Stenberg
- Adjusted it slightly to accept named hosts on the command line. We
  wouldn't wanna use IP numbers for the rest of our lives, would we?
Version 1.0
Rafael Sagula
- Wrote the initial httpget, which started all this!