Instead of reopening the downloaded file, fsetxattr uses the (already
open) file descriptor to attach extended attributes. This makes the
procedure more robust against errors caused by moved or deleted files.
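A minimal sketch of that approach, assuming Linux and <sys/xattr.h>; the
attribute name used here is only an example:

  #include <sys/xattr.h>
  #include <string.h>

  /* attach the origin URL to the already open descriptor; this still
     works if the file has been renamed or unlinked since it was opened */
  static int store_origin_url(int fd, const char *url)
  {
    return fsetxattr(fd, "user.xdg.origin.url", url, strlen(url), 0);
  }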
CURLOPT_RESOLVE is a new option that sends along a curl_slist with
name:port:address sets that populate the DNS cache with entries so that
requests can be "fooled" into using another host than the one that would
otherwise have been used. Previously we've encouraged the use of Host: for
that when dealing with HTTP, but this new feature has the added bonus
that it allows the name from the URL to be used for TLS SNI and server
certificate name checks as well.
This is a first change. Surely more will follow to make it decent.
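A minimal sketch of using CURLOPT_RESOLVE (host, port and address here are
made up):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    /* "name:port:address" - requests for example.com:443 go to 127.0.0.1 */
    struct curl_slist *host =
      curl_slist_append(NULL, "example.com:443:127.0.0.1");
    curl_easy_setopt(curl, CURLOPT_RESOLVE, host);
    /* the URL keeps the real name, so TLS SNI and the certificate name
       check still use "example.com" */
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_perform(curl);
    curl_slist_free_all(host);
    curl_easy_cleanup(curl);
    return 0;
  }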
setxattr is a glibc call to set extended attributes, so configure now
checks for it and the code is adapted to only build when the
functionality is present.
It is often convenient to trace back the source of a previously downloaded
file; this patch makes curl store the source URL and other metadata
alongside the retrieved file by using the extended attributes (if
supported by the file system and enabled by --xattr).
These haven't worked in at least 8 years due to missing source
files, and most active RiscOS developers these days apparently
cross-compile anyway.
Signed-off-by: James Bursa <james@zamez.org>
Some options, such as automatic decompression and some SSL-related ones,
now make curl bail out if the underlying libcurl doesn't have support
for the particular feature needed.
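A sketch of the kind of feature check involved, using curl_version_info();
this is not necessarily the tool's exact code:

  #include <curl/curl.h>
  #include <stdio.h>

  /* report when the underlying libcurl lacks a needed feature */
  static int require_features(void)
  {
    curl_version_info_data *info = curl_version_info(CURLVERSION_NOW);
    if(!(info->features & CURL_VERSION_LIBZ)) {
      fputs("automatic decompression not supported by this libcurl\n", stderr);
      return 1;
    }
    if(!(info->features & CURL_VERSION_SSL)) {
      fputs("SSL not supported by this libcurl\n", stderr);
      return 1;
    }
    return 0;
  }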
If the filename contains a backslash, only use filename portion. The
idea is that even systems that don't handle backslashes as path
separators probably want that path removed for convenience.
This flaw is considered a security problem, see the curl security
vulnerability http://curl.haxx.se/docs/adv_20101013.html
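A sketch of that filename trimming, written as a hypothetical helper:

  #include <string.h>

  /* keep only what follows the last backslash in a server-provided name */
  static const char *strip_backslash_path(const char *filename)
  {
    const char *last = strrchr(filename, '\\');
    return last ? last + 1 : filename;
  }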
Having an open brace without a closing brace caused a segfault.
Having one closing brace too many caused a silent error: curl bailed out
and returned an error code, but no error message was shown. It does now!
Error messages also no longer wrongly get _two_ newlines written after
the message.
Reported by: Vlad Ureche
Bug: http://curl.haxx.se/bug/view.cgi?id=3083942
Renamed SDK_* to NDK_*; made the NDK_* defines overridable from the
environment; removed the now obsolete YACC macro;
moved some curl_config.h defines to IPv6 section since they
are only needed when IPv6 is enabled - this makes libcurl compile
with older NDKs too which were not IPv6-aware.
It was introduced in commit eeb2cb05 along with the -F type=
change. Also fixed a typo in the name of the magic filename=
parameter. Tweaked tests 39 and 173 to better test this path.
The -F option allows some custom parameters within the given string, and
those parameters are separated with semicolons. You can for example specify
"name=daniel;type=text/plain" to set the content-type for the
field. However, that use of semicolons made things go wrong if you
specified one within the content-type itself, as in:
"name=daniel;type=text/plain;charset=UTF-8"
... since the second semicolon would be seen as a separator, and "charset"
is not a parameter curl knows anything about, so it was just silently
discarded.
The new logic checks whether the semicolon and the following keyword look
like a parameter curl knows about; if not, they are assumed to belong to
the content-type string itself.
I modified test case 186 to verify that this works as intended.
Reported by: Larry Stone
Bug: http://curl.haxx.se/bug/view.cgi?id=3048988
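A sketch of the parameter-versus-content-type decision described above,
using a hypothetical helper and an illustrative keyword list:

  #include <string.h>

  /* does the text after a semicolon start a parameter curl knows about? */
  static int is_known_param(const char *p)
  {
    static const char *const known[] = { "type=", "filename=", NULL };
    int i;
    for(i = 0; known[i]; i++)
      if(strncmp(p, known[i], strlen(known[i])) == 0)
        return 1;
    return 0; /* e.g. "charset=UTF-8" stays part of the content-type value */
  }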
Added the -br switch to dynamic builds, which fixes the issue I saw
with curl's --version output. Added debug info and symfile for debug
builds to linker opts. Added the DLL loader for wlink back, but this time
dependent on the wlink version.
Patch posted to the list by malak.jiri AT gmail.com.
The var %MAKEFLAGS is only set in 3 cases: if set as an environment
variable or as a macro definition on the command line, and either with
the -u or -ms switch. Since all these cases are unlikely for the average
user, it should be safe to only test whether %MAKEFLAGS is defined; this
has the benefit that all other switches can now be used again in
addition to -u, which was formerly not possible.
The --retry logic does retry HTTP transfers when certain response codes
are returned, but because the -f option sets CURLOPT_FAILONERROR in
libcurl, the return codes are different in such situations and the curl
tool then failed to consider them for retrying.
Reported by: Mike Power
Bug: http://curl.haxx.se/bug/view.cgi?id=3037362
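A minimal sketch of the libcurl behavior involved; this is not the tool's
actual retry code:

  #include <curl/curl.h>

  static int failed_with_http_error(CURL *curl)
  {
    CURLcode res;
    /* what -f sets: HTTP responses of 400 and above come back as a
       CURLcode instead of a response code the caller inspects afterwards */
    curl_easy_setopt(curl, CURLOPT_FAILONERROR, 1L);
    res = curl_easy_perform(curl);
    return res == CURLE_HTTP_RETURNED_ERROR;
  }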
lib/Makefile.Watcom works fine already; for src/Makefile.Watcom we first
need to tweak src/Makefile.inc a bit - therefore the hand-tweaked list
still exists for now.
- make both libcurl and curl makefiles use register calling convention
(previously libcurl had stack calling convention).
- added include paths to the Watcom headers so it's no longer required
to set the environment vars for this.
- added -wcd=201 to suppress the compiler warning about unreachable code.
- use macros for all tools, and removed dependency on GNU tools like rm.
- make ipv6 and debug builds controllable via env vars and so make them
optional instead of default.
- commented WINLDAPAPI and WINBERAPI since they broke with OW 1.8, and
it seems they're not needed (anymore?).
- added a rule for hugehelp.c.cvs so that it will be created when it does
not already exist - this is required for building from a release tarball
since there we have no hugehelp.c.cvs, and compilation broke without it.
- removed C_ARG creation from lib/Makefile.Watcom and use CFLAGS
directly as done too in src/Makefile.Watcom - this has the benefit
that we will see all active cflags and defines during compile.
- added LINK-ARG to src/Makefile.Watcom in order to better control
linker input.
- a couple of other minor makefile tweaks here and there ...
- added largefile support for Watcom builds to config-win32.h. Not yet
tested whether it really works, but it should, since Win32 supports it.
- added loaddll stuff to speed up builds if supported.
This passes -Werror to gcc when building curl and libcurl, allowing
easy detection of compile warnings.
Signed-off-by: Ben Greear <greearb@candelatech.com>
The --remote-header-name option for the command-line tool assumes that
everything beyond the filename= field is part of the filename, but that
might not always be the case, for example:
Content-Disposition: attachment; filename=file.txt; modification-date=...
This fix chops the filename off at the next semicolon, if there is one.
When getting multiple URLs, curl didn't properly reset the byte counter
after a successful transfer so if the subsequent transfer failed it
would wrongly use the previous byte counter and behave badly (segfault)
because of that. The code assumes that the byte counter and the 'stream'
pointer are in sync.
Reported by: Jon Sargeant
Bug: http://curl.haxx.se/bug/view.cgi?id=3028241
Since uploading from stdin is very likely not to work with anyauth and
its multi-phase probing for which authentication to actually use, alert
the user about it. Multi-phase negotiate almost certainly will involve
sending data and thus libcurl will need to rewind the stream to send
again, and it cannot do that with stdin.
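For context, a hedged sketch of the rewind mechanism libcurl relies on,
the seek callback, which a non-seekable stdin cannot honor:

  #include <curl/curl.h>
  #include <stdio.h>

  static int seek_cb(void *userp, curl_off_t offset, int origin)
  {
    FILE *f = (FILE *)userp;
    /* stdin fed from a pipe is not seekable, so the rewind request fails */
    if(fseek(f, (long)offset, origin) != 0)
      return CURL_SEEKFUNC_CANTSEEK;
    return CURL_SEEKFUNC_OK;
  }

  /* wired up with:
     curl_easy_setopt(curl, CURLOPT_SEEKFUNCTION, seek_cb);
     curl_easy_setopt(curl, CURLOPT_SEEKDATA, stdin); */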
I think the [REMARK] and commented function calls cluttered the code a
bit too much and made the generated code ugly to read. Now we instead
track the remarks separately and just list them at the end of the
generated code, more as additional information.
Additionally, don't show the actual values of function and object
pointers, since they make no sense to anyone; show 'functionpointer' and
'objectpointer' instead.
In the generated code --libcurl makes, all calls to curl_easy_setopt()
that use *_LARGE options now have the value typecast to curl_off_t, so
that it works correctly on 32-bit systems with a 64-bit curl_off_t type.
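For example, the emitted code now uses casts of this shape (the options
and values here are chosen arbitrarily):

  /* excerpt in the style of the generated code */
  curl_easy_setopt(hnd, CURLOPT_MAX_RECV_SPEED_LARGE, (curl_off_t)10000);
  curl_easy_setopt(hnd, CURLOPT_RESUME_FROM_LARGE, (curl_off_t)0);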
curl didn't properly handle escaping characters in a URL with the use of
a backslash. It made an attempt, but that failed, as reported in bug
3022551. The described example used the URL
"http://example.com?{AB,C\,D}".
I've now removed the special-handling of letters following the backslash
and I also removed the bad extra check that triggered this particular
bug.
Bug: http://curl.haxx.se/bug/view.cgi?id=3022551
Reported by: Jon Sargeant
The main change is to allow input from user-specified methods,
when they are specified with CURLOPT_READFUNCTION.
All calls to fflush(stdout) in telnet.c were removed, which makes
using 'curl telnet://foo.com' painful since prompts and other data
are not always returned to the user promptly. Use
'curl --no-buffer telnet://foo.com' instead. In general,
the user should have their CURLOPT_WRITEFUNCTION do a fflush
for interactive use.
Also fix assumption that reading from stdin never returns < 0.
Old code could crash in that case.
Call progress functions in telnet main loop.
Signed-off-by: Ben Greear <greearb@candelatech.com>
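A hedged sketch of feeding telnet input through the read callback (host
and input are made up):

  #include <curl/curl.h>
  #include <string.h>

  /* hand libcurl the next chunk of input to send over the telnet session */
  static size_t read_cb(char *buffer, size_t size, size_t nitems, void *userp)
  {
    const char **input = (const char **)userp;
    size_t len = strlen(*input);
    if(len > size * nitems)
      len = size * nitems;
    memcpy(buffer, *input, len);
    *input += len;
    return len; /* 0 means end of input */
  }

  int main(void)
  {
    const char *input = "hello\r\n";
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "telnet://example.com");
    curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
    curl_easy_setopt(curl, CURLOPT_READDATA, &input);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }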
The feature that uses the file name given in a
Content-disposition: header didn't properly skip trailing
carriage returns and linefeed characters from the end of the file
name when it was given without quotes.
A bug report (http://curl.haxx.se/bug/view.cgi?id=2958074) pointed out
that curl on Windows with the --trace-time option did not use local time
when timestamping trace lines. This could also happen on other systems,
depending on the time source.
The --ssl and --ssl-reqd options replace the previous --ftp-ssl and
--ftp-ssl-reqd, as these options are now used to control SSL/TLS for IMAP,
POP3 and SMTP as well, in addition to FTP. The old option names still work,
but the new ones are the preferred ones (listed and documented).
The PRET command is a special "hack" used by the drftpd server. Even though
it is a custom extension, I've deemed it fine to add to libcurl since this
server seems to survive and people keep using it and want libcurl to support
it. The new libcurl option is named CURLOPT_FTP_USE_PRET, and it is also
usable from the curl tool with --ftp-pret. Using this option against a
server that doesn't support the command will make libcurl fail.
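A minimal sketch of enabling it through libcurl (the URL is made up):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "ftp://example.com/file.txt");
    /* send PRET before PASV/EPSV; fails on servers without PRET support */
    curl_easy_setopt(curl, CURLOPT_FTP_USE_PRET, 1L);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }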
The --retry logic was meant to retry on FTP errors in the transient 5xx
range, but transient FTP errors are actually in the 4xx range. The code
itself only retried on 5xx errors that occurred _at login_. Now the retry
code retries on all FTP transfer failures that ended with a 4xx response.
(http://curl.haxx.se/bug/view.cgi?id=2911279)
A rework patch now integrates TFTP properly into libcurl so that it can
be used non-blocking with the multi interface and more. BLKSIZE also works.
The --tftp-blksize option was added to allow setting the TFTP BLKSIZE from
the command line.
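The corresponding libcurl option is CURLOPT_TFTP_BLKSIZE; a minimal sketch
(URL and size are made up):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "tftp://example.com/boot.img");
    curl_easy_setopt(curl, CURLOPT_TFTP_BLKSIZE, 1024L); /* 1024-byte blocks */
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }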
curl could report success even though it failed to write a very small
download to disk (done in a single fwrite call). It turned out that fwrite()
returned success, but there was insufficient error-checking of the fclose()
call, which tricked curl into believing things were fine.
A patch by user 'koresh' introduced the --crlfile option to curl, which
makes curl tell libcurl about a file with CRL (certificate revocation list)
data to read.
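On the libcurl side this maps to CURLOPT_CRLFILE; a minimal sketch with a
made-up file name:

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
    curl_easy_setopt(curl, CURLOPT_CRLFILE, "revoked.pem"); /* PEM CRL data */
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
    return 0;
  }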
Specifying -T. (a single period as the file name) now makes curl read
stdin in a non-blocking fashion. This also brings back -T- (minus) to the
previous blocking behavior, since non-blocking reads could break things
for people at times.