Since automake 1.12.4, the following warning is issued when running automake:
warning: 'INCLUDES' is the old name for 'AM_CPPFLAGS' (or '*_CPPFLAGS')
Avoid INCLUDES and roll these flags into AM_CPPFLAGS.
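As an illustration (the include path below is made up for this example),
a Makefile.am line such as

  INCLUDES = -I$(top_srcdir)/include

simply becomes

  AM_CPPFLAGS = -I$(top_srcdir)/include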
Compile tested on:
Ubuntu 10.04 (automake 1:1.11.1-1)
Ubuntu 12.04 (automake 1:1.11.3-1ubuntu2)
Arch Linux (automake 1.12.4)
Previously the Metalink code used Apple's CommonCrypto library only if
curl was built using the --with-darwinssl option. Now we use CommonCrypto
on all Apple operating systems (Mac OS X Tiger or later, or iOS 5 or
later), so you don't need to build with --with-darwinssl anymore. Also
rolled out this change to libcurl's md5 code.
The makefile is designed to build against a libmetalink devel package;
therefore it does not matter what changes inside libmetalink.
Add OpenSSL includes and defines for libmetalink-aware OpenSSL builds.
In Metalink v3, the type attribute of the url element indicates the
type of the resource the URL points to. It can point to metadata,
such as a BitTorrent metainfo file. curl is not interested in these
metadata URLs; it is only interested in the HTTP and FTP URLs. This
change filters out non-HTTP/FTP URLs. Without the filter, curl would
download the metadata file, the hash check (if a hash is provided)
would fail and the next URL would be tried. This change cuts out
that useless network transfer.
Since Metalink support requires a crypto library for hash functions
and Windows comes with the builtin CryptoAPI, this patch adds that
API as a fallback to the supported crypto libraries.
It is automatically used on Windows if no other library is provided.
Since Windows/MinGW treats 0x1A as the EOF character, reading binary
files which contain that byte does not work in text mode. The read
function only reads up to the first 0x1A byte. This means that the
hash is not computed from the whole file and the final validation
check using hash comparison fails.
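A minimal sketch of the fix idea, assuming the file is read with stdio
(names here are placeholders, not the actual Metalink code): open the
file in binary mode so 0x1A is not treated as EOF.

  #include <stdio.h>

  /* sketch only: read the downloaded file with "rb" (binary mode) instead
     of "r" (text mode) so that a 0x1A byte does not stop reading on
     Windows/MinGW and the hash gets computed over the whole file */
  static int hash_whole_file(const char *filename)
  {
    unsigned char buf[4096];
    size_t n;
    FILE *f = fopen(filename, "rb");
    if(!f)
      return -1;
    while((n = fread(buf, 1, sizeof(buf), f)) > 0) {
      /* feed the n bytes in buf to the hash update function here */
    }
    fclose(f);
    return 0;
  }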
While validating a new Clang diagnostic (-Wnon-literal-null-conversion -
yes, the name isn't quite correct in this case, but it suffices) I found
a few violations of it in Curl.
1 - str2offset() no longer accepts negative numbers since offsets are by
nature positive.
2 - introduced str2unum() for command line parser values that are not
supposed to be negative, so that it properly complains about apparent
bad uses and mistakes.
Bug: http://curl.haxx.se/mail/archive-2012-07/0013.html
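A rough sketch of what such a str2unum() helper could look like
(illustrative only, not curl's actual implementation or error codes):

  #include <stdlib.h>
  #include <errno.h>

  /* parse a decimal number like str2num() would, but reject negative
     values so the option parser can complain about them;
     returns 0 on success, non-zero on error */
  static int str2unum(long *val, const char *str)
  {
    char *end;
    long num;
    errno = 0;
    num = strtol(str, &end, 10);
    if(errno || end == str || *end)
      return 1;      /* not a valid, complete number */
    if(num < 0)
      return 2;      /* negative values are not acceptable here */
    *val = num;
    return 0;
  }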
Print "parsing (...) OK" only when no warnings are generated. If
no file is found in Metalink, treat it FAILED.
If no digest is provided, print WARNING in parse_metalink().
Also print validating FAILED after download.
These changes make tests 2012 to 2016 pass.
Including headers in the response body would break the Metalink XML
parser. If headers get included in a file described in the Metalink
XML, the hash check will fail. Therefore, --include should be ignored
if --metalink is used.
The noprogress and isatty in Configurable are global, in the sense
that they persist in one curl invocation. Currently, once one
download writes its response data to a tty, they are set to FALSE
and are not restored for successive downloads. This change first
backs up the current noprogress and isatty, and restores them when
a download does not write its data to a tty.
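A simplified sketch of the backup/restore idea (the struct layout and
the writes_to_tty flag are placeholders for this sketch; only the
noprogress/isatty names come from the description above):

  #include <stdbool.h>

  struct Configurable {        /* heavily trimmed placeholder */
    bool noprogress;
    bool isatty;
  };

  static void one_download(struct Configurable *config, bool writes_to_tty)
  {
    /* remember the flags as they were before this particular download */
    bool orig_noprogress = config->noprogress;
    bool orig_isatty = config->isatty;

    /* ... perform the transfer; the transfer may change the flags when
       its response data goes to the terminal ... */

    if(!writes_to_tty) {
      /* this download did not write to a tty, so restore the flags and
         keep them from leaking into the downloads that follow */
      config->noprogress = orig_noprogress;
      config->isatty = orig_isatty;
    }
  }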
In this change, the --metalink option no longer takes an argument. If
it is specified, the given URIs are processed as a Metalink XML file.
If a given URI is remote (e.g., an http URI), curl downloads it
first. Regardless of whether the URI is a local file (e.g., the file
URI scheme) or remote, the Metalink XML file is not written to the
local file system and the received data is fed directly into the
Metalink XML parser. This means that with the --metalink option,
filename-related options like -O and -o are ignored.
Usage examples:
$ curl --metalink http://example.org/foo.metalink
This will download foo.metalink and parse it and then download
the URI described there.
$ curl --metalink file://foo.metalink
This will parse local file foo.metalink and then download the URI
described there.
When creating metalink_checksum from metalink_checksum_t, first
check that the hex digest is valid for the given hash function. We do
this check in the order of digest_aliases so that the first good
match (the strongest hash function available) is chosen. As a
result, the metalinkfile now contains at most one metalink_checksum,
because the other entries are just redundant.
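A sketch of the selection logic (the table below is illustrative; the
digest lengths are the usual hex lengths for these hashes, not copied
from curl's code):

  #include <ctype.h>
  #include <string.h>

  /* alias table ordered strongest hash first, so the first match wins */
  struct digest_alias {
    const char *name;
    size_t hex_len;                 /* expected hex digest length */
  };

  static const struct digest_alias digest_aliases[] = {
    { "sha-256", 64 },
    { "sha-1",   40 },
    { "md5",     32 },
  };

  /* return non-zero if digest is a valid hex string of the given length */
  static int check_hex_digest(const char *digest, size_t len)
  {
    size_t i;
    if(strlen(digest) != len)
      return 0;
    for(i = 0; i < len; i++)
      if(!isxdigit((unsigned char)digest[i]))
        return 0;
    return 1;
  }

Walking digest_aliases in this order and keeping only the first checksum
whose digest passes the check is what leaves at most one
metalink_checksum per file.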
Version number is removed in order to make this info consistent with
how we do it with other MS and Linux system libraries for which we don't
provide this info.
Identifier changed from 'WinSSPI' to 'schannel' given that this is the
actual provider of the SSL/TLS support. libcurl can still be built with
SSPI and without SCHANNEL support.
Added Windows SSPI version information to the curl version string when
SCHANNEL SSL is not enabled, as the version of the library should also
be included when SSPI is used to generate security contexts.
Removed SSPI from the feature list as the features are GSS-Negotiate,
NTLM and SSL depending on the usage of the SSPI library.
Additionally, make hash checking ability mandatory in order to allow metalink
support in curl.
A command line option could be introduced to skip hash checking at runtime,
but the ability to check hashes should always be built-in when providing
metalink support.
A Metalink file contains checksums of several hash types, such as
md5, sha-1, sha-256, etc. To deal with these checksums, I created an
abstraction layer based on lib/curl_md5.h and lib/md5.c. Basically,
they are almost the same, but I changed the code so that it is not
hash type dependent. Currently, GnuTLS (nettle or gcrypt) and OpenSSL
functions are supported.
Checksum checking is done by reopening the downloaded file. If there
is an I/O error, the current implementation just prints an error
message and does not try the next resource.
In this patch, the supported hash types are: md5, sha-1 and sha-256.
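The abstraction essentially becomes a small table of function pointers
per hash type, in the spirit of lib/curl_md5.h; a rough sketch (names
and signatures are illustrative, not the real curl types):

  #include <stddef.h>

  /* one descriptor per supported hash type; the crypto backend
     (OpenSSL, nettle or gcrypt) supplies the actual functions */
  struct hash_params {
    void (*init)(void *ctx);
    void (*update)(void *ctx, const unsigned char *data, size_t len);
    void (*final)(unsigned char *result, void *ctx);
    size_t ctx_size;       /* size of the backend's context object */
    size_t result_len;     /* 16 for md5, 20 for sha-1, 32 for sha-256 */
  };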
Filenames contained in a Metalink file can include directory
information. Filenames are unique in a Metalink file when the
directory information is taken into account, so we need to create the
directory hierarchy. Curl has the --create-dirs option, but we create
the directory hierarchy for Metalink downloads regardless of the
option value.
This patch also puts the metalink int variable outside of the
HAVE_LIBMETALINK guard, which reduces the number of #ifdefs.
This change adds experimental Metalink support to curl.
To enable Metalink support, run configure with --with-libmetalink.
To feed a Metalink file to curl, use the --metalink option like this:
$ curl -O --metalink foo.metalink
We use libmetalink to parse Metalink files.
To achieve this, a new structure, HeaderData, is first defined to hold
the data necessary to perform header-related work. tool_header_cb now
receives a HeaderData pointer as userdata. All header-related work
(currently, dumping headers and Content-Disposition inspection) is done
in this callback function. HeaderData.outs->config is used to determine
whether each piece of work is done.
Unit tests were also updated because after this change, curl code always
sets CURLOPT_HEADERFUNCTION and CURLOPT_HEADERDATA.
Tested with -O -J -D, -O -J -i and -O -J -D -i and all worked fine.
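In libcurl API terms the wiring looks roughly like this (HeaderData is
simplified to a bare sketch here; only the option names are the real
API):

  #include <curl/curl.h>

  struct OutStruct;                 /* output bookkeeping, opaque here */

  struct HeaderData {
    struct OutStruct *outs;         /* gives access to outs->config etc. */
  };

  static size_t tool_header_cb(char *ptr, size_t size, size_t nmemb,
                               void *userdata)
  {
    struct HeaderData *hd = userdata;
    /* dump the header and/or inspect Content-Disposition here,
       depending on what hd->outs->config asks for */
    (void)hd; (void)ptr;
    return size * nmemb;
  }

  static void install_header_cb(CURL *curl, struct HeaderData *hd)
  {
    curl_easy_setopt(curl, CURLOPT_HEADERFUNCTION, tool_header_cb);
    curl_easy_setopt(curl, CURLOPT_HEADERDATA, hd);
  }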
Explicit conversion to 'long' of the curl_easy_setopt() third argument
for the options CURLOPT_HTTPAUTH and CURLOPT_PROXYAUTH, given that this
is how their bitmasks are documented to be used.
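For example (CURLAUTH_BASIC and CURLAUTH_NTLM are just two of the
documented bitmask values):

  curl_easy_setopt(curl, CURLOPT_HTTPAUTH, (long)CURLAUTH_BASIC);
  curl_easy_setopt(curl, CURLOPT_PROXYAUTH, (long)CURLAUTH_NTLM);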
By checking whether a different "progress point" has been reached
since the previous update, the progress function callback now avoids
many superfluous screen updates. This has the nice side-effect of
fixing a problem that caused a second progress meter line.
The second line output happened because when we use the -# progress
meter, we force a newline output after the transfer in the main loop in
curl, but when libcurl calls the progress callback from
curl_easy_cleanup() it would then output the progress display
again. Possibly the naive newline output is wrong but this optimization
was suitable anyway...
Reported by: Daniel Theron
Bug: http://curl.haxx.se/bug/view.cgi?id=3517418
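The idea in sketch form (the 0..100 resolution and the names are
assumptions made here, not the exact code):

  /* remember the last "progress point" that was actually drawn */
  static int prev_point = -1;

  static void update_bar(double dlnow, double dltotal)
  {
    int point = (dltotal > 0.0) ? (int)(dlnow * 100.0 / dltotal) : 0;
    if(point == prev_point)
      return;              /* nothing new to show, skip the redraw */
    prev_point = point;
    /* ... redraw the -# progress bar here ... */
  }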
BUILDING_LIBCURL and CURL_STATICLIB are no longer defined in curl_config.h;
configure will generate appropriate conditionals so that the mentioned
symbols get defined and used in Makefiles at compilation time.
Configuration files such as curl_config.h and all config-*.h no longer exist
in, nor are generated/copied into, the 'src' directory; these now exist only
in the 'lib' directory, from where the curl tool sources use them.
Additionally, the old src/setup.h has been refactored into src/tool_setup.h,
which now pulls in lib/setup.h.
The possibility of a makefile needing an include path adjustment exists.
By modifying the parameter list for ourWriteOut() and passing the
OutStruct that collects data in tool_operate, we get access to the
remote name that we're writing to. Shell scripters should find this
useful when used in conjunction with the --remote-header-name option.
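For example, together with a write-out variable that exposes the
effective filename (curl offers %{filename_effective}), a script can
capture the name that was actually written:

$ curl -O -J -w '%{filename_effective}\n' http://example.com/download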
This patch improves the output of curl's --libcurl option by
generating code which builds curl_httppost and curl_slist lists, and
uses symbolic names for enum and flag values. Variants of the
my_setopt macro in tool_setopt.h are added in order to pass extra type
information to the code-generation step in tool_setopt.c.
If curl is configured with --disable-libcurl-option then the macros
call curl_easy_setopt directly.
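Usage example:

$ curl --libcurl example.c -F name=value http://example.com/upload

This writes a C program to example.c that performs the same form post,
with the curl_httppost list built in code and symbolic names used for
the option values.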
Fix the str2num() function to not check if the input string starts with a
digit, since strtol() supports numbers prepended with '-' (and '+') too.
This makes the --max-redirs option work as documented.
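With the fix, the documented way to allow unlimited redirects works
again:

$ curl -L --max-redirs -1 http://example.com/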
This new option tells curl to not work around a security flaw in the
SSL3 and TLS1.0 protocols. It uses the new libcurl option
CURLOPT_SSL_OPTIONS with the CURLSSLOPT_ALLOW_BEAST bit set.
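In libcurl terms the option boils down to setting:

  curl_easy_setopt(curl, CURLOPT_SSL_OPTIONS, (long)CURLSSLOPT_ALLOW_BEAST);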
Use the new library option CURLOPT_TCP_KEEPALIVE rather than disabling this via
the sockopt callback. If --keepalive-time is used, apply the value to
CURLOPT_TCP_KEEPIDLE and CURLOPT_TCP_KEEPINTVL.
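Roughly, the tool now does the equivalent of this (the 60 second value
stands in for whatever --keepalive-time was given):

  curl_easy_setopt(curl, CURLOPT_TCP_KEEPALIVE, 1L);
  /* only when --keepalive-time <seconds> was used: */
  curl_easy_setopt(curl, CURLOPT_TCP_KEEPIDLE, 60L);
  curl_easy_setopt(curl, CURLOPT_TCP_KEEPINTVL, 60L);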
We want to continue to the next URL to try even on failures returned
from libcurl. This makes -f with ranges still get subsequent URLs even
if occasional ones return error. This was a regression as it used to
work and broke in the 7.23.0 release.
Added test case 1328 to verify the fix.
Bug: http://curl.haxx.se/bug/view.cgi?id=3481223
Reported by: Juan Barreto
Test case 1315 was added to verify this functionality. When passing in
multiple files to a single -F, the parser would get all confused if one
of the specified files had a custom type= assigned.
Reported by: Colin Hogben
Skip a floating point addition operation when the integral part of the
time difference is zero. This avoids potential floating point addition
rounding problems while preserving the decimal part value.
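A sketch of what that means (the split into whole seconds and
microseconds is an assumption made for this illustration):

  /* illustrative only: combine the two parts without adding when the
     whole-second part is zero, keeping the fractional value untouched */
  static double timediff(long secs, long usecs)
  {
    if(secs)
      return (double)secs + (double)usecs/1000000.0;
    return (double)usecs/1000000.0;
  }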
A regression between 7.22.0 and 7.23.0 -- downloading a file with the
flags -O and -J results in the content being written to stdout if and
only if there was no Content-Disposition header in the http response. If
there is a C-D header with a filename attribute, the output is correctly
written.
Reported by: Dave Reisner
Bug: http://curl.haxx.se/mail/archive-2011-11/0030.html
The progress bar output function would blindly use the terminal width
without bounds checking. When using a very wide terminal that caused a
buffer overflow and segfault.
We now limit the max bar width to 255 columns, and I simplified the code
to avoid an extra snprintf and buffer.
Bug: http://curl.haxx.se/bug/view.cgi?id=3435710
Reported by: Alexey Zakhlestin
As commit 5850cc4808 clarifies, libcurl can deliver header lines that
are longer than CURL_MAX_WRITE_SIZE, only body data is limited to that
size. The curl tool has a check (when built debug-enabled) that made the
wrong checks, and this new test 1205 verifies that larger headers work.
Previously we required that -S/--show-error was used _after_
-s/--silent. This was slightly confusing since we strive to make
arguments as position independent as possible.
Now, you can use them in any order and the result should still be the
same.
Bug: http://curl.haxx.se/bug/view.cgi?id=3424286
Reported by: Andreas Olsson
The maximum amount of data a header callback is supposed to get in
a single call from libcurl is limited by the lowest value of
CURL_MAX_WRITE_SIZE and CURL_MAX_HTTP_HEADER.