Cookies with the same domain but a different tailmatching property are now
considered different and do not replace each other. If a header contains the
following lines, then two cookies will be set:
Set-Cookie: foo=bar; domain=.foo.com; expires=Thu Mar 3 GMT 8:56:27 2033
Set-Cookie: foo=baz; domain=foo.com; expires=Thu Mar 3 GMT 8:56:27 2033
This matches Chrome, Opera, Safari, and Firefox behavior. When sending the
stored cookies to foo.com, Chrome, Opera and Firefox send them in the stored
order, while Safari pre-sorts the cookies.
Closes #1050
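As a rough sketch, the two headers above could be fed to libcurl's cookie
engine like this (the handle setup is illustrative, not part of the change):
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_COOKIEFILE, ""); /* enable cookie engine */
    curl_easy_setopt(curl, CURLOPT_COOKIELIST, "Set-Cookie: foo=bar; "
                     "domain=.foo.com; expires=Thu Mar 3 GMT 8:56:27 2033");
    curl_easy_setopt(curl, CURLOPT_COOKIELIST, "Set-Cookie: foo=baz; "
                     "domain=foo.com; expires=Thu Mar 3 GMT 8:56:27 2033");
    /* both cookies are now kept side by side instead of the second one
       replacing the first */
    curl_easy_cleanup(curl);
  }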
Add the new option CURLOPT_KEEP_SENDING_ON_ERROR to control whether
sending the request body shall be completed when the server responds
early with an error status code.
This is suitable for manual NTLM authentication.
Reviewed-by: Jay Satiro
Closes https://github.com/curl/curl/pull/904
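A minimal sketch of the new option in use (the URL and request body are
made-up placeholders):
  CURL *curl = curl_easy_init();
  if(curl) {
    curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "request body");
    /* keep sending the request body even if the server answers early with
       an error status code, e.g. during manual NTLM authentication */
    curl_easy_setopt(curl, CURLOPT_KEEP_SENDING_ON_ERROR, 1L);
    curl_easy_perform(curl);
    curl_easy_cleanup(curl);
  }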
.. and add that --proto-redir and CURLOPT_REDIR_PROTOCOLS do not
override protocols denied by --proto and CURLOPT_PROTOCOLS.
- Add a test to enforce: --proto deny must override --proto-redir allow
Closes https://github.com/curl/curl/pull/1031
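In libcurl terms the documented precedence looks roughly like this, given an
already initialized easy handle 'curl' (the particular protocol masks are
just an example):
  /* only HTTP and HTTPS may be used at all */
  curl_easy_setopt(curl, CURLOPT_PROTOCOLS, CURLPROTO_HTTP | CURLPROTO_HTTPS);
  /* redirects are narrowed further; this cannot re-allow a protocol that
     CURLOPT_PROTOCOLS already denies */
  curl_easy_setopt(curl, CURLOPT_REDIR_PROTOCOLS, CURLPROTO_HTTPS);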
... like when an HTTP/0.9 response comes back without any headers at all and
just a body. This now prevents that body from being sent to the callback etc.
Adapted test 1144 to verify.
Fixes #973
Assisted-by: Ray Satiro
This only excludes building unit tests from the default build (the 'all' Make
target or "Build Solution" in Visual Studio). The projects and Make
targets will still be generated and shown in supporting IDEs.
Fixes https://github.com/curl/curl/issues/981
Reported-by: Randy Armstrong
Closes https://github.com/curl/curl/pull/990
Detect support for compiler symbol visibility flags and apply them
according to the CURL_HIDDEN_SYMBOLS option.
It should work true to the autotools build, except that it tries to unhide
symbols on Windows when requested and prints a warning if it fails.
Ref: https://github.com/curl/curl/issues/981#issuecomment-242665951
Reported-by: Daniel Stenberg
Since we're using CURLE_FTP_WEIRD_SERVER_REPLY in imap, pop3 and smtp as
more of a generic "failed to parse" error, introduce an alias without FTP in
the name.
Closes https://github.com/curl/curl/pull/975
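Assuming the alias is CURLE_WEIRD_SERVER_REPLY (the new name is not spelled
out above), callers could then check for it without the misleading FTP
prefix:
  CURLcode res = curl_easy_perform(curl);
  if(res == CURLE_WEIRD_SERVER_REPLY) {
    /* the IMAP, POP3, SMTP or FTP server sent a reply we failed to parse */
  }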
With HTTP/2, each transfer is made in an individual logical stream over the
connection, so most errors that previously caused the connection to get
force-closed now instead just kill the stream and not the connection.
Fixes #941
This fixes tests that were added after 113f04e664 as the tests would
fail otherwise.
We now bring back "Proxy-Connection: Keep-Alive" unconditionally to fix
regressions with old and stupid proxies, but we could possibly switch to
using it only for CONNECT or only for NTLM in the future if we want to
gradually reduce it.
Fixes #954
Reported-by: János Fekete
The CMake build now uses BUILD_TESTING=ON/OFF (default is OFF) to build the
tests and enables CTest integration. The BUILD_CURL_TESTS and
BUILD_DASHBOARD_REPORTS options were removed.
Closes #882
Reviewed-by: Brad King
The HTTP/2 tests brought with commit bf05606ef1 were using the internal
name 'http2' for the HTTP/2 server, while in fact that name was already
used for the second instance of the HTTP server. This made tests using
the second instance (like test 2050) fail after an HTTP/2 test had run.
The server is now known as HTTP/2 internally and within the <server>
section in test cases. Tests 1700, 1701 and 1702 were updated accordingly.
It requires that 'nghttpx' is in the PATH, and it will run the tests
using nghttpx as a front-end proxy in front of the standard HTTP/1 test
server. This uses HTTP/2 over plain TCP.
If you, like me, have nghttpx installed in a custom path, you can run test 1700
like this:
$ PATH=$PATH:$HOME/build-nghttp2/bin/ ./runtests.pl 1700
Mostly in order to support broken web sites that redirect to broken URLs
that are accepted by browsers.
Browsers are typically even more lenient than this, as the WHATWG URL spec
says they should allow an _infinite_ amount. I tested 8000 slashes with
Firefox and it just worked.
Added test cases 1141, 1142 and 1143 to verify the new parser.
Closes #791
Prior to this change a width arg could be erroneously output, and also
width and precision args could not be used together without crashing.
"%0*d%s", 2, 9, "foo"
Before: "092"
After: "09foo"
"%*.*s", 5, 2, "foo"
Before: crash
After: " fo"
Test 557 is updated to verify this and more
curl_printf.h defines printf to curl_mprintf, etc. This can cause
problems with external headers which may use
__attribute__((format(printf, ...))) markers etc.
To avoid them causing problems with system includes, we include
curl_printf.h after any system headers. That means these should always be
the last three headers included, and we keep them in this order:
curl_printf.h
curl_memory.h
memdebug.h
None of them include system headers; they all do funny #defines.
Reported-by: David Benjamin
Fixes #743
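As a sketch, the end of the include section in a libcurl source file then
looks like this, after all system headers:
  #include "curl_printf.h"
  #include "curl_memory.h"
  #include "memdebug.h"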
It does open up a minuscule risk that one of the other protocols that
libcurl could use would send back a Content-Disposition header and then
curl would act on it even if not HTTP.
A future mitigation for this risk would be to allow the callback to ask
libcurl which protocol is being used.
Verified with test 1312
Closes #760
This script now also scans src/tool_getparam.c, docs/curl.1 and
src/tool_help.c and will warn if any of them lists a command line option
not mentioned in one of the other places.
While still being debated (in #716), and despite being a violation of RFC
7230 section 5.4, this test verifies that the existing functionality works
as intended: it strips the dot from the host name and uses the host without
the dot throughout the internals.
... for checking the ability to receive a full HTTP response when a POST
request is used with a slow read callback function.
This test checks for bug #657 and verifies the work-around from
72d5e144fb.
Closes #720
warning: implicit declaration of function 'sprintf_was_used'
[-Wimplicit-function-declaration]
Follow-up to the modifications made to tests/libtest in commit 55452ebdff,
as we prefer not to use sprintf() now.
The define is not in our name space and is therefore not protected by
our API promises.
It was only really used by libcurl internals but was mostly erased from
there already in 8aabbf5 (March 2015). This is supposedly the final
death blow to that define from everywhere.
As a side effect of making sure _MPRINTF_REPLACE is gone and not used, I
made the lib tests in tests/libtest/ use curl_printf.h for its redefine
magic; subsequently the use of sprintf() got banned in the tests as well (as
it is in libcurl internals) and I replaced all its uses with snprintf().
In the unlikely event that any user is actually using this define and is
saddened by this change, it is very easily copied to the user's own
code.
Fixed failed redirection of stderr with some options. At least on Msys2,
perl fails to redirect stderr if $value contains a newline or other weird
characters.
It seems we may have some autobuild problems after this commit went
in. Trying to see if a revert helps to get the autobuilds working again.
This reverts commit 2716350d1f.
RFC 6265 section 4.1.1 spells out that the first name/value pair in the
header is the actual cookie name and content, while the following are
the parameters.
libcurl previously had a more liberal approach which causes significant
problems when introducing new cookie parameters, like the suggested new
cookie priority draft.
The previous logic read all n/v pairs from left-to-right and the first
name used that wasn't a known parameter name would be used as the
cookie name, thus accepting "Set-Cookie: Max-Age=2; person=daniel" to be
a cookie named 'person' while an RFC 6265 compliant parser should
consider that to be a cookie named 'Max-Age' with an (unknown) parameter
'person'.
Fixes #709
DSA is no longer supported by OpenSSH 7.0, which causes all SCP/SFTP
test cases to be skipped. Using RSA for host authentication works with
both old and new versions of OpenSSH.
Reported-by: Karlson2k
Closes #676
- Add tests.
- Add an example to CURLOPT_TFTP_NO_OPTIONS.3.
- Add --tftp-no-options to expose CURLOPT_TFTP_NO_OPTIONS.
Bug: https://github.com/curl/curl/issues/481
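A minimal sketch of the new libcurl option, given an initialized easy handle
'curl' (the URL is a made-up placeholder):
  curl_easy_setopt(curl, CURLOPT_URL, "tftp://example.com/file.bin");
  /* do not send TFTP options (RFC 2347) with the request */
  curl_easy_setopt(curl, CURLOPT_TFTP_NO_OPTIONS, 1L);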
It turns out Firefox and Chrome both allow spaces in cookie names and
there are sites out there using that.
Turned out the code meant to strip off trailing space from cookie names
didn't work. Fixed now.
Test case 8 modified to verify both these changes.
Closes #639
- Add unit test 1604 to test the sanitize_file_name function.
- Use -DCURL_STATICLIB when building libcurltool for unit testing.
- Better detection of reserved DOS device names.
- New flags to modify sanitize behavior:
SANITIZE_ALLOW_COLONS: Allow colons
SANITIZE_ALLOW_PATH: Allow path separators and colons
SANITIZE_ALLOW_RESERVED: Allow reserved device names
SANITIZE_ALLOW_TRUNCATE: Allow truncating a long filename
- Restore sanitization of banned characters from user-specified outfile.
Prior to this commit sanitization of a user-specified outfile was
temporarily disabled in 2b6dadc because there was no way to allow path
separators and colons through while replacing other banned characters.
Now in such a case we call the sanitize function with
SANITIZE_ALLOW_PATH which allows path separators and colons to pass
through.
Closes https://github.com/curl/curl/issues/624
Reported-by: Octavio Schroeder
It isn't used by the code in current conditions but for safety it seems
sensible to at least not crash on such input.
Extended unit test 1395 to verify this, as well as a plain "/" input.
Before this patch, if a URL does not start with the protocol
name/scheme, effective URLs would be prefixed with upper-case protocol
names/schemes. This behavior might not be expected by library users or
end users.
For example, if `CURLOPT_DEFAULT_PROTOCOL` is set to "https" and the URL is
"hostname/path", the effective URL would be "HTTPS://hostname/path" instead
of "https://hostname/path".
After this patch, effective URLs would be prefixed with a lower-case
protocol name/scheme.
Closes #597
Signed-off-by: Mohammad AlSaleh <CE.Mohammad.AlSaleh@gmail.com>
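A sketch of observing the new behavior through the API, given an initialized
easy handle 'curl' (everything except the two options and
CURLINFO_EFFECTIVE_URL is illustrative):
  char *effective = NULL;
  curl_easy_setopt(curl, CURLOPT_URL, "hostname/path");
  curl_easy_setopt(curl, CURLOPT_DEFAULT_PROTOCOL, "https");
  curl_easy_perform(curl);
  curl_easy_getinfo(curl, CURLINFO_EFFECTIVE_URL, &effective);
  /* effective now reads "https://hostname/path", with a lower-case scheme */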
The request needs to be read and sent in binary mode in order to use
CRLF instead of LF. Adding --upload-file - causes curl to read stdin
in binary mode.
Make this the default for the curl tool (if built with HTTP/2 powers
enabled) unless a specific HTTP version is requested on the command
line.
This should allow more users to get HTTP/2 powers without having to
change anything.
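The exact version enum is not named above; assuming the new tool default
corresponds to CURL_HTTP_VERSION_2TLS (attempt HTTP/2 for TLS connections,
stay on HTTP/1.1 otherwise), the libcurl equivalent would be roughly:
  curl_easy_setopt(curl, CURLOPT_HTTP_VERSION, (long)CURL_HTTP_VERSION_2TLS);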
Tests 842, 843, 844, 845, 887, 888, 889, 890, 946, 947, 948 and 949 fail
if a custom port number is specified via the -b option of runtests.pl.
Suggested-by: Kamil Dudka
Bug: http://curl.haxx.se/mail/lib-2015-12/0003.html
As the POP3 final and continuation responses both begin with a + character,
and both the finalcode and contcode variables in the SASLproto structure are
set as such, we cannot tell the difference between them when we are expecting
an optional continuation from the server, such as the following:
+ something else from the server
+OK final response
Disabled these tests until such time as we can tell the responses apart.
The hashes can vary between architectures (e.g. Sparc differs from x86_64).
This is not a fatal problem but just reduces the coverage of these white-box
tests, as the assumptions about which hash bucket each key falls into are no
longer valid.
- no point in repeating curl features that are already listed as features
in the curl -V output
- remove the port numbers/unix domain path from the output unless
verbose is used, as that is rarely interesting to users.
The tftpd test server now logs all received options and thus all TFTP
test cases need to match them exactly.
Extended test 283 to use and verify --tftp-blksize.
Apparently there are sites out there that do redirects to URLs they
provide in plain UTF-8 or similar. Browsers and wget %-encode such
headers when doing a subsequent request. Now libcurl does too.
Added test 1138 to verify.
Closes #473