Since the connection can be used by many independent requests at the same
time (with HTTP/2 or HTTP/3), things like the user-agent string and other
transfer-specific data MUST NOT be kept connection oriented, as that could
make requests use the wrong string. This struct data had lingered there due
to old HTTP/1 legacy thinking, where it didn't matter.
Fixes #5566
Closes #5567
Instead of discussing if there's value or meaning (implied or not) in
the colors, let's use words without the same possibly negative
associations.
Closes #5546
When the method is updated inside libcurl, the method as set by the user
must still be left unchanged, as otherwise repeated transfers with that same
handle might no longer execute the same operation!
This fixes the libcurl part of #5462
Test 1633 added to verify.
Closes #5499
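A minimal sketch of the situation this protects (URL and field data are
placeholders): the same easy handle performs two transfers, and a redirect
during the first one must not change what the second one does.

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/form");
      curl_easy_setopt(curl, CURLOPT_FOLLOWLOCATION, 1L);
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "name=daniel");

      curl_easy_perform(curl); /* a redirect may make libcurl switch method */
      curl_easy_perform(curl); /* must still start out as the same POST */

      curl_easy_cleanup(curl);
    }
    return 0;
  }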
A common set of functions instead of many separate implementations for
creating buffers that can grow when appending data to them. Existing
functionality has been ported over.
In my early basic testing, the total number of allocations seem at
roughly the same amount as before, possibly a few less.
See docs/DYNBUF.md for a description of the API.
Closes #5300
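The real interface is described in docs/DYNBUF.md. As a standalone
illustration only, not libcurl's actual code, the general pattern of a
growing buffer with a hard size cap looks roughly like this:

  #include <stdio.h>
  #include <stdlib.h>
  #include <string.h>

  struct dynbuf {
    char *mem;     /* allocated memory */
    size_t len;    /* bytes stored */
    size_t alloc;  /* bytes allocated */
    size_t toobig; /* hard cap: appends beyond this fail */
  };

  static void dyn_init(struct dynbuf *s, size_t toobig)
  {
    s->mem = NULL;
    s->len = s->alloc = 0;
    s->toobig = toobig;
  }

  static int dyn_addn(struct dynbuf *s, const void *data, size_t n)
  {
    if(s->len + n + 1 > s->toobig)
      return 1; /* would grow past the hard cap */
    if(s->len + n + 1 > s->alloc) {
      size_t newsize = s->alloc ? s->alloc * 2 : 64;
      char *p;
      while(newsize < s->len + n + 1)
        newsize *= 2;
      p = realloc(s->mem, newsize);
      if(!p)
        return 1;
      s->mem = p;
      s->alloc = newsize;
    }
    memcpy(s->mem + s->len, data, n);
    s->len += n;
    s->mem[s->len] = 0; /* keep null-terminated */
    return 0;
  }

  static void dyn_free(struct dynbuf *s)
  {
    free(s->mem);
    dyn_init(s, s->toobig);
  }

  int main(void)
  {
    struct dynbuf b;
    dyn_init(&b, 1024);
    dyn_addn(&b, "hello ", 6);
    dyn_addn(&b, "world", 5);
    printf("%s (%zu bytes)\n", b.mem, b.len);
    dyn_free(&b);
    return 0;
  }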
In a debug build, setting the environment variable "CURL_SMALLREQSEND" makes
the first HTTP request send no more bytes than the set amount, thereby
verifying that the logic for handling a split HTTP request send works
correctly.
We have logic that checks if a >= 400 response code arrives before the
upload is done; that logic got confused when the request wasn't "done" and
yet there was no more data to send!
Reported-by: IvanoG on github
Fixes #4996
Closes #5002
When doing a request with a body + Expect: 100-continue and the server
responds with a 417, the same request will be retried immediately
without the Expect: header.
Added test 357 to verify.
Also added a control instruction to tell the sws test server to not read
the request body if Expect: is present, which the new test 357 uses.
Reported-by: bramus on github
Fixes #4949
Closes #4964
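With the 417 retry described above, the old application-side workaround of
blanking the Expect header is no longer needed for such servers, but it
still shows how the header is controlled (URL and body are placeholders):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *hdrs = NULL;
      /* a header with nothing to the right of the colon is not sent at all */
      hdrs = curl_slist_append(hdrs, "Expect:");
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_POSTFIELDS, "moo");
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
      curl_easy_perform(curl);
      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
    }
    return 0;
  }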
It could accidentally let the connection get used by more than one
thread, leading to double-free and more.
Reported-by: Christopher Reid
Fixes #4544
Closes #4557
... and use internally. This function returns TIME_T_MAX instead of failure
if the parsed date is found to be larger than what can be represented,
TIME_T_MAX being the largest value curl can represent.
Reviewed-by: Daniel Gustafsson
Reported-by: JanB on github
Fixes #4152
Closes #4651
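The capped variant is internal. For reference, the public date parser is
used as below and returns -1 when it cannot parse or represent the date:

  #include <curl/curl.h>
  #include <stdio.h>

  int main(void)
  {
    time_t t = curl_getdate("Sun, 06 Nov 1994 08:49:37 GMT", NULL);
    if(t != (time_t)-1)
      printf("parsed: %lld\n", (long long)t);
    return 0;
  }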
The 'share object' only sets the storage area for cookies. The "cookie
engine" still needs to be enabled or activated using the normal cookie
options.
This caused the curl command line tool to accidentally use cookies
without having been told to, since curl switched to using shared cookies
in 7.66.0.
Test 1166 verifies
Updated test 506
Fixes #4429
Closes #4434
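A small sketch of the distinction (URL is a placeholder): sharing the cookie
storage does not by itself activate cookie handling; the cookie engine still
has to be switched on per handle.

  #include <curl/curl.h>

  int main(void)
  {
    CURLSH *share = curl_share_init();
    CURL *curl;
    curl_share_setopt(share, CURLSHOPT_SHARE, CURL_LOCK_DATA_COOKIE);

    curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
      curl_easy_setopt(curl, CURLOPT_SHARE, share);
      /* sharing the storage alone does not switch cookies on; enable the
         cookie engine explicitly (empty file name reads no file) */
      curl_easy_setopt(curl, CURLOPT_COOKIEFILE, "");
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    curl_share_cleanup(share);
    return 0;
  }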
When a username and password are provided in the URL, they were wrongly
removed from the stored URL so that subsequent uses of the same URL wouldn't
find the credentials. This made doing HTTP auth with multiple connections
(like Digest) misbehave.
Regression from 46e164069d (7.62.0)
Test case 335 added to verify.
Reported-by: Mike Crowe
Fixes #4228
Closes #4229
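A sketch of the kind of use this fixes (host and credentials are
placeholders): credentials embedded in the URL must remain available for the
auth handshake on every connection the transfer ends up using.

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL,
                       "https://user:secret@example.com/protected");
      curl_easy_setopt(curl, CURLOPT_HTTPAUTH, (long)CURLAUTH_DIGEST);
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }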
RFC 7838 section 5:
When using an alternative service, clients SHOULD include an Alt-Used
header field in all requests.
Removed CURLALTSVC_ALTUSED again (feature is still EXPERIMENTAL thus
this is deemed ok).
You can disable sending this header just like you disable any other HTTP
header in libcurl.
Closes #4199
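For example, suppressing it works the same way as for any other header
libcurl adds on its own (URL is a placeholder):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *hdrs = NULL;
      /* a blank header of the same name stops it from being sent */
      hdrs = curl_slist_append(hdrs, "Alt-Used:");
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
      curl_easy_perform(curl);
      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
    }
    return 0;
  }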
Even though it cannot fall back to a lower HTTP version automatically, the
safer way to upgrade remains via CURLOPT_ALTSVC.
CURLOPT_H3 no longer has any bits that do anything and might be removed
before we remove the experimental label.
Updated the curl tool accordingly to use "--http3".
Closes #4197
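A minimal sketch, assuming a libcurl built with one of the experimental
HTTP/3 backends (URL is a placeholder):

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/");
      /* force HTTP/3 for this transfer; there is no automatic fallback to a
         lower HTTP version, so this fails if the server cannot do HTTP/3 */
      curl_easy_setopt(curl, CURLOPT_HTTP_VERSION, (long)CURL_HTTP_VERSION_3);
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }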
This is only the libcurl part that provides the information. There's no
user of the parsed value. This change includes three new tests for the
parser.
Ref: #3794
It was used (intended) to pass in the size of the 'socks' array that is also
passed to these functions, but it was rarely actually checked or used, and
the array is defined to a fixed size of MAX_SOCKSPEREASYHANDLE entries that
should be used instead.
Closes #4169
If using the read callback for HTTP_POST, and POSTFIELDSIZE is not set,
automatically add a Transfer-Encoding: chunked header, same as it is
already done for HTTP_PUT, HTTP_POST_FORM and HTTP_POST_MIME. Update
test 1514 according to the new behaviour.
Closes #4138
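A small sketch of the case this covers (URL and payload are placeholders): a
POST fed from a read callback with no POSTFIELDSIZE set now goes out
chunked.

  #include <curl/curl.h>
  #include <string.h>

  static const char *payload = "field=value";
  static size_t pos = 0;

  static size_t read_cb(char *buf, size_t size, size_t nitems, void *userdata)
  {
    size_t room = size * nitems;
    size_t left = strlen(payload) - pos;
    size_t n = left < room ? left : room;
    (void)userdata;
    memcpy(buf, payload + pos, n);
    pos += n;
    return n;
  }

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/upload");
      curl_easy_setopt(curl, CURLOPT_POST, 1L);
      curl_easy_setopt(curl, CURLOPT_READFUNCTION, read_cb);
      /* no CURLOPT_POSTFIELDSIZE set: the body is sent with
         Transfer-Encoding: chunked */
      curl_easy_perform(curl);
      curl_easy_cleanup(curl);
    }
    return 0;
  }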
Use configure --with-ngtcp2 or --with-quiche.
Using either option will enable an HTTP/3 build.
Co-authored-by: Alessandro Ghedini <alessandro@ghedini.me>
Closes #3500
With CURLOPT_TIMECONDITION set, a header is automatically added (e.g.
If-Modified-Since). Allow this to be replaced or suppressed with
CURLOPT_HTTPHEADER.
Fixes #4103
Closes #4109
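A sketch of both sides of this (URL and timestamp are placeholders):
CURLOPT_TIMECONDITION adds the conditional header automatically, and a blank
header of the same name in CURLOPT_HTTPHEADER suppresses it, while a
non-blank one replaces it.

  #include <curl/curl.h>

  int main(void)
  {
    CURL *curl = curl_easy_init();
    if(curl) {
      struct curl_slist *hdrs = NULL;
      curl_easy_setopt(curl, CURLOPT_URL, "https://example.com/doc");
      curl_easy_setopt(curl, CURLOPT_TIMECONDITION,
                       (long)CURL_TIMECOND_IFMODSINCE);
      curl_easy_setopt(curl, CURLOPT_TIMEVALUE, 1566210680L);
      /* blank header name suppresses the automatically added one */
      hdrs = curl_slist_append(hdrs, "If-Modified-Since:");
      curl_easy_setopt(curl, CURLOPT_HTTPHEADER, hdrs);
      curl_easy_perform(curl);
      curl_slist_free_all(hdrs);
      curl_easy_cleanup(curl);
    }
    return 0;
  }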
There were a few leftover prototypes of Curl_ functions that we used to
export but no longer do; this removes those prototypes and cleans up any
comments still referring to them.
Curl_write32_le(), Curl_strcpy_url(), Curl_strlen_url(), Curl_up_free()
Curl_concat_url(), Curl_detach_connnection(), Curl_http_setup_conn()
were made static in 05b100aee2.
Curl_http_perhapsrewind() made static in 574aecee20.
For the remainder, I didn't trawl the Git logs hard enough to capture
their exact time of deletion, but they were all gone: Curl_splayprint(),
Curl_http2_send_request(), Curl_global_host_cache_dtor(),
Curl_scan_cache_used(), Curl_hostcache_destroy(), Curl_second_connect(),
Curl_http_auth_stage() and Curl_close_connections().
Closes #4096
Reviewed-by: Daniel Stenberg <daniel@haxx.se>
The header buffer size calculation can, from static analysis, seem to
overflow as it performs an addition between two size_t variables and stores
the result in a size_t variable. Overflow is however guarded against
elsewhere, since the input to the addition is regulated by the maximum read
buffer size. Clarify this with a comment since the question was asked.
Reviewed-by: Daniel Stenberg <daniel@haxx.se>
To make sure an HTTP/2 stream registers the end of stream.
Bug #4043 made me find this problem but this fix doesn't correct the
reported issue.
Closes #4068
Responses with status codes 1xx, 204 or 304 don't have a response body. For
these, don't parse these headers:
- Content-Encoding
- Content-Length
- Content-Range
- Last-Modified
- Transfer-Encoding
This change ensures that HTTP/2 upgrades work even if a
"Content-Length: 0" or a "Transfer-Encoding: chunked" header is present.
Co-authored-by: Daniel Stenberg
Closes #3702
Fixes #3968
Closes #3977
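As an illustration only, not libcurl's actual code, the check boils down to
something like:

  #include <stdbool.h>
  #include <stdio.h>

  /* responses with these status codes never carry a body, so the headers
     listed above are ignored for them */
  static bool bodyless_status(int status)
  {
    return (status >= 100 && status < 200) || status == 204 || status == 304;
  }

  int main(void)
  {
    printf("204: %d, 200: %d\n", bodyless_status(204), bodyless_status(200));
    return 0;
  }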
They serve very little purpose and mostly just add noise. Most of them
have been around for a very long time. I read them all before removing
or rephrasing them.
Ref: #3876
Closes #3883