MANUAL: update examples to resolve without redirects

www.netscape.com is redirecting to a cookie consent form on Aol, and
cool.haxx.se is no longer responding to FTP. Replace them with examples
that resolve, in case users try out the commands while reading the
manual.

Closes #6024
Reviewed-by: Daniel Stenberg <daniel@haxx.se>
Reviewed-by: Emil Engler <me@emilengler.com>
Author: Daniel Gustafsson
Date:   2020-09-30 21:05:14 +02:00
parent  025b20971c
commit  021f2c25fd
1 changed file with 8 additions and 8 deletions


@@ -2,9 +2,9 @@
 ## Simple Usage

-Get the main page from Netscape's web-server:
+Get the main page from a web-server:

-    curl http://www.netscape.com/
+    curl https://www.example.com/

 Get the README file the user's home directory at funet's ftp-server:
@@ -16,7 +16,7 @@ Get a web page from a server using port 8000:
 Get a directory listing of an FTP site:

-    curl ftp://cool.haxx.se/
+    curl ftp://ftp.funet.fi

 Get the definition of curl from a dictionary:
@@ -24,7 +24,7 @@ Get the definition of curl from a dictionary:
 Fetch two documents at once:

-    curl ftp://cool.haxx.se/ http://www.weirdserver.com:8000/
+    curl ftp://ftp.funet.fi/ http://www.weirdserver.com:8000/

 Get a file off an FTPS server:
@@ -61,13 +61,13 @@ Get a file from an SMB server:
 Get a web page and store in a local file with a specific name:

-    curl -o thatpage.html http://www.netscape.com/
+    curl -o thatpage.html http://www.example.com/

 Get a web page and store in a local file, make the local file get the name of
 the remote document (if no file name part is specified in the URL, this will
 fail):

-    curl -O http://www.netscape.com/index.html
+    curl -O http://www.example.com/index.html

 Fetch two files and store them with their remote names:
@@ -657,11 +657,11 @@ Download with `PORT` but use 192.168.0.10 as our IP address to use:
 Get a web page from a server using a specified port for the interface:

-    curl --interface eth0:1 http://www.netscape.com/
+    curl --interface eth0:1 http://www.example.com/

 or

-    curl --interface 192.168.1.10 http://www.netscape.com/
+    curl --interface 192.168.1.10 http://www.example.com/

 ## HTTPS
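As a quick sanity check (not part of the commit itself), one can confirm that the replacement hosts actually resolve before trying the manual's commands. This is a hedged sketch using `getent` (available on Linux systems); the host names are the ones this change introduces:

```shell
# Sketch: check that the replacement example hosts from this commit
# resolve in DNS, so the manual's commands can actually be tried out.
for host in www.example.com ftp.funet.fi; do
  if getent hosts "$host" >/dev/null 2>&1; then
    echo "$host resolves"
  else
    echo "$host does not resolve"
  fi
done
```

`www.example.com` is reserved by IANA for documentation use, which is why it is a safer choice for manual examples than a real company's domain.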