big last-beta (?) cleanup commit

This commit is contained in:
Daniel Stenberg 2000-07-31 22:42:34 +00:00
parent 513ac758da
commit c3c7739811
17 changed files with 987 additions and 401 deletions

24
CHANGES
View File

@ -7,6 +7,27 @@
History of Changes
Version 7.0.11beta
Daniel (1 August 2000)
- Albert Chin-A-Young pointed out that 'make install' did not properly create
the header include directory, which is why it failed to install the header
files as it should. Automake isn't really equipped to deal with
subdirectories without Makefiles in any nice way. I had to run ahead and add
Makefiles in both include and include/curl before I managed to create a
top-level makefile that succeeds in installing everything properly!
- Ok, no more "features" added now. Let's just verify that no major flaws
have been added.
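The Makefiles mentioned above might look roughly like this; the contents are
a sketch based on common automake conventions, not the commit's actual files:

```makefile
# include/Makefile.am (hypothetical sketch): just recurse into curl/
SUBDIRS = curl

# include/curl/Makefile.am (hypothetical sketch): install the public
# headers into $(includedir)/curl; the exact header list is assumed
pkgincludedir = $(includedir)/curl
pkginclude_HEADERS = curl.h easy.h
```

With both subdirectory Makefiles in place, the top-level SUBDIRS recursion
can reach the headers and 'make install' creates the include directory.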
Daniel (31 July 2000)
- Both Jeff Schasny and Ketil Froyn asked me how to tell curl not to send one
of those internally generated headers. They weren't satisfied with the blank
ones you could tell curl to use. I rewrote the header-replace stuff a
little. Now, if you replace an internal header with your own and the new one
is a blank header, you only remove the internal one and don't get any blank
one. I couldn't figure out any case where you'd want that blank header.
Daniel (29 July 2000)
- It struck me that the lib used localtime() which is not thread-safe, so now
I use localtime_r() on the systems that have it.
@ -14,7 +35,8 @@ Daniel (29 July 2000)
- I went through this entire document and removed all email addresses and left
names only. I've really made an effort to always note who brought me bug
reports or fixes, but more and more people ask me to remove the email
addresses since they become victims of spam this way. Gordon Beaton got me
working on this.
Daniel (27 July 2000)
- Jörn Hartroth found out that when you specified a HTTP proxy in an

4
FILES
View File

@ -51,5 +51,9 @@ lib/*am
lib/Makefile.vc6
lib/*m32
include/README
include/Makefile.in
include/Makefile.am
include/curl/*.h
include/curl/Makefile.in
include/curl/Makefile.am

View File

@ -6,5 +6,5 @@ AUTOMAKE_OPTIONS = foreign no-dependencies
EXTRA_DIST = curl.spec curl-ssl.spec
SUBDIRS = docs lib src
SUBDIRS = docs lib src include

413
aclocal.m4 vendored
View File

@ -125,3 +125,416 @@ else
fi
AC_SUBST($1)])
# serial 40 AC_PROG_LIBTOOL
AC_DEFUN(AC_PROG_LIBTOOL,
[AC_REQUIRE([AC_LIBTOOL_SETUP])dnl
# Save cache, so that ltconfig can load it
AC_CACHE_SAVE
# Actually configure libtool. ac_aux_dir is where install-sh is found.
CC="$CC" CFLAGS="$CFLAGS" CPPFLAGS="$CPPFLAGS" \
LD="$LD" LDFLAGS="$LDFLAGS" LIBS="$LIBS" \
LN_S="$LN_S" NM="$NM" RANLIB="$RANLIB" \
DLLTOOL="$DLLTOOL" AS="$AS" OBJDUMP="$OBJDUMP" \
${CONFIG_SHELL-/bin/sh} $ac_aux_dir/ltconfig --no-reexec \
$libtool_flags --no-verify $ac_aux_dir/ltmain.sh $lt_target \
|| AC_MSG_ERROR([libtool configure failed])
# Reload cache, that may have been modified by ltconfig
AC_CACHE_LOAD
# This can be used to rebuild libtool when needed
LIBTOOL_DEPS="$ac_aux_dir/ltconfig $ac_aux_dir/ltmain.sh"
# Always use our own libtool.
LIBTOOL='$(SHELL) $(top_builddir)/libtool'
AC_SUBST(LIBTOOL)dnl
# Redirect the config.log output again, so that the ltconfig log is not
# clobbered by the next message.
exec 5>>./config.log
])
AC_DEFUN(AC_LIBTOOL_SETUP,
[AC_PREREQ(2.13)dnl
AC_REQUIRE([AC_ENABLE_SHARED])dnl
AC_REQUIRE([AC_ENABLE_STATIC])dnl
AC_REQUIRE([AC_ENABLE_FAST_INSTALL])dnl
AC_REQUIRE([AC_CANONICAL_HOST])dnl
AC_REQUIRE([AC_CANONICAL_BUILD])dnl
AC_REQUIRE([AC_PROG_RANLIB])dnl
AC_REQUIRE([AC_PROG_CC])dnl
AC_REQUIRE([AC_PROG_LD])dnl
AC_REQUIRE([AC_PROG_NM])dnl
AC_REQUIRE([AC_PROG_LN_S])dnl
dnl
case "$target" in
NONE) lt_target="$host" ;;
*) lt_target="$target" ;;
esac
# Check for any special flags to pass to ltconfig.
libtool_flags="--cache-file=$cache_file"
test "$enable_shared" = no && libtool_flags="$libtool_flags --disable-shared"
test "$enable_static" = no && libtool_flags="$libtool_flags --disable-static"
test "$enable_fast_install" = no && libtool_flags="$libtool_flags --disable-fast-install"
test "$ac_cv_prog_gcc" = yes && libtool_flags="$libtool_flags --with-gcc"
test "$ac_cv_prog_gnu_ld" = yes && libtool_flags="$libtool_flags --with-gnu-ld"
ifdef([AC_PROVIDE_AC_LIBTOOL_DLOPEN],
[libtool_flags="$libtool_flags --enable-dlopen"])
ifdef([AC_PROVIDE_AC_LIBTOOL_WIN32_DLL],
[libtool_flags="$libtool_flags --enable-win32-dll"])
AC_ARG_ENABLE(libtool-lock,
[ --disable-libtool-lock avoid locking (might break parallel builds)])
test "x$enable_libtool_lock" = xno && libtool_flags="$libtool_flags --disable-lock"
test x"$silent" = xyes && libtool_flags="$libtool_flags --silent"
# Some flags need to be propagated to the compiler or linker for good
# libtool support.
case "$lt_target" in
*-*-irix6*)
# Find out which ABI we are using.
echo '[#]line __oline__ "configure"' > conftest.$ac_ext
if AC_TRY_EVAL(ac_compile); then
case "`/usr/bin/file conftest.o`" in
*32-bit*)
LD="${LD-ld} -32"
;;
*N32*)
LD="${LD-ld} -n32"
;;
*64-bit*)
LD="${LD-ld} -64"
;;
esac
fi
rm -rf conftest*
;;
*-*-sco3.2v5*)
# On SCO OpenServer 5, we need -belf to get full-featured binaries.
SAVE_CFLAGS="$CFLAGS"
CFLAGS="$CFLAGS -belf"
AC_CACHE_CHECK([whether the C compiler needs -belf], lt_cv_cc_needs_belf,
[AC_TRY_LINK([],[],[lt_cv_cc_needs_belf=yes],[lt_cv_cc_needs_belf=no])])
if test x"$lt_cv_cc_needs_belf" != x"yes"; then
# this is probably gcc 2.8.0, egcs 1.0 or newer; no need for -belf
CFLAGS="$SAVE_CFLAGS"
fi
;;
ifdef([AC_PROVIDE_AC_LIBTOOL_WIN32_DLL],
[*-*-cygwin* | *-*-mingw*)
AC_CHECK_TOOL(DLLTOOL, dlltool, false)
AC_CHECK_TOOL(AS, as, false)
AC_CHECK_TOOL(OBJDUMP, objdump, false)
;;
])
esac
])
# AC_LIBTOOL_DLOPEN - enable checks for dlopen support
AC_DEFUN(AC_LIBTOOL_DLOPEN, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])])
# AC_LIBTOOL_WIN32_DLL - declare package support for building win32 dll's
AC_DEFUN(AC_LIBTOOL_WIN32_DLL, [AC_BEFORE([$0], [AC_LIBTOOL_SETUP])])
# AC_ENABLE_SHARED - implement the --enable-shared flag
# Usage: AC_ENABLE_SHARED[(DEFAULT)]
# Where DEFAULT is either `yes' or `no'. If omitted, it defaults to
# `yes'.
AC_DEFUN(AC_ENABLE_SHARED, [dnl
define([AC_ENABLE_SHARED_DEFAULT], ifelse($1, no, no, yes))dnl
AC_ARG_ENABLE(shared,
changequote(<<, >>)dnl
<< --enable-shared[=PKGS] build shared libraries [default=>>AC_ENABLE_SHARED_DEFAULT],
changequote([, ])dnl
[p=${PACKAGE-default}
case "$enableval" in
yes) enable_shared=yes ;;
no) enable_shared=no ;;
*)
enable_shared=no
# Look at the argument we got. We use all the common list separators.
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}:,"
for pkg in $enableval; do
if test "X$pkg" = "X$p"; then
enable_shared=yes
fi
done
IFS="$ac_save_ifs"
;;
esac],
enable_shared=AC_ENABLE_SHARED_DEFAULT)dnl
])
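The PKGS-parsing loop in AC_ENABLE_SHARED above can be exercised as a
standalone shell sketch; the sample values below are made up:

```shell
# Mimic --enable-shared=foo,curl:bar being parsed for package "curl".
# The value is split on space, colon and comma, and enable_shared flips
# to yes only if our package name appears in the list.
p="curl"                       # ${PACKAGE-default} in the macro
enableval="foo,curl:bar"       # the user's --enable-shared argument
enable_shared=no
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}:,"
for pkg in $enableval; do
  if test "X$pkg" = "X$p"; then
    enable_shared=yes
  fi
done
IFS="$ac_save_ifs"
echo "$enable_shared"
```

Saving and restoring IFS is essential: leaving the extra separators in place
would corrupt word splitting for the rest of the configure script.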
# AC_DISABLE_SHARED - set the default shared flag to --disable-shared
AC_DEFUN(AC_DISABLE_SHARED, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])dnl
AC_ENABLE_SHARED(no)])
# AC_ENABLE_STATIC - implement the --enable-static flag
# Usage: AC_ENABLE_STATIC[(DEFAULT)]
# Where DEFAULT is either `yes' or `no'. If omitted, it defaults to
# `yes'.
AC_DEFUN(AC_ENABLE_STATIC, [dnl
define([AC_ENABLE_STATIC_DEFAULT], ifelse($1, no, no, yes))dnl
AC_ARG_ENABLE(static,
changequote(<<, >>)dnl
<< --enable-static[=PKGS] build static libraries [default=>>AC_ENABLE_STATIC_DEFAULT],
changequote([, ])dnl
[p=${PACKAGE-default}
case "$enableval" in
yes) enable_static=yes ;;
no) enable_static=no ;;
*)
enable_static=no
# Look at the argument we got. We use all the common list separators.
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}:,"
for pkg in $enableval; do
if test "X$pkg" = "X$p"; then
enable_static=yes
fi
done
IFS="$ac_save_ifs"
;;
esac],
enable_static=AC_ENABLE_STATIC_DEFAULT)dnl
])
# AC_DISABLE_STATIC - set the default static flag to --disable-static
AC_DEFUN(AC_DISABLE_STATIC, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])dnl
AC_ENABLE_STATIC(no)])
# AC_ENABLE_FAST_INSTALL - implement the --enable-fast-install flag
# Usage: AC_ENABLE_FAST_INSTALL[(DEFAULT)]
# Where DEFAULT is either `yes' or `no'. If omitted, it defaults to
# `yes'.
AC_DEFUN(AC_ENABLE_FAST_INSTALL, [dnl
define([AC_ENABLE_FAST_INSTALL_DEFAULT], ifelse($1, no, no, yes))dnl
AC_ARG_ENABLE(fast-install,
changequote(<<, >>)dnl
<< --enable-fast-install[=PKGS] optimize for fast installation [default=>>AC_ENABLE_FAST_INSTALL_DEFAULT],
changequote([, ])dnl
[p=${PACKAGE-default}
case "$enableval" in
yes) enable_fast_install=yes ;;
no) enable_fast_install=no ;;
*)
enable_fast_install=no
# Look at the argument we got. We use all the common list separators.
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}:,"
for pkg in $enableval; do
if test "X$pkg" = "X$p"; then
enable_fast_install=yes
fi
done
IFS="$ac_save_ifs"
;;
esac],
enable_fast_install=AC_ENABLE_FAST_INSTALL_DEFAULT)dnl
])
# AC_DISABLE_FAST_INSTALL - set the default to --disable-fast-install
AC_DEFUN(AC_DISABLE_FAST_INSTALL, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])dnl
AC_ENABLE_FAST_INSTALL(no)])
# AC_PROG_LD - find the path to the GNU or non-GNU linker
AC_DEFUN(AC_PROG_LD,
[AC_ARG_WITH(gnu-ld,
[ --with-gnu-ld assume the C compiler uses GNU ld [default=no]],
test "$withval" = no || with_gnu_ld=yes, with_gnu_ld=no)
AC_REQUIRE([AC_PROG_CC])dnl
AC_REQUIRE([AC_CANONICAL_HOST])dnl
AC_REQUIRE([AC_CANONICAL_BUILD])dnl
ac_prog=ld
if test "$ac_cv_prog_gcc" = yes; then
# Check if gcc -print-prog-name=ld gives a path.
AC_MSG_CHECKING([for ld used by GCC])
ac_prog=`($CC -print-prog-name=ld) 2>&5`
case "$ac_prog" in
# Accept absolute paths.
changequote(,)dnl
[\\/]* | [A-Za-z]:[\\/]*)
re_direlt='/[^/][^/]*/\.\./'
changequote([,])dnl
# Canonicalize the path of ld
ac_prog=`echo $ac_prog| sed 's%\\\\%/%g'`
while echo $ac_prog | grep "$re_direlt" > /dev/null 2>&1; do
ac_prog=`echo $ac_prog| sed "s%$re_direlt%/%"`
done
test -z "$LD" && LD="$ac_prog"
;;
"")
# If it fails, then pretend we aren't using GCC.
ac_prog=ld
;;
*)
# If it is relative, then search for the first ld in PATH.
with_gnu_ld=unknown
;;
esac
elif test "$with_gnu_ld" = yes; then
AC_MSG_CHECKING([for GNU ld])
else
AC_MSG_CHECKING([for non-GNU ld])
fi
AC_CACHE_VAL(ac_cv_path_LD,
[if test -z "$LD"; then
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}${PATH_SEPARATOR-:}"
for ac_dir in $PATH; do
test -z "$ac_dir" && ac_dir=.
if test -f "$ac_dir/$ac_prog" || test -f "$ac_dir/$ac_prog$ac_exeext"; then
ac_cv_path_LD="$ac_dir/$ac_prog"
# Check to see if the program is GNU ld. I'd rather use --version,
# but apparently some GNU ld's only accept -v.
# Break only if it was the GNU/non-GNU ld that we prefer.
if "$ac_cv_path_LD" -v 2>&1 < /dev/null | egrep '(GNU|with BFD)' > /dev/null; then
test "$with_gnu_ld" != no && break
else
test "$with_gnu_ld" != yes && break
fi
fi
done
IFS="$ac_save_ifs"
else
ac_cv_path_LD="$LD" # Let the user override the test with a path.
fi])
LD="$ac_cv_path_LD"
if test -n "$LD"; then
AC_MSG_RESULT($LD)
else
AC_MSG_RESULT(no)
fi
test -z "$LD" && AC_MSG_ERROR([no acceptable ld found in \$PATH])
AC_PROG_LD_GNU
])
AC_DEFUN(AC_PROG_LD_GNU,
[AC_CACHE_CHECK([if the linker ($LD) is GNU ld], ac_cv_prog_gnu_ld,
[# I'd rather use --version here, but apparently some GNU ld's only accept -v.
if $LD -v 2>&1 </dev/null | egrep '(GNU|with BFD)' 1>&5; then
ac_cv_prog_gnu_ld=yes
else
ac_cv_prog_gnu_ld=no
fi])
])
# AC_PROG_NM - find the path to a BSD-compatible name lister
AC_DEFUN(AC_PROG_NM,
[AC_MSG_CHECKING([for BSD-compatible nm])
AC_CACHE_VAL(ac_cv_path_NM,
[if test -n "$NM"; then
# Let the user override the test.
ac_cv_path_NM="$NM"
else
IFS="${IFS= }"; ac_save_ifs="$IFS"; IFS="${IFS}${PATH_SEPARATOR-:}"
for ac_dir in $PATH /usr/ccs/bin /usr/ucb /bin; do
test -z "$ac_dir" && ac_dir=.
if test -f $ac_dir/nm || test -f $ac_dir/nm$ac_exeext ; then
# Check to see if the nm accepts a BSD-compat flag.
# Adding the `sed 1q' prevents false positives on HP-UX, which says:
# nm: unknown option "B" ignored
if ($ac_dir/nm -B /dev/null 2>&1 | sed '1q'; exit 0) | egrep /dev/null >/dev/null; then
ac_cv_path_NM="$ac_dir/nm -B"
break
elif ($ac_dir/nm -p /dev/null 2>&1 | sed '1q'; exit 0) | egrep /dev/null >/dev/null; then
ac_cv_path_NM="$ac_dir/nm -p"
break
else
ac_cv_path_NM=${ac_cv_path_NM="$ac_dir/nm"} # keep the first match, but
continue # so that we can try to find one that supports BSD flags
fi
fi
done
IFS="$ac_save_ifs"
test -z "$ac_cv_path_NM" && ac_cv_path_NM=nm
fi])
NM="$ac_cv_path_NM"
AC_MSG_RESULT([$NM])
])
# AC_CHECK_LIBM - check for math library
AC_DEFUN(AC_CHECK_LIBM,
[AC_REQUIRE([AC_CANONICAL_HOST])dnl
LIBM=
case "$lt_target" in
*-*-beos* | *-*-cygwin*)
# These systems don't have libm
;;
*-ncr-sysv4.3*)
AC_CHECK_LIB(mw, _mwvalidcheckl, LIBM="-lmw")
AC_CHECK_LIB(m, main, LIBM="$LIBM -lm")
;;
*)
AC_CHECK_LIB(m, main, LIBM="-lm")
;;
esac
])
# AC_LIBLTDL_CONVENIENCE[(dir)] - sets LIBLTDL to the link flags for
# the libltdl convenience library and INCLTDL to the include flags for
# the libltdl header and adds --enable-ltdl-convenience to the
# configure arguments. Note that LIBLTDL and INCLTDL are not
# AC_SUBSTed, nor is AC_CONFIG_SUBDIRS called. If DIR is not
# provided, it is assumed to be `libltdl'. LIBLTDL will be prefixed
# with '${top_builddir}/' and INCLTDL will be prefixed with
# '${top_srcdir}/' (note the single quotes!). If your package is not
# flat and you're not using automake, define top_builddir and
# top_srcdir appropriately in the Makefiles.
AC_DEFUN(AC_LIBLTDL_CONVENIENCE, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])dnl
case "$enable_ltdl_convenience" in
no) AC_MSG_ERROR([this package needs a convenience libltdl]) ;;
"") enable_ltdl_convenience=yes
ac_configure_args="$ac_configure_args --enable-ltdl-convenience" ;;
esac
LIBLTDL='${top_builddir}/'ifelse($#,1,[$1],['libltdl'])/libltdlc.la
INCLTDL='-I${top_srcdir}/'ifelse($#,1,[$1],['libltdl'])
])
# AC_LIBLTDL_INSTALLABLE[(dir)] - sets LIBLTDL to the link flags for
# the libltdl installable library and INCLTDL to the include flags for
# the libltdl header and adds --enable-ltdl-install to the configure
# arguments. Note that LIBLTDL and INCLTDL are not AC_SUBSTed, nor is
# AC_CONFIG_SUBDIRS called. If DIR is not provided and an installed
# libltdl is not found, it is assumed to be `libltdl'. LIBLTDL will
# be prefixed with '${top_builddir}/' and INCLTDL will be prefixed
# with '${top_srcdir}/' (note the single quotes!). If your package is
# not flat and you're not using automake, define top_builddir and
# top_srcdir appropriately in the Makefiles.
# In the future, this macro may have to be called after AC_PROG_LIBTOOL.
AC_DEFUN(AC_LIBLTDL_INSTALLABLE, [AC_BEFORE([$0],[AC_LIBTOOL_SETUP])dnl
AC_CHECK_LIB(ltdl, main,
[test x"$enable_ltdl_install" != xyes && enable_ltdl_install=no],
[if test x"$enable_ltdl_install" = xno; then
AC_MSG_WARN([libltdl not installed, but installation disabled])
else
enable_ltdl_install=yes
fi
])
if test x"$enable_ltdl_install" = x"yes"; then
ac_configure_args="$ac_configure_args --enable-ltdl-install"
LIBLTDL='${top_builddir}/'ifelse($#,1,[$1],['libltdl'])/libltdl.la
INCLTDL='-I${top_srcdir}/'ifelse($#,1,[$1],['libltdl'])
else
ac_configure_args="$ac_configure_args --enable-ltdl-install=no"
LIBLTDL="-lltdl"
INCLTDL=
fi
])
dnl old names
AC_DEFUN(AM_PROG_LIBTOOL, [indir([AC_PROG_LIBTOOL])])dnl
AC_DEFUN(AM_ENABLE_SHARED, [indir([AC_ENABLE_SHARED], $@)])dnl
AC_DEFUN(AM_ENABLE_STATIC, [indir([AC_ENABLE_STATIC], $@)])dnl
AC_DEFUN(AM_DISABLE_SHARED, [indir([AC_DISABLE_SHARED], $@)])dnl
AC_DEFUN(AM_DISABLE_STATIC, [indir([AC_DISABLE_STATIC], $@)])dnl
AC_DEFUN(AM_PROG_LD, [indir([AC_PROG_LD])])dnl
AC_DEFUN(AM_PROG_NM, [indir([AC_PROG_NM])])dnl
dnl This is just to silence aclocal about the macro not being used
ifelse([AC_DISABLE_FAST_INSTALL])dnl

View File

@ -82,6 +82,9 @@
/* Define if you have the inet_ntoa_r function. */
#undef HAVE_INET_NTOA_R
/* Define if you have the localtime_r function. */
#undef HAVE_LOCALTIME_R
/* Define if you have the perror function. */
#undef HAVE_PERROR

View File

@ -2,7 +2,7 @@ dnl $Id$
dnl Process this file with autoconf to produce a configure script.
AC_INIT(lib/urldata.h)
AM_CONFIG_HEADER(config.h src/config.h)
AM_INIT_AUTOMAKE(curl,"7.0.10beta")
AM_INIT_AUTOMAKE(curl,"7.0.11test")
AM_PROG_LIBTOOL
dnl
@ -339,6 +339,8 @@ AC_OUTPUT( Makefile \
curl.spec \
curl-ssl.spec \
docs/Makefile \
include/Makefile \
include/curl/Makefile \
src/Makefile \
lib/Makefile )
dnl perl/checklinks.pl \

View File

@ -8,31 +8,41 @@ CONTRIBUTE
To Think About When Contributing Source Code
This document is intended to offer some guidelines that can be useful to keep
in mind when you decide to write a contribution to the project. This concerns
new features as well as corrections to existing flaws or bugs.
The License Issue
When contributing with code, you agree to put your changes and new code under
the same license curl and libcurl are already using. Curl uses the MozPL, the
Mozilla Public License, which is *NOT* compatible with the well-known GPL,
the GNU Public License. We can never re-use sources from a GPL program in
curl. If you add a larger piece of code, you can opt to make that file or set
of files use a different license, as long as they don't enforce any changes
to the rest of the package. Such "separate parts" can not be GPL either.
Naming
Try using a non-confusing naming scheme for your new functions and variable
names. It doesn't necessarily have to mean that you should use the same as in
other places of the code, just that the names should be logical,
understandable and be named according to what they're used for.
Indenting
Please try using the same indenting levels and bracing method as all the
other code already does. It makes the source code a lot easier to follow if
all of it is written using the same style. I don't ask you to like it, I just
ask you to follow the tradition! ;-)
Commenting
Comment your source code extensively. I don't see myself as a very good
source commenter, but I try to become one. Commented code is quality code and
enables future modifications much more. Uncommented code runs a much greater
risk of being completely replaced when someone wants to extend things, since
other people's source code can get quite hard to read.
General Style
@ -41,10 +51,10 @@ General Style
Non-clobbering All Over
When you write new functionality or fix bugs, it is important that you don't
fiddle all over the source files and functions. Remember that it is likely
that other people have done changes in the same source files as you have and
possibly even in the same functions.
functionality, try writing it in a new source file. If you fix bugs, try to
fix one bug at a time and send them as separate patches.
@ -61,10 +71,10 @@ Separate Patches Doing Different Things
Document
Writing docs is dead boring and one of the big problems with many open source
projects. Someone's gotta do it. It makes it a lot easier if you submit a
small description of your fix or your new features with every contribution so
that it can be swiftly added to the package documentation.
Write Access to CVS Repository

View File

@ -122,9 +122,9 @@ FAQ
9. Why do I get problems when I use & in the URL?
In general unix shells, the & character is treated specially and when used it
runs the specified command in the background. To safely send the & as a part
of a URL, you should quote the entire URL by using single (') or double (")
quotes around it.
An example that would invoke a remote CGI that uses &-letters could be:
@ -132,8 +132,8 @@ FAQ
10. How can I use {, }, [ or ] to specify multiple URLs?
Because those letters have a special meaning to the shell, you must quote
them to use them in a URL specified to curl.
An example that downloads two URLs (sequentially) would do:
@ -150,10 +150,10 @@ FAQ
12. Why do I get downloaded data even though the web page doesn't exist?
Curl asks remote servers for the page you specify. If the page doesn't exist
at the server, the HTTP protocol defines how the server should respond and
that means that headers and a "page" will be returned. That's simply how
HTTP works.
By using the --fail option you can tell curl explicitly to not get any data
if the HTTP return code doesn't say success.
@ -165,26 +165,25 @@ FAQ
10.4.4 403 Forbidden
The server understood the request, but is refusing to fulfill it.
Authorization will not help and the request SHOULD NOT be repeated. If the
request method was not HEAD and the server wishes to make public why the
request has not been fulfilled, it SHOULD describe the reason for the
refusal in the entity. If the server does not wish to make this information
available to the client, the status code 404 (Not Found) can be used
instead.
14. How can I disable the Pragma: nocache header?
You can change all internally generated headers by adding a replacement with
the -H/--header option. By adding a header with empty contents you safely
disable that header. Use -H "Pragma:" to disable that specific header.
15. Can you tell me what error code 142 means?
All error codes that are larger than the highest documented error code mean
that curl has exited due to a timeout. There is currently no nice way for
curl to abort from such a condition and that's why it gets this undocumented
error. This is planned to change in a future release.
16. How do I keep usernames and passwords secret in Curl command lines?
@ -192,7 +191,7 @@ FAQ
The first part is to avoid having clear-text passwords in the command line
so that they don't appear in 'ps' outputs and similar. That is easily
avoided by using the "-K" option to tell curl to read parameters from a
file or stdin to which you can pass the secret info.
To keep the passwords in your account secret from the rest of the world is
@ -204,8 +203,8 @@ FAQ
To curl, all contents are alike. It doesn't matter how the page was
generated. It may be ASP, PHP, perl, shell-script, SSI or plain
HTML-files. There's no difference to curl and it doesn't even know what kind
of language generated the page.
Javascript is slightly different since that is code embedded in the HTML
that is sent for the client to interpret and curl has no javascript
@ -224,3 +223,37 @@ FAQ
curl -O ftp://download.com/coolfile -Q '-DELE coolfile'
20. Can I use curl/libcurl in my program licensed under XXX?
Curl and libcurl are released under the MPL, the Mozilla Public License. To
get a really good answer to this or other licensing questions, you should
study the MPL license and the license you are about to use and check for
clashes yourself. This is a brief summary for a few cases for which we get
questions:
I have a GPL program, can I use the libcurl library?
No. GPL'd software requires all parts of the final executable to be
licensed under GPL.
I have a closed-source program, can I use the libcurl library?
Yes, libcurl does not put any restrictions on the program that uses the
library. If you end up doing changes to the library, only those changes
must be made available, not the ones to your program.
I have a program that uses LGPL libraries, can I use libcurl?
Yes you can. LGPL libraries don't spread to other libraries the same way
GPL ones do.
Can I modify curl/libcurl for my own program and keep the changes secret?
No, you're not allowed to do that.
Can you please change the curl/libcurl license to XXXX?
No. We carefully picked this license years ago and a large number of
people have contributed with source code knowing that this is the license
we use. This license puts the restrictions we want on curl/libcurl and it
does not spread to other programs or libraries.

View File

@ -31,7 +31,7 @@ HTTP
- custom HTTP request
- cookie get/send
- understands the netscape cookie file
- custom headers (that can replace internally generated headers)
- custom headers (that can replace/remove internally generated headers)
- custom user-agent string
- custom referer string
- range
@ -57,7 +57,7 @@ FTP
- upload via http-proxy as HTTP PUT
- download resume
- upload resume
- QUOT commands (before and/or after the transfer)
- custom ftp commands (before and/or after the transfer)
- simple "range" support
- via http-proxy

View File

@ -82,7 +82,7 @@ Software
Similar Tools
-------------
wget - ftp://prep.ai.mit.edu/pub/gnu/
wget - http://www.gnu.org/software/wget/wget.html
snarf - http://www.xach.com/snarf/
@ -90,8 +90,6 @@ Similar Tools
swebget - http://www.uni-hildesheim.de/~smol0075/swebget/
fetch - ?
Related Software
----------------
ftpparse - http://cr.yp.to/ftpparse.html parses FTP LIST responses
@ -105,3 +103,5 @@ Related Software
gzip - http://www.gnu.org/software/gzip/gzip.html
tar - http://www.gnu.org/software/tar/tar.html
libtool - http://www.gnu.org/software/libtool/libtool.html

View File

@ -414,8 +414,8 @@ char *curl_getenv(char *variable);
char *curl_version(void);
/* This is the version number */
#define LIBCURL_VERSION "7.0.8beta"
#define LIBCURL_VERSION_NUM 0x070008
#define LIBCURL_VERSION "7.0.11test"
#define LIBCURL_VERSION_NUM 0x07000b
/* linked-list structure for the CURLOPT_QUOTE option */
struct curl_slist {

View File

@ -63,9 +63,14 @@ PRE_UNINSTALL = :
POST_UNINSTALL = :
host_alias = @host_alias@
host_triplet = @host@
AS = @AS@
CC = @CC@
DLLTOOL = @DLLTOOL@
LIBTOOL = @LIBTOOL@
LN_S = @LN_S@
MAKEINFO = @MAKEINFO@
NROFF = @NROFF@
OBJDUMP = @OBJDUMP@
PACKAGE = @PACKAGE@
PERL = @PERL@
RANLIB = @RANLIB@
@ -74,35 +79,37 @@ YACC = @YACC@
AUTOMAKE_OPTIONS = foreign no-dependencies
noinst_LIBRARIES = libcurl.a
lib_LTLIBRARIES = libcurl.la
# Some flags needed when trying to cause warnings ;-)
CFLAGS = -g -Wall #-pedantic
# CFLAGS = -g -Wall #-pedantic
INCLUDES = -I$(top_srcdir)/include
libcurl_a_SOURCES = arpa_telnet.h file.c getpass.h netrc.h timeval.c base64.c file.h hostip.c progress.c timeval.h base64.h formdata.c hostip.h progress.h cookie.c formdata.h http.c sendf.c cookie.h ftp.c http.h sendf.h url.c dict.c ftp.h if2ip.c speedcheck.c url.h dict.h getdate.c if2ip.h speedcheck.h urldata.h download.c getdate.h ldap.c ssluse.c version.c download.h getenv.c ldap.h ssluse.h escape.c getenv.h mprintf.c telnet.c escape.h getpass.c netrc.c telnet.h writeout.c writeout.h highlevel.c strequal.c strequal.h easy.c
libcurl_la_SOURCES = arpa_telnet.h file.c getpass.h netrc.h timeval.c base64.c file.h hostip.c progress.c timeval.h base64.h formdata.c hostip.h progress.h cookie.c formdata.h http.c sendf.c cookie.h ftp.c http.h sendf.h url.c dict.c ftp.h if2ip.c speedcheck.c url.h dict.h getdate.c if2ip.h speedcheck.h urldata.h download.c getdate.h ldap.c ssluse.c version.c download.h getenv.c ldap.h ssluse.h escape.c getenv.h mprintf.c telnet.c escape.h getpass.c netrc.c telnet.h writeout.c writeout.h highlevel.c strequal.c strequal.h easy.c
mkinstalldirs = $(SHELL) $(top_srcdir)/mkinstalldirs
CONFIG_HEADER = ../config.h ../src/config.h
CONFIG_CLEAN_FILES =
LIBRARIES = $(noinst_LIBRARIES)
LTLIBRARIES = $(lib_LTLIBRARIES)
DEFS = @DEFS@ -I. -I$(srcdir) -I.. -I../src
CPPFLAGS = @CPPFLAGS@
LDFLAGS = @LDFLAGS@
LIBS = @LIBS@
libcurl_a_LIBADD =
libcurl_a_OBJECTS = file.o timeval.o base64.o hostip.o progress.o \
formdata.o cookie.o http.o sendf.o ftp.o url.o dict.o if2ip.o \
speedcheck.o getdate.o download.o ldap.o ssluse.o version.o getenv.o \
escape.o mprintf.o telnet.o getpass.o netrc.o writeout.o highlevel.o \
strequal.o easy.o
AR = ar
libcurl_la_LDFLAGS =
libcurl_la_LIBADD =
libcurl_la_OBJECTS = file.lo timeval.lo base64.lo hostip.lo progress.lo \
formdata.lo cookie.lo http.lo sendf.lo ftp.lo url.lo dict.lo if2ip.lo \
speedcheck.lo getdate.lo download.lo ldap.lo ssluse.lo version.lo \
getenv.lo escape.lo mprintf.lo telnet.lo getpass.lo netrc.lo \
writeout.lo highlevel.lo strequal.lo easy.lo
CFLAGS = @CFLAGS@
COMPILE = $(CC) $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS)
LTCOMPILE = $(LIBTOOL) --mode=compile $(CC) $(DEFS) $(INCLUDES) $(AM_CPPFLAGS) $(CPPFLAGS) $(AM_CFLAGS) $(CFLAGS)
CCLD = $(CC)
LINK = $(CCLD) $(AM_CFLAGS) $(CFLAGS) $(LDFLAGS) -o $@
LINK = $(LIBTOOL) --mode=link $(CCLD) $(AM_CFLAGS) $(CFLAGS) $(LDFLAGS) -o $@
DIST_COMMON = Makefile.am Makefile.in
@ -110,28 +117,44 @@ DISTFILES = $(DIST_COMMON) $(SOURCES) $(HEADERS) $(TEXINFOS) $(EXTRA_DIST)
TAR = gtar
GZIP_ENV = --best
SOURCES = $(libcurl_a_SOURCES)
OBJECTS = $(libcurl_a_OBJECTS)
SOURCES = $(libcurl_la_SOURCES)
OBJECTS = $(libcurl_la_OBJECTS)
all: all-redirect
.SUFFIXES:
.SUFFIXES: .S .c .o .s
.SUFFIXES: .S .c .lo .o .s
$(srcdir)/Makefile.in: Makefile.am $(top_srcdir)/configure.in $(ACLOCAL_M4)
cd $(top_srcdir) && $(AUTOMAKE) --foreign lib/Makefile
cd $(top_srcdir) && $(AUTOMAKE) --foreign --include-deps lib/Makefile
Makefile: $(srcdir)/Makefile.in $(top_builddir)/config.status
cd $(top_builddir) \
&& CONFIG_FILES=$(subdir)/$@ CONFIG_HEADERS= $(SHELL) ./config.status
mostlyclean-noinstLIBRARIES:
mostlyclean-libLTLIBRARIES:
clean-noinstLIBRARIES:
-test -z "$(noinst_LIBRARIES)" || rm -f $(noinst_LIBRARIES)
clean-libLTLIBRARIES:
-test -z "$(lib_LTLIBRARIES)" || rm -f $(lib_LTLIBRARIES)
distclean-noinstLIBRARIES:
distclean-libLTLIBRARIES:
maintainer-clean-noinstLIBRARIES:
maintainer-clean-libLTLIBRARIES:
install-libLTLIBRARIES: $(lib_LTLIBRARIES)
@$(NORMAL_INSTALL)
$(mkinstalldirs) $(DESTDIR)$(libdir)
@list='$(lib_LTLIBRARIES)'; for p in $$list; do \
if test -f $$p; then \
echo "$(LIBTOOL) --mode=install $(INSTALL) $$p $(DESTDIR)$(libdir)/$$p"; \
$(LIBTOOL) --mode=install $(INSTALL) $$p $(DESTDIR)$(libdir)/$$p; \
else :; fi; \
done
uninstall-libLTLIBRARIES:
@$(NORMAL_UNINSTALL)
list='$(lib_LTLIBRARIES)'; for p in $$list; do \
$(LIBTOOL) --mode=uninstall rm -f $(DESTDIR)$(libdir)/$$p; \
done
.c.o:
$(COMPILE) -c $<
@ -152,10 +175,27 @@ distclean-compile:
maintainer-clean-compile:
libcurl.a: $(libcurl_a_OBJECTS) $(libcurl_a_DEPENDENCIES)
-rm -f libcurl.a
$(AR) cru libcurl.a $(libcurl_a_OBJECTS) $(libcurl_a_LIBADD)
$(RANLIB) libcurl.a
.c.lo:
$(LIBTOOL) --mode=compile $(COMPILE) -c $<
.s.lo:
$(LIBTOOL) --mode=compile $(COMPILE) -c $<
.S.lo:
$(LIBTOOL) --mode=compile $(COMPILE) -c $<
mostlyclean-libtool:
-rm -f *.lo
clean-libtool:
-rm -rf .libs _libs
distclean-libtool:
maintainer-clean-libtool:
libcurl.la: $(libcurl_la_OBJECTS) $(libcurl_la_DEPENDENCIES)
$(LINK) -rpath $(libdir) $(libcurl_la_LDFLAGS) $(libcurl_la_OBJECTS) $(libcurl_la_LIBADD) $(LIBS)
tags: TAGS
@ -209,7 +249,7 @@ check-am: all-am
check: check-am
installcheck-am:
installcheck: installcheck-am
install-exec-am:
install-exec-am: install-libLTLIBRARIES
install-exec: install-exec-am
install-data-am:
@ -218,13 +258,14 @@ install-data: install-data-am
install-am: all-am
@$(MAKE) $(AM_MAKEFLAGS) install-exec-am install-data-am
install: install-am
uninstall-am:
uninstall-am: uninstall-libLTLIBRARIES
uninstall: uninstall-am
all-am: Makefile $(LIBRARIES)
all-am: Makefile $(LTLIBRARIES)
all-redirect: all-am
install-strip:
$(MAKE) $(AM_MAKEFLAGS) AM_INSTALL_PROGRAM_FLAGS=-s install
installdirs:
$(mkinstalldirs) $(DESTDIR)$(libdir)
mostlyclean-generic:
@ -236,33 +277,39 @@ distclean-generic:
-rm -f config.cache config.log stamp-h stamp-h[0-9]*
maintainer-clean-generic:
mostlyclean-am: mostlyclean-noinstLIBRARIES mostlyclean-compile \
mostlyclean-tags mostlyclean-generic
mostlyclean-am: mostlyclean-libLTLIBRARIES mostlyclean-compile \
mostlyclean-libtool mostlyclean-tags \
mostlyclean-generic
mostlyclean: mostlyclean-am
clean-am: clean-noinstLIBRARIES clean-compile clean-tags clean-generic \
mostlyclean-am
clean-am: clean-libLTLIBRARIES clean-compile clean-libtool clean-tags \
clean-generic mostlyclean-am
clean: clean-am
distclean-am: distclean-noinstLIBRARIES distclean-compile \
distclean-tags distclean-generic clean-am
distclean-am: distclean-libLTLIBRARIES distclean-compile \
distclean-libtool distclean-tags distclean-generic \
clean-am
-rm -f libtool
distclean: distclean-am
maintainer-clean-am: maintainer-clean-noinstLIBRARIES \
maintainer-clean-compile maintainer-clean-tags \
maintainer-clean-generic distclean-am
maintainer-clean-am: maintainer-clean-libLTLIBRARIES \
maintainer-clean-compile maintainer-clean-libtool \
maintainer-clean-tags maintainer-clean-generic \
distclean-am
@echo "This command is intended for maintainers to use;"
@echo "it deletes files that may require special tools to rebuild."
maintainer-clean: maintainer-clean-am
.PHONY: mostlyclean-noinstLIBRARIES distclean-noinstLIBRARIES \
clean-noinstLIBRARIES maintainer-clean-noinstLIBRARIES \
mostlyclean-compile distclean-compile clean-compile \
maintainer-clean-compile tags mostlyclean-tags distclean-tags \
.PHONY: mostlyclean-libLTLIBRARIES distclean-libLTLIBRARIES \
clean-libLTLIBRARIES maintainer-clean-libLTLIBRARIES \
uninstall-libLTLIBRARIES install-libLTLIBRARIES mostlyclean-compile \
distclean-compile clean-compile maintainer-clean-compile \
mostlyclean-libtool distclean-libtool clean-libtool \
maintainer-clean-libtool tags mostlyclean-tags distclean-tags \
clean-tags maintainer-clean-tags distdir info-am info dvi-am dvi check \
check-am installcheck-am installcheck install-exec-am install-exec \
install-data-am install-data install-am install uninstall-am uninstall \
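The generated rules above switch the build from a static `ar`/`ranlib` archive to libtool objects (`.lo`) and an installable `libcurl.la` archive. As a rough sketch (a hypothetical minimal Makefile, not part of this commit; object names are stand-ins), the pattern the automake output implements boils down to:

```make
# Hypothetical minimal sketch of the libtool pattern used above:
# each .c is compiled to a .lo (a PIC/non-PIC object pair), then
# linked into a .la libtool archive that install mode can relink.
LIBTOOL = libtool
CC      = gcc
libdir  = /usr/local/lib
OBJS    = url.lo ftp.lo http.lo   # stand-ins for the real object list

.SUFFIXES: .c .lo
.c.lo:
	$(LIBTOOL) --mode=compile $(CC) -c $<

libcurl.la: $(OBJS)
	$(LIBTOOL) --mode=link $(CC) -o $@ $(OBJS) -rpath $(libdir)

install: libcurl.la
	$(LIBTOOL) --mode=install install -c libcurl.la $(DESTDIR)$(libdir)/libcurl.la
```

The `-rpath` at link time is what tells libtool the library is meant to be installed (and thus to build the shared variant), which is why the generated `libcurl.la` rule above passes `-rpath $(libdir)`.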


@ -32,6 +32,10 @@
** This code is in the public domain and has no copyright.
*/
#define _REENTRANT /* Necessary to use in Solaris, since the silly guys at Sun
made the localtime_r() prototype dependent on it (or
_POSIX_C_SOURCE or _POSIX_PTHREAD_SEMANTICS). */
#ifdef HAVE_CONFIG_H
# include "config.h"
# ifdef HAVE_ALLOCA_H
@ -215,7 +219,7 @@ static int yyRelSeconds;
static int yyRelYear;
#line 198 "getdate.y"
#line 202 "getdate.y"
typedef union {
int Number;
enum _MERIDIAN Meridian;
@ -298,11 +302,11 @@ static const short yyrhs[] = { -1,
#if YYDEBUG != 0
static const short yyrline[] = { 0,
214, 215, 218, 221, 224, 227, 230, 233, 236, 242,
248, 257, 263, 275, 278, 281, 287, 291, 295, 301,
305, 323, 329, 335, 339, 344, 348, 355, 363, 366,
369, 372, 375, 378, 381, 384, 387, 390, 393, 396,
399, 402, 405, 408, 411, 414, 417, 422, 455, 459
218, 219, 222, 225, 228, 231, 234, 237, 240, 246,
252, 261, 267, 279, 282, 285, 291, 295, 299, 305,
309, 327, 333, 339, 343, 348, 352, 359, 367, 370,
373, 376, 379, 382, 385, 388, 391, 394, 397, 400,
403, 406, 409, 412, 415, 418, 421, 426, 459, 463
};
#endif
@ -926,37 +930,37 @@ yyreduce:
switch (yyn) {
case 3:
#line 218 "getdate.y"
#line 222 "getdate.y"
{
yyHaveTime++;
;
break;}
case 4:
#line 221 "getdate.y"
#line 225 "getdate.y"
{
yyHaveZone++;
;
break;}
case 5:
#line 224 "getdate.y"
#line 228 "getdate.y"
{
yyHaveDate++;
;
break;}
case 6:
#line 227 "getdate.y"
#line 231 "getdate.y"
{
yyHaveDay++;
;
break;}
case 7:
#line 230 "getdate.y"
#line 234 "getdate.y"
{
yyHaveRel++;
;
break;}
case 9:
#line 236 "getdate.y"
#line 240 "getdate.y"
{
yyHour = yyvsp[-1].Number;
yyMinutes = 0;
@ -965,7 +969,7 @@ case 9:
;
break;}
case 10:
#line 242 "getdate.y"
#line 246 "getdate.y"
{
yyHour = yyvsp[-3].Number;
yyMinutes = yyvsp[-1].Number;
@ -974,7 +978,7 @@ case 10:
;
break;}
case 11:
#line 248 "getdate.y"
#line 252 "getdate.y"
{
yyHour = yyvsp[-3].Number;
yyMinutes = yyvsp[-1].Number;
@ -986,7 +990,7 @@ case 11:
;
break;}
case 12:
#line 257 "getdate.y"
#line 261 "getdate.y"
{
yyHour = yyvsp[-5].Number;
yyMinutes = yyvsp[-3].Number;
@ -995,7 +999,7 @@ case 12:
;
break;}
case 13:
#line 263 "getdate.y"
#line 267 "getdate.y"
{
yyHour = yyvsp[-5].Number;
yyMinutes = yyvsp[-3].Number;
@ -1008,53 +1012,53 @@ case 13:
;
break;}
case 14:
#line 275 "getdate.y"
#line 279 "getdate.y"
{
yyTimezone = yyvsp[0].Number;
;
break;}
case 15:
#line 278 "getdate.y"
#line 282 "getdate.y"
{
yyTimezone = yyvsp[0].Number - 60;
;
break;}
case 16:
#line 282 "getdate.y"
#line 286 "getdate.y"
{
yyTimezone = yyvsp[-1].Number - 60;
;
break;}
case 17:
#line 287 "getdate.y"
#line 291 "getdate.y"
{
yyDayOrdinal = 1;
yyDayNumber = yyvsp[0].Number;
;
break;}
case 18:
#line 291 "getdate.y"
#line 295 "getdate.y"
{
yyDayOrdinal = 1;
yyDayNumber = yyvsp[-1].Number;
;
break;}
case 19:
#line 295 "getdate.y"
#line 299 "getdate.y"
{
yyDayOrdinal = yyvsp[-1].Number;
yyDayNumber = yyvsp[0].Number;
;
break;}
case 20:
#line 301 "getdate.y"
#line 305 "getdate.y"
{
yyMonth = yyvsp[-2].Number;
yyDay = yyvsp[0].Number;
;
break;}
case 21:
#line 305 "getdate.y"
#line 309 "getdate.y"
{
/* Interpret as YYYY/MM/DD if $1 >= 1000, otherwise as MM/DD/YY.
The goal in recognizing YYYY/MM/DD is solely to support legacy
@ -1075,7 +1079,7 @@ case 21:
;
break;}
case 22:
#line 323 "getdate.y"
#line 327 "getdate.y"
{
/* ISO 8601 format. yyyy-mm-dd. */
yyYear = yyvsp[-2].Number;
@ -1084,7 +1088,7 @@ case 22:
;
break;}
case 23:
#line 329 "getdate.y"
#line 333 "getdate.y"
{
/* e.g. 17-JUN-1992. */
yyDay = yyvsp[-2].Number;
@ -1093,14 +1097,14 @@ case 23:
;
break;}
case 24:
#line 335 "getdate.y"
#line 339 "getdate.y"
{
yyMonth = yyvsp[-1].Number;
yyDay = yyvsp[0].Number;
;
break;}
case 25:
#line 339 "getdate.y"
#line 343 "getdate.y"
{
yyMonth = yyvsp[-3].Number;
yyDay = yyvsp[-2].Number;
@ -1108,14 +1112,14 @@ case 25:
;
break;}
case 26:
#line 344 "getdate.y"
#line 348 "getdate.y"
{
yyMonth = yyvsp[0].Number;
yyDay = yyvsp[-1].Number;
;
break;}
case 27:
#line 348 "getdate.y"
#line 352 "getdate.y"
{
yyMonth = yyvsp[-1].Number;
yyDay = yyvsp[-2].Number;
@ -1123,7 +1127,7 @@ case 27:
;
break;}
case 28:
#line 355 "getdate.y"
#line 359 "getdate.y"
{
yyRelSeconds = -yyRelSeconds;
yyRelMinutes = -yyRelMinutes;
@ -1134,115 +1138,115 @@ case 28:
;
break;}
case 30:
#line 366 "getdate.y"
#line 370 "getdate.y"
{
yyRelYear += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 31:
#line 369 "getdate.y"
#line 373 "getdate.y"
{
yyRelYear += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 32:
#line 372 "getdate.y"
#line 376 "getdate.y"
{
yyRelYear += yyvsp[0].Number;
;
break;}
case 33:
#line 375 "getdate.y"
#line 379 "getdate.y"
{
yyRelMonth += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 34:
#line 378 "getdate.y"
#line 382 "getdate.y"
{
yyRelMonth += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 35:
#line 381 "getdate.y"
#line 385 "getdate.y"
{
yyRelMonth += yyvsp[0].Number;
;
break;}
case 36:
#line 384 "getdate.y"
#line 388 "getdate.y"
{
yyRelDay += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 37:
#line 387 "getdate.y"
#line 391 "getdate.y"
{
yyRelDay += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 38:
#line 390 "getdate.y"
#line 394 "getdate.y"
{
yyRelDay += yyvsp[0].Number;
;
break;}
case 39:
#line 393 "getdate.y"
#line 397 "getdate.y"
{
yyRelHour += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 40:
#line 396 "getdate.y"
#line 400 "getdate.y"
{
yyRelHour += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 41:
#line 399 "getdate.y"
#line 403 "getdate.y"
{
yyRelHour += yyvsp[0].Number;
;
break;}
case 42:
#line 402 "getdate.y"
#line 406 "getdate.y"
{
yyRelMinutes += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 43:
#line 405 "getdate.y"
#line 409 "getdate.y"
{
yyRelMinutes += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 44:
#line 408 "getdate.y"
#line 412 "getdate.y"
{
yyRelMinutes += yyvsp[0].Number;
;
break;}
case 45:
#line 411 "getdate.y"
#line 415 "getdate.y"
{
yyRelSeconds += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 46:
#line 414 "getdate.y"
#line 418 "getdate.y"
{
yyRelSeconds += yyvsp[-1].Number * yyvsp[0].Number;
;
break;}
case 47:
#line 417 "getdate.y"
#line 421 "getdate.y"
{
yyRelSeconds += yyvsp[0].Number;
;
break;}
case 48:
#line 423 "getdate.y"
#line 427 "getdate.y"
{
if (yyHaveTime && yyHaveDate && !yyHaveRel)
yyYear = yyvsp[0].Number;
@ -1275,13 +1279,13 @@ case 48:
;
break;}
case 49:
#line 456 "getdate.y"
#line 460 "getdate.y"
{
yyval.Meridian = MER24;
;
break;}
case 50:
#line 460 "getdate.y"
#line 464 "getdate.y"
{
yyval.Meridian = yyvsp[0].Meridian;
;
@ -1508,7 +1512,7 @@ yyerrhandle:
}
return 1;
}
#line 465 "getdate.y"
#line 469 "getdate.y"
/* Include this file down here because bison inserts code above which
@ -1518,7 +1522,6 @@ yyerrhandle:
extern struct tm *gmtime ();
extern struct tm *localtime ();
extern struct tm *localtime_r (time_t *, struct tm *);
extern time_t mktime ();
/* Month and day table. */


@ -8,6 +8,10 @@
** This code is in the public domain and has no copyright.
*/
#define _REENTRANT /* Necessary to use in Solaris, since the silly guys at Sun
made the localtime_r() prototype dependent on it (or
_POSIX_C_SOURCE or _POSIX_PTHREAD_SEMANTICS). */
#ifdef HAVE_CONFIG_H
# include "config.h"
# ifdef HAVE_ALLOCA_H
@ -471,7 +475,6 @@ o_merid : /* NULL */
extern struct tm *gmtime ();
extern struct tm *localtime ();
extern struct tm *localtime_r (time_t *, struct tm *);
extern time_t mktime ();
/* Month and day table. */


@ -38,6 +38,10 @@
* ------------------------------------------------------------
****************************************************************************/
#define _REENTRANT /* Necessary to use in Solaris, since the silly guys at Sun
made the localtime_r() prototype dependent on it (or
_POSIX_C_SOURCE or _POSIX_PTHREAD_SEMANTICS). */
/* -- WIN32 approved -- */
#include <stdio.h>
#include <string.h>
@ -62,6 +66,12 @@
#include <netinet/in.h>
#include <sys/time.h>
#ifdef HAVE_TIME_H
#ifdef TIME_WITH_SYS_TIME
#include <time.h>
#endif
#endif
#include <sys/resource.h>
#ifdef HAVE_UNISTD_H
#include <unistd.h>
@ -362,7 +372,6 @@ CURLcode http(struct connectdata *conn)
struct tm *thistime;
#ifdef HAVE_LOCALTIME_R
extern struct tm *localtime_r(time_t *, struct tm *);
/* thread-safe version */
struct tm keeptime;
thistime = localtime_r(&data->timevalue, &keeptime);


@ -80,10 +80,11 @@ puts (
" To store cookies, save the HTTP headers to a file using\n"
" -D/--dump-header!\n"
"\n"
" -B/--ftp-ascii\n"
" (FTP/LDAP) Use ASCII transfer when getting an FTP file\n"
" or LDAP info. For FTP, this can also be enforced by\n"
" using an URL that ends with \";type=A\".\n"
" -B/--use-ascii\n"
" Use ASCII transfer when getting an FTP file or LDAP\n"
" info. For FTP, this can also be enforced by using a\n"
" URL that ends with \";type=A\". This option causes data\n"
" sent to stdout to be in text mode for win32 systems.\n"
"\n"
" -c/--continue\n"
" Continue/Resume a previous file transfer. This\n"
@ -130,8 +131,12 @@ puts (
"\n"
" -e/--referer <URL>\n"
" (HTTP) Sends the \"Referer Page\" information to the HTTP\n"
" server. Some badly done CGIs fail if it's not set. This\n"
" can also be set with the -H/--header flag of course.\n"
" server. This can also be set with the -H/--header flag\n"
" of course. When used with -L/--location you can append\n"
" \";auto\" to the referer URL to make curl automatically\n"
" set the previous URL when it follows a Location:\n"
" header. The \";auto\" string can be used alone, even if\n"
" you don't set an initial referer.\n"
"\n"
" -E/--cert <certificate[:password]>\n"
" (HTTPS) Tells curl to use the specified certificate\n"
@ -156,15 +161,22 @@ puts (
" a user has pressed the submit button. This causes curl\n"
" to POST data using the content-type multipart/form-data\n"
" according to RFC1867. This enables uploading of binary\n"
" files etc. To force the 'content' part to be read from\n"
" a file, prefix the file name with an @ sign. Example,\n"
" to send your password file to the server, where 'pass­\n"
" word' is the name of the form-field to which\n"
" /etc/passwd will be the input:\n"
" files etc. To force the 'content' part to be a file,\n"
" prefix the file name with an @ sign. To just get the\n"
" content part from a file, prefix the file name with the\n"
" letter <. The difference between @ and < is then that @\n"
" makes a file get attached in the post as a file upload,\n"
" while the < makes a text field and just gets the\n"
" contents for that text field from a file.\n"
"\n"
" Example, to send your password file to the server,\n"
" where 'password' is the name of the form-field to\n"
" which /etc/passwd will be the input:\n"
"\n"
" curl -F password=@/etc/passwd www.mypasswords.com\n"
"\n"
" To read the file's content from stdin instead of a file,\n"
" use - where the file name should've been.\n"
" use - where the file name should've been. This goes for\n"
" both @ and < constructs.\n"
"\n"
" -h/--help\n"
" Usage help.\n"
@ -215,6 +227,8 @@ puts (
" header line Location:) this flag will let curl attempt\n"
" to reattempt the get on the new place. If used together\n"
" with -i or -I, headers from all requested pages will be\n"
);
puts(
" shown.\n"
"\n"
" -m/--max-time <seconds>\n"
@ -231,9 +245,7 @@ puts (
" Makes curl scan the .netrc file in the user's home\n"
" directory for login name and password. This is typi­\n"
" cally used for ftp on unix. If used with http, curl\n"
);
puts(
" will enable user authentication. See netrc(5) for\n"
" will enable user authentication. See netrc(4) for\n"
" details on the file format. Curl will not complain if\n"
" that file hasn't the right permissions (it should not\n"
" be world nor group readable). The environment variable\n"
@ -321,6 +333,7 @@ puts (
" ward\n"
"\n"
" 0-0,-1 specifies the first and last byte only(*)(H)\n"
"\n"
" 500-700,600-799\n"
" specifies 300 bytes from offset 500(H)\n"
"\n"
@ -373,6 +386,7 @@ puts (
" Specify user and password to use for Proxy authentica­\n"
" tion. If no password is specified, curl will ask for it\n"
" interactively.\n"
"\n"
" -v/--verbose\n"
" Makes the fetching more verbose/talkative. Mostly\n"
" usable for debugging. Lines starting with '>' means\n"
@ -411,7 +425,6 @@ puts (
"\n"
" http_code The numerical code that was found in the\n"
" last retrieved HTTP(S) page.\n"
"\n"
" time_total The total time, in seconds, that the\n"
" full operation lasted. The time will be\n"
" displayed with millisecond resolution.\n"
@ -424,6 +437,7 @@ puts (
" time_connect The time, in seconds, it took from the\n"
" start until the connect to the remote\n"
" host (or proxy) was completed.\n"
"\n"
" time_pretransfer\n"
" The time, in seconds, it took from the\n"
" start until the file transfer is just\n"
@ -455,6 +469,8 @@ puts (
" 1.1 specification for details and explanations.\n"
"\n"
" (FTP) Specifies a custom FTP command to use instead of\n"
);
puts(
" LIST when doing file lists with ftp.\n"
"\n"
" -y/--speed-time <time>\n"
@ -462,7 +478,6 @@ puts (
" ond during a speed-time period, the download gets\n"
" aborted. If speed-time is used, the default speed-limit\n"
" will be 1 unless set with -y.\n"
"\n"
" -Y/--speed-limit <speed>\n"
" If a download is slower than this given speed, in bytes\n"
" per second, for speed-time seconds it gets aborted.\n"
@ -470,14 +485,13 @@ puts (
"\n"
" -z/--time-cond <date expression>\n"
" (HTTP) Request to get a file that has been modified\n"
);
puts(
" later than the given time and date, or one that has\n"
" been modified before that time. The date expression can\n"
" be all sorts of date strings or if it doesn't match any\n"
" internal ones, it tries to get the time from a given\n"
" file name instead! See the GNU date(1) man page for\n"
" date expression details.\n"
"\n"
" Start the date expression with a dash (-) to make it\n"
" request for a document that is older than the given\n"
" date/time, default is a document that is newer than the\n"
@ -515,7 +529,6 @@ puts (
"\n"
" HTTPS_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for HTTPS.\n"
"\n"
" FTP_PROXY [protocol://]<host>[:port]\n"
" Sets proxy server to use for FTP.\n"
"\n"
@ -529,6 +542,7 @@ puts (
" NO_PROXY <comma-separated list of hosts>\n"
" list of host names that shouldn't go through any proxy.\n"
" If set to a asterisk '*' only, it matches all hosts.\n"
"\n"
" COLUMNS <integer>\n"
" The width of the terminal. This variable only affects\n"
" curl when the --progress-bar option is used.\n"
@ -566,7 +580,6 @@ puts (
"\n"
" 11 FTP weird PASS reply. Curl couldn't parse the reply\n"
" sent to the PASS request.\n"
"\n"
" 12 FTP weird USER reply. Curl couldn't parse the reply\n"
" sent to the USER request.\n"
"\n"
@ -581,6 +594,7 @@ puts (
"\n"
" 16 FTP can't reconnect. Couldn't connect to the host we\n"
" got in the 227-line.\n"
"\n"
" 17 FTP couldn't set binary. Couldn't change transfer\n"
" method to binary.\n"
"\n"
@ -618,7 +632,6 @@ puts (
" 30 FTP PORT failed. The PORT command failed.\n"
"\n"
" 31 FTP couldn't use REST. The REST command failed.\n"
"\n"
" 32 FTP couldn't use SIZE. The SIZE command failed. The\n"
" command is an extension to the original FTP spec RFC\n"
" 959.\n"
@ -632,6 +645,7 @@ puts (
"\n"
" 36 FTP bad download resume. Couldn't continue an earlier\n"
" aborted download.\n"
"\n"
" 37 FILE couldn't read file. Failed to open the file. Per­\n"
" missions?\n"
"\n"
@ -649,19 +663,19 @@ puts (
"\n"
"BUGS\n"
" If you do find any (or have other suggestions), mail Daniel\n"
" Stenberg <Daniel.Stenberg@haxx.nu>.\n"
" Stenberg <Daniel.Stenberg@haxx.se>.\n"
"\n"
"AUTHORS / CONTRIBUTORS\n"
" - Daniel Stenberg <Daniel.Stenberg@haxx.nu>\n"
" - Daniel Stenberg <Daniel.Stenberg@haxx.se>\n"
" - Rafael Sagula <sagula@inf.ufrgs.br>\n"
" - Sampo Kellomaki <sampo@iki.fi>\n"
" - Linas Vepstas <linas@linas.org>\n"
" - Bjorn Reese <breese@mail1.stofanet.dk>\n"
" - Johan Anderson <johan@homemail.com>\n"
" - Kjell Ericson <Kjell.Ericson@haxx,nu>\n"
" - Kjell Ericson <Kjell.Ericson@haxx.se>\n"
" - Troy Engel <tengel@sonic.net>\n"
" - Ryan Nelson <ryan@inch.com>\n"
" - Bjorn Stenberg <Bjorn.Stenberg@haxx.nu>\n"
" - Björn Stenberg <Bjorn.Stenberg@haxx.se>\n"
" - Angus Mackay <amackay@gus.ml.org>\n"
" - Eric Young <eay@cryptsoft.com>\n"
" - Simon Dick <simond@totally.irrelevant.org>\n"
@ -679,9 +693,9 @@ puts (
" - Ralph Beckmann <rabe@uni-paderborn.de>\n"
" - T. Yamada <tai@imasy.or.jp>\n"
" - Lars J. Aas <larsa@sim.no>\n"
" - Jörn Hartroth <Joern.Hartroth@telekom.de>\n"
" - Jörn Hartroth <Joern.Hartroth@computer.org>\n"
" - Matthew Clarke <clamat@van.maves.ca>\n"
" - Linus Nielsen <Linus.Nielsen@haxx.nu>\n"
" - Linus Nielsen <Linus.Nielsen@haxx.se>\n"
" - Felix von Leitner <felix@convergence.de>\n"
" - Dan Zitter <dzitter@zitter.net>\n"
" - Jongki Suwandi <Jongki.Suwandi@eng.sun.com>\n"
@ -695,9 +709,12 @@ puts (
" - Paul Marquis <pmarquis@iname.com>\n"
" - David LeBlanc <dleblanc@qnx.com>\n"
" - Rich Gray at Plus Technologies\n"
" - Luong Dinh Dung <u8luong@lhsystems.hu>\n"
" - Torsten Foertsch <torsten.foertsch@gmx.net>\n"
" - Kristian Köhntopp <kris@koehntopp.de>\n"
"\n"
"WWW\n"
" http://curl.haxx.nu\n"
" http://curl.haxx.se\n"
"\n"
"FTP\n"
" ftp://ftp.sunet.se/pub/www/utilities/curl/\n"
@ -710,7 +727,7 @@ puts (
" You always find news about what's going on as well as the latest versions\n"
" from the curl web pages, located at:\n"
"\n"
" http://curl.haxx.nu\n"
" http://curl.haxx.se\n"
"\n"
"SIMPLE USAGE\n"
"\n"
@ -769,6 +786,8 @@ puts (
" pick a file like:\n"
"\n"
" curl http://name:passwd@machine.domain/full/path/to/file\n"
);
puts(
"\n"
" or specify user and password separately like in\n"
"\n"
@ -798,8 +817,6 @@ puts (
" curl -u user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
" Some proxies require special authentication. Specify by using -U as above:\n"
);
puts(
"\n"
" curl -U user:passwd -x my-proxy:888 http://www.get.this/\n"
"\n"
@ -887,7 +904,7 @@ puts (
"\n"
" Store the HTTP headers in a separate file:\n"
"\n"
" curl --dump-header headers.txt curl.haxx.nu\n"
" curl --dump-header headers.txt curl.haxx.se\n"
"\n"
" Note that headers stored in a separate file can be very useful at a later\n"
" time if you want curl to use cookies sent by the server. More about that in\n"
@ -1049,6 +1066,8 @@ puts (
"\n"
" Note that by specifying -b you enable the \"cookie awareness\" and with -L\n"
" you can make curl follow a location: (which often is used in combination\n"
);
puts(
" with cookies). So that if a site sends cookies and a location, you can\n"
" use a non-existing file to trigger the cookie awareness like:\n"
"\n"
@ -1069,8 +1088,6 @@ puts (
"\n"
" From left-to-right:\n"
" % - percentage completed of the whole transfer\n"
);
puts(
" Total - total size of the whole expected transfer\n"
" % - percentage completed of the download\n"
" Received - currently downloaded amount of bytes\n"
@ -1245,6 +1262,26 @@ puts (
"\n"
" Otherwise, curl will first attempt to use v3 and then v2.\n"
"\n"
" To use OpenSSL to convert your favourite browser's certificate into a PEM\n"
" formatted one that curl can use, do something like this (assuming netscape,\n"
" but IE is likely to work similarly):\n"
"\n"
" Start by hitting the 'security' menu button in netscape.\n"
"\n"
" Select 'certificates->yours' and then pick a certificate in the list.\n"
"\n"
" Press the 'export' button.\n"
"\n"
" Enter your PIN code for the certs.\n"
"\n"
" Select a proper place to save it.\n"
"\n"
" Run the 'openssl' application to convert the certificate. If you cd to the\n"
" openssl installation, you can do it like:\n"
"\n"
" # ./apps/openssl pkcs12 -certfile [file you saved] -out [PEMfile]\n"
"\n"
"\n"
"RESUMING FILE TRANSFERS\n"
"\n"
" To continue a file transfer where it was previously aborted, curl supports\n"
@ -1302,6 +1339,8 @@ puts (
"\n"
" Aliases for 'm' are 'match' and 'find', and aliases for 'd' are 'define'\n"
" and 'lookup'. For example,\n"
);
puts(
"\n"
" curl dict://dict.org/find:curl\n"
"\n"
@ -1351,8 +1390,6 @@ puts (
"\n"
"\n"
" The usage of the -x/--proxy flag overrides the environment variables.\n"
);
puts(
"\n"
"NETRC\n"
"\n"
@ -1369,7 +1406,7 @@ puts (
"\n"
" A very simple .netrc file could look something like:\n"
"\n"
" machine curl.haxx.nu login iamdaniel password mysecret\n"
" machine curl.haxx.se login iamdaniel password mysecret\n"
"\n"
"CUSTOM OUTPUT\n"
"\n"


@ -1,3 +1,3 @@
#define CURL_NAME "curl"
#define CURL_VERSION "7.0.1beta"
#define CURL_VERSION "7.0.11test"
#define CURL_ID CURL_NAME " " CURL_VERSION " (" OS ") "