mirror of https://github.com/moparisthebest/wget synced 2024-07-03 16:38:41 -04:00

Removing obsolete and/or incomplete stuff from util/.

This commit is contained in:
Micah Cowan 2008-01-31 01:38:46 -08:00
parent 828af9619f
commit ec866caaf0
8 changed files with 9 additions and 551 deletions


@@ -1,3 +1,11 @@
2008-01-31 Micah Cowan <micah@cowan.name>
* util/README, util/dist-wget, util/download-netscape.html,
util/download.html, util/update_po_files.sh, util/wget.spec:
Removed (obsolete and/or incomplete).
* Makefile.am: Removed no-longer-existent util stuff from
EXTRA_DIST (but added the README).
2008-01-28 Micah Cowan <micah@cowan.name>
* po/en@quot.po, po/en@boldquot.po, po/en_US.po: Updated


@@ -29,5 +29,4 @@
# Version: @VERSION@
#
EXTRA_DIST = dist-wget download.html download-netscape.html \
rmold.pl update_po_files.sh wget.spec
EXTRA_DIST = README rmold.pl


@@ -3,23 +3,6 @@
This directory contains various optional utilities to help you use
Wget.
Socks:
======
Antonio Rosella <antonio.rosella@agip.it> has written a sample HTML
frontend and a Perl script to demonstrate usage of socksified Wget as
a web retriever.
To configure Wget to use socks, do a
$ ./configure --with-socks
download.html and download-netscape.html are examples of how you can
use socksified Wget to schedule the WWW requests. wget.cgi is a
CGI Perl script used in conjunction with download.html, which
schedules requests using the "at" command.
To get the script, contact Antonino.
rmold.pl
========
This Perl script is used to check which local files are no longer on


@@ -1,182 +0,0 @@
#!/bin/sh
# Copyright (C) 2001, 2007, 2008 Free Software Foundation, Inc.
# This program is free software; you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation; either version 3 of the License, or
# (at your option) any later version.
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
# Additional permission under GNU GPL version 3 section 7
# If you modify this program, or any covered work, by linking or
# combining it with the OpenSSL project's OpenSSL library (or a
# modified version of that library), containing parts covered by the
# terms of the OpenSSL or SSLeay licenses, the Free Software Foundation
# grants you additional permission to convey the resulting work.
# Corresponding Source for a non-source form of such a combination
# shall include the source code for the parts of OpenSSL used as well
# as that of the covered work.
##
#
# This script creates a Wget distribution (wget-VERSION.tar.gz).
# It uses `make dist' to do most of the work, but corrects some
# things that `make dist' doesn't and can't do. Specifically:
#
# * Checks out the clean source from the Subversion repository to a
# temporary directory.
# * Runs autoconf, configure and `make' in the doc and po subdirs to
# make sure that all the generated files, such as `configure',
# `wget.info', and translated PO files, end up in the distribution.
# * Optionally changes src/version.c and doc/version.texi to the
# version forced by `--force-version'.
# * Runs `make dist' to produce the archive.
# * Removes the checkout.
#
# For example, to produce a Wget beta based on the latest sources on
# the trunk, with version changed to "1.23-beta10", run `dist-wget
# --force-version 1.23-beta10'. You can choose which sources will be
# used by specifying `-b PATH' ("trunk" by default) in combination
# with one of `-D DATE' or `-r REVISION' (the latest revision by
# default).
#
# Use the MAKE environment variable to specify a different version of
# make, for example MAKE=gmake dist-wget ...
#
##
set -e
SVNURL=http://svn.dotsrc.org/repo/wget/
SUBDIR=wget.checkout.$$
DEBUG=no
EXPORT_PATH=trunk
EXPORT_REVISION=HEAD
VERSION=
MAKE=${MAKE-make}
if test x"$TMPDIR" = x
then
TMPDIR=/tmp
fi
DEST_DIR=`pwd`
while test x"$*" != x
do
case "$1" in
-d)
DEBUG=yes
;;
-b)
shift
EXPORT_PATH=$1
;;
-D)
shift
# Subversion uses the -r {DATE} syntax for specifying revisions
# based on dates.
EXPORT_REVISION={$1}
;;
-r)
shift
EXPORT_REVISION=$1
;;
--force-version)
shift
VERSION=$1
;;
*)
echo "Usage: $0 [-d] [-b BRANCH-PATH] [-r REVISION | -D DATE]" >&2
exit 1
esac
shift
done
# Resolve echo -n incompatibilities.
e_n=-n
e_c=
if test x"`(echo -n foo; echo bar)`" != xfoobar; then
e_n=
e_c='\c'
fi
# File for output/errors redirection.
O=$DEST_DIR/dist-output
cd $TMPDIR
echo "Building wget dist in $TMPDIR/$SUBDIR."
echo "Output from commands is in $O."
echo "-----------" >$O
# Checkout clean sources from the repository.
echo $e_n "Exporting $SVNURL$EXPORT_PATH/ (-r $EXPORT_REVISION) to $TMPDIR/$SUBDIR... $e_c"
svn export -r "$EXPORT_REVISION" "$SVNURL/$EXPORT_PATH/" $SUBDIR 1>>$O 2>&1
echo "done."
cd $SUBDIR
# Force the version if required.
if test x"$VERSION" != x
then
echo "Forcing version to $VERSION."
echo "char *version_string = \"$VERSION\";" > src/version.c
echo "@set VERSION $VERSION" > doc/version.texi
fi
# Create configure and friends.
if test ! -f configure; then
echo $e_n "Creating \`configure' and \`src/config.h'... $e_c"
./autogen.sh 1>>$O 2>&1
echo "done."
fi
# Remove `Makefile' if it already exists.
if test -f Makefile; then
echo $e_n "Cleaning old Makefiles with \`$MAKE distclean'... $e_c"
$MAKE distclean 1>>$O 2>&1
echo "done."
fi
# Create a new `Makefile'.
echo $e_n "Running configure... $e_c"
CFLAGS=-g ./configure 1>>$O 2>&1
echo "done."
# Now build the MO files.
echo $e_n "Building MO files out of PO files... $e_c"
cd po
$MAKE 1>>$O 2>&1
cd ..
echo "done."
# Now build the Info documentation and the man page.
echo $e_n "Building Info and man documentation... $e_c"
cd doc
$MAKE 1>>$O 2>&1
cd ..
echo "done."
# Create the distribution file.
echo $e_n "Creating distribution tarball... $e_c"
$MAKE dist 1>>$O 2>&1
archive=`echo wget-*.tar.gz`
mv "$archive" $DEST_DIR
echo "$archive"
cd ..
if test $DEBUG = no; then
rm -rf $SUBDIR 1>>$O 2>&1
fi
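The `--force-version` step in the script above simply rewrites two generated files. A standalone sketch of just that step, using a temporary directory in place of the svn checkout and the hypothetical version string from the script's own comments:

```shell
# Standalone sketch of dist-wget's version-forcing step (assumed layout:
# src/version.c and doc/version.texi under the checkout root).
set -e
VERSION=1.23-beta10            # hypothetical version, as in the header comment
tmp=$(mktemp -d)               # stand-in for the svn checkout directory
mkdir -p "$tmp/src" "$tmp/doc"
cd "$tmp"
# These two lines mirror the script's "Force the version" block:
echo "char *version_string = \"$VERSION\";" > src/version.c
echo "@set VERSION $VERSION" > doc/version.texi
cat src/version.c
cat doc/version.texi
```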


@@ -1,114 +0,0 @@
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 3.2//EN">
<html>
<head>
<title>Wget Gateway</title>
<link rev="made" href="mailto:Antonio.Rosella@agip.it">
</head>
<body>
<center>
<h1>Wget Gateway</h1>
</center>
<p>
Welcome to Wget Gateway, a simple page showing the usage of
socksified wget behind a firewall. In my configuration it is
very useful because:
<ul>
<li>Only a few users can get out through the firewall
<li>Many users need information that is only reachable on the Internet
<li>I cannot download big files during working hours, so I
have to schedule the requests for after normal work time
</ul>
<p>
With the combination of a socksified wget and a simple CGI script
that schedules the requests, I can achieve this. All you need
is:
<ul>
<li> A socksified copy of
<a href="ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/wget.tar.gz">
wget</a>
<li> Perl (available on all the GNU mirroring sites)
<li> cgi-lib.pl (available at
<a href="ftp://ftp.switch.ch/mirror/CPAN/ROADMAP.html">CPAN</a>)
<li> A customized copy of this html
<li> A customized copy of socks.cgi
</ul>
This is my h/s configuration:
<pre>
+----------+ +----------------------------------+ +---------------------+
| Firewall | | Host that can exit from firewall | | Intranet www server |
+----------+ | htceff | +---------------------+
+----------------------------------+ | Wget.html |
| socksified wget | +---------------------+
| cgi-lib.pl |
| perl |
| wget.cgi |
+----------------------------------+
</pre>
<p>
wget.cgi, wget and cgi-lib.pl are located in the usual
cgi-bin directory. The customization of wget.cgi and
wget.html has to reflect your installation, i.e.:
<ul>
<li> download-netscape.html requires wget.cgi
<li> wget.cgi requires Perl, cgi-lib.pl and wget
<li>
wget.cgi has to download the files to a directory writable
by the user submitting the request. At the moment I have an
anonymous ftp installed on <em>htceff</em>, and wget puts
downloaded files into the /pub/incoming directory (if you look at
wget.cgi, it sets the destdir to "/u/ftp/pub/incoming" if
the user leaves it blank).
</ul>
<p>
You can also add other parameters that you want to pass to wget,
but in this case you will also have to modify wget.cgi.
<hr>
<form method="get" action="http://localhost/cgi-bin/wget.cgi">
<center>
<table border=1>
<td>Recursive Download
<td><select name=Recursion>
<Option selected value=N>No</Option>
<Option value=Y>Yes</Option>
</select>
</table>
<hr>
<table border=1>
<td>Depth
<td><input type="radio" name=depth value=1 checked> 1
<td><input type="radio" name=depth value=2 > 2
<td><input type="radio" name=depth value=3 > 3
<td><input type="radio" name=depth value=4 > 4
<td><input type="radio" name=depth value=5 > 5
</table>
<hr>
<table>
<td>Url to download: <td><input name="url" size=50><TR>
<td>Destination directory: <td><input name="destdir" size=50><TR>
</table>
<hr>
Now you can
<font color=yellow><input type="submit" value="download"></font>
the requested URL or
<font color=yellow><input type="reset" value="reset"></font>
the form.
</form>
<hr>
Feedback is always useful! Please contact me at
<address>
<a href="mailto:Antonio.Rosella@agip.it">Antonio Rosella&lt;Antonio.Rosella@agip.it&gt;</a>.
</address>
You can send your suggestions or bug reports for Wget to
<address>
<a href="mailto:hniksic@arsdigita.com">Hrvoje Niksic &lt;hniksic@arsdigita.com&gt;</a>.
</address>
<!-- hhmts start -->
Last modified: Mon Oct 23 17:40:03 CEST 2000
<!-- hhmts end -->
</body>
</html>
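wget.cgi itself was never distributed (the README says to contact the author for it), so here is only a hypothetical sketch of the command such a CGI might build from the form fields above and hand to "at". All names and values (url, destdir, depth) are assumptions mirroring the page, not the real wget.cgi:

```shell
# Hypothetical sketch: compose the wget command wget.cgi might schedule.
url="http://www.gnu.org/"        # from the "url" form field (example value)
destdir="/u/ftp/pub/incoming"    # default destdir noted in the page text
depth=2                          # from the "depth" radio buttons
cmd="wget -r -l $depth -P $destdir $url"
echo "$cmd"
# In a real CGI this would be scheduled rather than printed, e.g.:
#   echo "$cmd" | at 18:00
```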


@@ -1,106 +0,0 @@
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML//EN">
<html>
<head>
<title>Wget Gateway</title>
<link rev="made" href="mailto:Antonio.Rosella@agip.it">
</head>
<body>
<h1>Wget Gateway</h1>
<p>
Welcome to Wget Gateway, a simple page showing the usage of
socksified wget behind a firewall. In my configuration it is
very useful because:
<ul>
<li>Only a few users can get out through the firewall
<li>Many users need information that is only reachable on the Internet
<li>I cannot download big files during working hours, so I
have to schedule the requests for after normal work time
</ul>
<p>
With the combination of a socksified wget and a simple CGI script
that schedules the requests, I can achieve this. All you need
is:
<ul>
<li> A socksified copy of
<a href="ftp://gnjilux.cc.fer.hr/pub/unix/util/wget/wget.tar.gz">
wget</a>
<li> Perl (available on all the GNU mirroring sites)
<li> cgi-lib.pl (available at
<a href="ftp://ftp.switch.ch/mirror/CPAN/ROADMAP.html">CPAN</a>)
<li> A customized copy of this html
<li> A customized copy of socks.cgi
</ul>
This is my h/s configuration:
<pre>
+----------+ +----------------------------------+ +---------------------+
| Firewall | | Host that can exit from firewall | | Intranet www server |
+----------+ | htceff | +---------------------+
+----------------------------------+ | Wget.html |
| socksified wget | +---------------------+
| cgi-lib.pl |
| perl |
| wget.cgi |
+----------------------------------+
</pre>
<p>
wget.cgi, wget and cgi-lib.pl are located in the usual
cgi-bin directory. The customization of wget.cgi and
wget.html has to reflect your installation, i.e.:
<ul>
<li> download.html requires wget.cgi
<li> wget.cgi requires Perl, cgi-lib.pl and wget
<li>
wget.cgi has to download the files to a directory writable
by the user submitting the request. At the moment I have an
anonymous ftp installed on <em>htceff</em>, and wget puts
downloaded files into the /pub/incoming directory (if you look at
wget.cgi, it sets the destdir to "/u/ftp/pub/incoming" if
the user leaves it blank).
</ul>
<p>
You can also add other parameters that you want to pass to wget,
but in this case you will also have to modify wget.cgi.
<hr>
<form method="get" action="http://localhost/cgi-bin/wget.cgi">
<h3>Downloading (optionally recursive)</h3>
<ul>
<li>
Recursion:
<Select name=Recursion>
<Option selected value=N>No</Option>
<Option value=Y>Yes</Option>
</Select>
<li>
Depth:
<input type="radio" name=depth value=1 checked>1
<input type="radio" name=depth value=2 >2
<input type="radio" name=depth value=3 >3
<input type="radio" name=depth value=4 >4
<input type="radio" name=depth value=5 >5
<li>
Url to download: <input name="url" size=50>
<li>
Destination directory: <input name="destdir" size=50>
</ul>
Now you can <input type="submit" value="download"> the
requested URL or <input type="reset" value="reset"> the form.
</form>
<hr>
Feedback is always useful! Please contact me at
<address>
<a href="mailto:Antonio.Rosella@agip.it">Antonio Rosella&lt;Antonio.Rosella@agip.it&gt;</a>.
</address>
You can send your suggestions or bug reports for Wget to
<address>
<a href="mailto:hniksic@arsdigita.com">Hrvoje Niksic &lt;hniksic@arsdigita.com&gt;</a>.
</address>
<!-- hhmts start -->
Last modified: October 23, 2000
<!-- hhmts end -->
</body>
</html>


@@ -1,13 +0,0 @@
#!/bin/bash
for i in *.po
do
mv $i $i.old
wget http://www.iro.umontreal.ca/translation/maint/wget/$i
if test -f $i
then
rm -f $i.old
fi
done
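One weakness of the loop above: when the download fails, the original file is left renamed as `$i.old` instead of being restored. A sketch of a variant that quotes the filenames and puts the original back on failure; the `fetch` function here is a stand-in for the real wget call so the sketch is self-contained:

```shell
# Variant of the update loop that restores the original .po on failure.
# "fetch" simulates a failed download; a real script would call wget here.
fetch() { false; }
cd "$(mktemp -d)"
echo 'msgid ""' > de.po           # a dummy PO file to exercise the loop
for i in *.po
do
    mv "$i" "$i.old"
    if fetch "$i" && test -f "$i"
    then
        rm -f "$i.old"            # success: drop the backup
    else
        mv "$i.old" "$i"          # failure: restore the original
    fi
done
```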


@@ -1,117 +0,0 @@
Name: wget
Version: 1.7
Release: 1
Copyright: GPL
Source: ftp://ftp.gnu.org/gnu/wget/wget-%{version}.tar.gz
Url: http://www.gnu.org/software/wget/
Provides: webclient
Prereq: /sbin/install-info
BuildRoot: /var/tmp/%{name}-root
Group: Applications/Internet
Group(cs): Aplikace/Internet
Summary: A utility for retrieving files using the HTTP or FTP protocols.
Summary(cs): Nástroj pro stahování souborů pomocí protokolů HTTP nebo FTP.
%description
GNU Wget is a free network utility to retrieve files from the World
Wide Web using HTTP and FTP protocols. It works non-interactively,
thus enabling work in the background, after having logged off.
Wget supports recursive retrieval of HTML pages, as well as FTP sites.
Wget supports proxy servers, which can lighten the network load, speed
up retrieval and provide access behind firewalls.
It works exceedingly well also on slow or unstable connections,
retrying until the document is fully retrieved. Re-getting
files from where it left off works on servers (both HTTP and FTP) that
support it. Matching of wildcards and recursive mirroring of
directories are available when retrieving via FTP. Both HTTP and FTP
retrievals can be time-stamped, thus Wget can see if the remote file
has changed since last retrieval and automatically retrieve the new
version if it has.
Install wget if you need to retrieve large numbers of files with HTTP or
FTP, or if you need a utility for mirroring web sites or FTP directories.
%description -l cs
%prep
%setup -q
%build
%configure --sysconfdir=/etc
make
%install
rm -rf $RPM_BUILD_ROOT
%makeinstall
gzip $RPM_BUILD_ROOT%{_infodir}/*
%post
/sbin/install-info %{_infodir}/wget.info.gz %{_infodir}/dir
%preun
if [ "$1" = 0 ]; then
/sbin/install-info --delete %{_infodir}/wget.info.gz %{_infodir}/dir
fi
%clean
rm -rf $RPM_BUILD_ROOT
%files
%defattr(-,root,root)
%doc AUTHORS MAILING-LIST NEWS README INSTALL doc/ChangeLog doc/sample.wgetrc
%config /etc/wgetrc
%{_bindir}/wget
%{_infodir}/*
/usr/share/locale/*/LC_MESSAGES/*
%changelog
* Wed Jan 3 2001 Jan Prikryl <prikryl@cg.tuwien.ac.at>
- preliminary version for 1.7
- removed all RedHat patches from 1.5.3 for this moment
* Tue Aug 1 2000 Bill Nottingham <notting@redhat.com>
- setlocale for LC_CTYPE too, or else all the translations think their
characters are unprintable.
* Thu Jul 13 2000 Prospector <bugzilla@redhat.com>
- automatic rebuild
* Sun Jun 11 2000 Bill Nottingham <notting@redhat.com>
- build in new environment
* Mon Jun 5 2000 Bernhard Rosenkraenzer <bero@redhat.com>
- FHS compliance
* Thu Feb 3 2000 Bill Nottingham <notting@redhat.com>
- handle compressed man pages
* Thu Aug 26 1999 Jeff Johnson <jbj@redhat.com>
- don't permit chmod 777 on symlinks (#4725).
* Sun Mar 21 1999 Cristian Gafton <gafton@redhat.com>
- auto rebuild in the new build environment (release 4)
* Fri Dec 18 1998 Bill Nottingham <notting@redhat.com>
- build for 6.0 tree
- add Provides
* Sat Oct 10 1998 Cristian Gafton <gafton@redhat.com>
- strip binaries
- version 1.5.3
* Sat Jun 27 1998 Jeff Johnson <jbj@redhat.com>
- updated to 1.5.2
* Thu Apr 30 1998 Cristian Gafton <gafton@redhat.com>
- modified group to Applications/Networking
* Wed Apr 22 1998 Cristian Gafton <gafton@redhat.com>
- upgraded to 1.5.0
- they removed the man page from the distribution (Duh!) and I added it back
from 1.4.5. Hey, removing the man page is DUMB!
* Fri Nov 14 1997 Cristian Gafton <gafton@redhat.com>
- first build against glibc
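In the spec's %preun scriptlet above, RPM passes the number of package instances that will remain after the operation as `$1`: 0 means final removal, 1 or more means an upgrade. A minimal plain-shell sketch of that guard, outside RPM, with the install-info call replaced by an echo for illustration:

```shell
# Sketch of the %preun guard: RPM passes the count of instances that will
# remain as $1 -- 0 on final removal, >=1 on upgrade.
preun() {
    if [ "$1" = 0 ]; then
        echo "would run: install-info --delete wget.info.gz"
    else
        echo "upgrade: keeping info entry"
    fi
}
preun 0   # final removal: the info dir entry would be deleted
preun 1   # upgrade: the entry is kept for the new version
```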