Merge pull request #31 from davehome/master

New packages: perl-WWW-RobotRules-6.01 and perl-LWP-6.02.
Juan RP 2011-07-14 11:16:20 -07:00
commit 43b83532bf
2 changed files with 62 additions and 0 deletions

srcpkgs/perl-LWP/template (new file)
@@ -0,0 +1,36 @@
# Template build file for 'perl-LWP'.
pkgname=perl-LWP
version=6.02
wrksrc="libwww-perl-$version"
distfiles="${CPAN_SITE}/LWP/libwww-perl-$version.tar.gz"
build_style=perl_module
short_desc="LWP - The World-Wide Web library for Perl (libwww-perl)"
maintainer="davehome <davehome@redthumb.info.tm>"
homepage="http://search.cpan.org/~gaas/libwww-perl-6.02/lib/LWP.pm"
license="GPL-2"
checksum=b5193e9e2eb2fa6ff8b7d4d22ec4e9010706f65b6042e86cc537d7f2f362c232
long_desc="
The libwww-perl collection is a set of Perl modules which provides a simple
and consistent application programming interface (API) to the World-Wide Web.
The main focus of the library is to provide classes and functions that allow
you to write WWW clients. The library also contains modules that are of more
general use, and even classes that help you implement simple HTTP servers.
Most modules in this library provide an object-oriented API. The user agent,
requests sent, and responses received from the WWW server are all represented
by objects. This makes for a simple and powerful interface to these services.
The interface is easy to extend and customize for your own needs."
noarch=yes
Add_dependency full perl-URI
Add_dependency full perl-LWP-MediaTypes
Add_dependency full perl-Encode-Locale
Add_dependency full perl-HTTP-Message
Add_dependency full perl-File-Listing
Add_dependency full perl-HTTP-Negotiate
Add_dependency full perl-HTTP-Daemon
Add_dependency full perl-Net-HTTP
Add_dependency full perl-HTTP-Cookies
Add_dependency full perl-WWW-RobotRules
Add_dependency full perl
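
The object-oriented API described in the long_desc above can be illustrated
with a short client sketch (an assumption for illustration only, not part of
this commit; the host and timeout values are placeholders):

    use strict;
    use warnings;
    use LWP::UserAgent;

    # The user agent is an object; requests and responses are objects too.
    my $ua = LWP::UserAgent->new(timeout => 10);

    # Send a GET request; $response is an HTTP::Response object.
    my $response = $ua->get('http://example.com/');
    if ($response->is_success) {
        print $response->decoded_content;
    } else {
        die $response->status_line, "\n";
    }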

srcpkgs/perl-WWW-RobotRules/template (new file)
@@ -0,0 +1,26 @@
# Template build file for 'perl-WWW-RobotRules'.
pkgname=perl-WWW-RobotRules
version=6.01
wrksrc="WWW-RobotRules-$version"
distfiles="${CPAN_SITE}/WWW/WWW-RobotRules-$version.tar.gz"
build_style=perl_module
short_desc="WWW::RobotRules - database of robots.txt-derived permissions"
maintainer="davehome <davehome@redthumb.info.tm>"
homepage="http://search.cpan.org/~gaas/WWW-RobotRules-6.01/lib/WWW/RobotRules.pm"
license="GPL-2"
checksum=f817e3e982c9d869c7796bcb5737c3422c2272355424acd162d0f3b132bec9d3
long_desc="
This module parses /robots.txt files as specified in 'A Standard for Robot
Exclusion' (http://www.robotstxt.org/wc/norobots.html).
Webmasters can use the /robots.txt file to forbid conforming robots from
accessing parts of their web site.
The parsed files are kept in a WWW::RobotRules object, and this object
provides methods to check if access to a given URL is prohibited. The same
WWW::RobotRules object can be used for one or more parsed /robots.txt files
on any number of hosts."
noarch=yes
Add_dependency full perl-URI
Add_dependency full perl
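
The permission check described in the long_desc above looks roughly like this
(a sketch for illustration only, not part of this commit; the agent name and
URLs are placeholders):

    use strict;
    use warnings;
    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # The rules object is keyed to the robot's user-agent name.
    my $rules = WWW::RobotRules->new('MyRobot/1.0');

    # Fetch and parse a site's /robots.txt; the same object can hold
    # parsed rules for any number of hosts.
    my $url = 'http://example.com/robots.txt';
    my $robots_txt = get($url);
    $rules->parse($url, $robots_txt) if defined $robots_txt;

    # Ask before fetching any URL on that host.
    if ($rules->allowed('http://example.com/some/page.html')) {
        print "fetch allowed\n";
    }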