WWW::CheckSite - OO interface to an iterator that checks a website
    use WWW::CheckSite;

    my $wcs = WWW::CheckSite->new(
        uri    => 'http://www.test-smoke.org/',
        prefix => 'tsorg',
        save   => 1,
    );
    $wcs->validate;
    $wcs->write_report;
Or, using saved data (skipping the actual validation):
    my $wcs = WWW::CheckSite->load(
        uri    => 'http://www.test-smoke.org/',
        prefix => 'tsorg',
    );
    $wcs->write_report;
This module implements a spider that checks the pages of a website. For each page, the links and images on that page are checked for availability. After that, the page is validated by the W3.ORG validation service.
When the spider is done, a report can be written in HTML.
WARNING: Although the spider respects /robots.txt on the target site, the validator does not! Use this tool only on your own sites.
Initialize a new instance. Supported options:
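As a sketch of a typical new()/validate()/write_report() cycle, using only the options shown in the synopsis (uri, prefix, save) — the target URL and prefix here are placeholders for your own site:

    use strict;
    use warnings;
    use WWW::CheckSite;

    # Construct the checker; 'prefix' names the dataset on disk and
    # 'save' keeps the collected data for a later load().
    my $wcs = WWW::CheckSite->new(
        uri    => 'http://www.example.org/',   # placeholder: your own site
        prefix => 'example',                   # placeholder dataset name
        save   => 1,
    );

    # Spider the site, check links/images, and validate each page.
    $wcs->validate;

    # Write the HTML report.
    $wcs->write_report;

With save enabled, a later run can call WWW::CheckSite->load( uri => ..., prefix => 'example' ) and regenerate the report without re-spidering the site.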
Initialize the object from the datafile. Supported options:
The validate() method collects all the data.
Return a list of all URLs encountered during site traversal.
Generate the reports.
Load, fill the HTML::Template template and write the reports.
Load, fill the Template Toolkit template and write the reports.
Raise an error via Carp::croak().
Load and fill the HTML::Template object.
Please report any bugs or feature requests to
email@example.com, or through the web interface at http://rt.cpan.org. I will be notified, and then you'll automatically be notified of progress on your bug as I make changes.
Copyright MMV Abe Timmerman, All Rights Reserved.
This program is free software; you can redistribute it and/or modify it under the same terms as Perl itself.
This program is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.