This module parses /robots.txt files, which are used to forbid conforming robots from accessing parts of a web site. The parsed files are kept in a WWW::RobotRules object, and this object provides methods to check whether access to a given URL is prohibited.

WWW: http://search.cpan.org/dist/WWW-RobotRules/
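A minimal sketch of typical usage, following the module's documented interface; the agent name, host, and URLs below are placeholders:

    use WWW::RobotRules;
    use LWP::Simple qw(get);

    # Create a rules object identified by our robot's user-agent name
    my $rules = WWW::RobotRules->new('MyBot/1.0');

    # Fetch a site's /robots.txt and feed it to the rules object
    my $robots_url = 'http://example.com/robots.txt';
    my $robots_txt = get($robots_url);
    $rules->parse($robots_url, $robots_txt) if defined $robots_txt;

    # Check whether a given URL may be visited
    my $url = 'http://example.com/private/page.html';
    print "fetching $url\n" if $rules->allowed($url);

The same WWW::RobotRules object can hold parsed /robots.txt files from any number of servers; allowed() dispatches on the host part of the URL it is given.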