This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019. Please see the home page for more details.
This one could be pretty trivial to implement just by changing W3C::UserAgent to inherit from LWP::RobotUA instead of LWP::UserAgent. What's needed:
- Check whether the superclass change has any side effects.
- Provide an option to behave badly, i.e. not respect robots.txt.
- Make sure the user understands when this causes 403s.
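The robots.txt check that LWP::RobotUA performs before each request can be illustrated with Python's stdlib urllib.robotparser; this is a minimal sketch, and the robots.txt body and user-agent name below are made-up examples, not the validator's actual configuration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt body; LWP::RobotUA fetches this file
# automatically before the first request to each host.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A well-behaved agent checks before fetching; a request the rules
# disallow is the case that surfaces to the user as a 403.
print(rp.can_fetch("ExampleChecker", "http://example.org/private/page.html"))  # False
print(rp.can_fetch("ExampleChecker", "http://example.org/index.html"))         # True
```

The "behave badly" option would simply skip this check and issue the request unconditionally.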
See also the thread at <http://lists.w3.org/Archives/Public/www-validator/2002Dec/0211.html> for additional discussion and resources.
This is now implemented in CVS, and will be in the next version (3.9.3).