<?xml version="1.0" encoding="UTF-8" standalone="yes" ?>
<!DOCTYPE bugzilla SYSTEM "https://www.w3.org/Bugs/Public/page.cgi?id=bugzilla.dtd">

<bugzilla version="5.0.4"
          urlbase="https://www.w3.org/Bugs/Public/"
          
          maintainer="sysbot+bugzilla@w3.org"
>

    <bug>
          <bug_id>2347</bug_id>
          
          <creation_ts>2005-10-17 13:34:21 +0000</creation_ts>
          <short_desc>Allow the W3C Link validator to check links to the CSS validator</short_desc>
          <delta_ts>2007-07-19 09:24:14 +0000</delta_ts>
          <reporter_accessible>1</reporter_accessible>
          <cclist_accessible>1</cclist_accessible>
          <classification_id>1</classification_id>
          <classification>Unclassified</classification>
          <product>CSSValidator</product>
          <component>Other</component>
          <version>CSS Validator</version>
          <rep_platform>PC</rep_platform>
          <op_sys>All</op_sys>
          <bug_status>RESOLVED</bug_status>
          <resolution>WONTFIX</resolution>
          
          
          <bug_file_loc>http://jigsaw.w3.org/robots.txt</bug_file_loc>
          <status_whiteboard></status_whiteboard>
          <keywords>Usability</keywords>
          <priority>P2</priority>
          <bug_severity>normal</bug_severity>
          <target_milestone>---</target_milestone>
          
          
          <everconfirmed>1</everconfirmed>
          <reporter name="Otto Stolz">Otto.Stolz</reporter>
          <assigned_to name="Olivier Thereaux">ot</assigned_to>
          <cc>Otto.Stolz</cc>
          
          <qa_contact name="qa-dev tracking">www-validator-cvs</qa_contact>

          <comment_sort_order>oldest_to_newest</comment_sort_order>
          <long_desc isprivate="0" >
    <commentid>6710</commentid>
    <comment_count>0</comment_count>
    <who name="Otto Stolz">Otto.Stolz</who>
    <bug_when>2005-10-17 13:34:22 +0000</bug_when>
    <thetext>The W3C link validator balks on any link to
http://jigsaw.w3.org/css-validator/; quote:
  http://jigsaw.w3.org/css-validator/validator?uri=...&amp;profile=css2
    What to do: The link was not checked due to robots exclusion rules.
    Check the link manually.
    Response status code: (N/A)
    Response message: Forbidden by robots.txt
    Line: 419

Note that the CSS Validator even recommends including, in the
pages to be checked, a link to itself; yet, the link checker from
the same organisation does not check those recommended links.

Please include in http://jigsaw.w3.org/robots.txt the following code:
  User-Agent: W3C-checklink
  Disallow:
or, if you want to be more restrictive:
  User-Agent: W3C-checklink
  Disallow: /guest-demos/
  Disallow: /status/
  Disallow: /demos/
  Disallow: /HyperNews/
  Disallow: /cgi-bin/
  Disallow: /Friends/
  Disallow: /api/
  Disallow: /Benoit/Public/DVDDB/
  # Don&apos;t exclude validator and docs</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>10422</commentid>
    <comment_count>1</comment_count>
    <who name="Otto Stolz">Otto.Stolz</who>
    <bug_when>2006-07-07 09:56:43 +0000</bug_when>
    <thetext>After more than 8 months, this simple, yet important, entry
in http://validator.w3.org/robots.txt is still missing!

(Though, meanwhile, the error message points to
&lt;http://validator.w3.org/docs/checklink.html#bot&gt;,
where w3.org has documented what you should have done
before.)</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>10566</commentid>
    <comment_count>2</comment_count>
    <who name="Otto Stolz">Otto.Stolz</who>
    <bug_when>2006-07-18 16:41:05 +0000</bug_when>
    <thetext>Sorry, typo. Of course, I had meant to write:

After more than 8 months, this simple, yet important, entry
in http://jigsaw.w3.org/robots.txt is still missing!</thetext>
  </long_desc><long_desc isprivate="0" >
    <commentid>15878</commentid>
    <comment_count>3</comment_count>
    <who name="Olivier Thereaux">ot</who>
    <bug_when>2007-07-19 09:24:14 +0000</bug_when>
    <thetext>While indeed we give people the choice and tools to give access to checklink, does not mean we want all our services, especially the very heavily-loaded ones like the CSS validator, crawled, by this robot or others.

The W3C link checker will keep mentioning that the link hasn&apos;t been checked, but that is not an error.</thetext>
  </long_desc>

    </bug>

</bugzilla>