This is an archived snapshot of W3C's public bugzilla bug tracker, decommissioned in April 2019.

Bug 5080 - forbidden caused by themself
Summary: forbidden caused by themself
Status: NEW
Alias: None
Product: LinkChecker
Classification: Unclassified
Component: checklink
Version: 4.3
Hardware: All
OS: All
Importance: P2 normal
Target Milestone: ---
Assignee: This bug has no owner yet - up for the taking
QA Contact: qa-dev tracking
URL: http://validator.w3.org/checklink?url...
Reported: 2007-09-26 14:15 UTC by rimy
Modified: 2013-11-03 07:34 UTC
4 users

Description rimy 2007-09-26 14:15:32 UTC
I just visited the FAQ for developers on apple.com,
where there are some links to check HTML/CSS and also the link checker.
I clicked one, but the results said:

>http://validator.w3.org/check/referer
>    What to do: The link is forbidden! This needs fixing. Usual suspects: a missing index.html or Overview.html, or a missing ACL.
>    Response status code: 403
>    Response message: Checking non-public IP address disallowed by link checker configuration
>    Line: 444
>http://validator.w3.org/
>    What to do: The link is forbidden! This needs fixing. Usual suspects: a missing index.html or Overview.html, or a missing ACL.
>    Response status code: 403
>    Response message: Checking non-public IP address disallowed by link checker configuration
>    Line: 441

Suddenly I remembered that yesterday I added two links to check HTML & CSS.
Just now, the checker reports two broken links on my gs.rimy.org:

>http://validator.w3.org/check?uri=referer
>    What to do: The link is forbidden! This needs fixing. Usual suspects: a missing index.html or Overview.html, or a missing ACL.
>    Response status code: 403
>    Response message: Checking non-public IP address disallowed by link checker configuration
>    Line: 36
>http://jigsaw.w3.org/css-validator/check/referer
>    What to do: The link was not checked due to robots exclusion rules. Check the link manually.
>    Response status code: (N/A)
>    Response message: Forbidden by robots.txt
>    Line: 39 

That's not a mistake on my part.
Comment 1 Ville Skyttä 2007-09-26 16:17:41 UTC
Hmm, I suppose the validator.w3.org instance you're hitting sits in a private subnet somewhere, and resolves its own hostname to a private IP address.  This is not a bug in checklink, although its (or the host's name resolver) configuration might not be an optimal one.

Olivier, thoughts?

The "forbidden by robots.txt" issue is a separate one, and tracked in bug 2346.
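The diagnosis in this comment, that the instance resolves its own hostname to a private address and the checker then refuses the URL, can be sketched roughly as follows. This is not checklink's actual Perl implementation; the function name and logic below are purely illustrative Python:

```python
import ipaddress
import socket

def resolves_to_non_public_ip(hostname):
    """Return True if hostname resolves to any non-public address
    (private, loopback, or link-local) -- the condition under which a
    checklink-style checker would report 'Checking non-public IP
    address disallowed by link checker configuration'."""
    for family, _, _, _, sockaddr in socket.getaddrinfo(hostname, None):
        # sockaddr[0] is the IP string; drop any IPv6 scope suffix ("%eth0")
        ip = ipaddress.ip_address(sockaddr[0].split("%")[0])
        if not ip.is_global:
            return True
    return False
```

On a host whose resolver maps `validator.w3.org` to an address inside a private subnet (as Ville suspects here), a check like this would trigger even though the public site is reachable, which is why the configuration rather than checklink itself is at fault.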
Comment 2 Alex Stuart 2008-02-14 16:10:49 UTC
This might not be a bug in checklink, but the messages relating to the links are misleading.

Line: is OK

Response message: provides useful information

Response status: is it actually the server that responds with a 403 error?

What to do: It may be better to explain that the link resolves to a private IP, which may be a real issue (e.g. a link to 10.0.10.06) or may be an issue with the instance's name resolver/configuration.
Comment 3 Ville Skyttä 2008-02-14 23:46:56 UTC
(In reply to comment #2)

> Response status: is it actually the server that responds with a 403 error?

Good point, it's not.

> What to do: It may be better to explain that the link resolves to a private IP
> which may be real issue (e.g a link to 10.0.10.06) or which may be an issue
> with the instance's name resolver/configuration.

Done in CVS, thanks for the suggestion.