The validator's own robots.txt rules cause errors for the link checker itself.
Here's the file: https://validator.w3.org/robots.txt
For example, a page containing a link such as https://validator.w3.org/checklink?uri=xyz.com
produces this error: Status: (N/A) Forbidden by robots.txt. The link was not checked due to robots exclusion rules. Check the link manually.
To work around this, the link checker points to its documentation: https://validator.w3.org/checklink/docs/checklink.html#bot
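
For illustration, here is a minimal sketch of how such a robots exclusion check plays out, using Python's standard urllib.robotparser. The user-agent string "W3C-checklink" and the assumption that the live robots.txt disallows the /checklink path are illustrative, not taken from the checker's actual source.

```python
import urllib.robotparser

# Fetch the site's robots.txt and test whether a target URL may be crawled,
# roughly mirroring the exclusion check a robots-respecting link checker performs.
rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://validator.w3.org/robots.txt")
rp.read()

url = "https://validator.w3.org/checklink?uri=xyz.com"

# "W3C-checklink" is an assumed user-agent string for this example.
if not rp.can_fetch("W3C-checklink", url):
    print("Status: (N/A) Forbidden by robots.txt")
else:
    print("URL may be fetched")
```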