Difference between revisions of "User:Linkcheckbot"
From PUBLIC DOMAIN PROJECT MEDIAPOOL
Latest revision as of 22:42, 24 August 2018
weblinkchecker.py is a script from the Pywikibot framework that finds broken external links.
It will only check HTTP and HTTPS links, and it will leave out URLs inside comments and nowiki tags.
The bot will not remove external links by itself; it will only report them, since removal would require strong artificial intelligence. It reports a dead link only if it has been found unresponsive at least twice, with a default waiting period of at least one week between the first and the last check. This should help prevent users from removing links due to temporary server failure. Please keep in mind that the bot cannot differentiate between local failures and server failures, so make sure you are on a stable Internet connection.
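The two behaviours described above — skipping URLs inside comments and nowiki tags, and reporting a link only after two failed checks at least a week apart — can be sketched in Python. This is a simplified illustration, not the actual weblinkchecker.py implementation; the helper names (`extract_links`, `DeadLinkTracker`) and the regular expressions are assumptions for the sake of the example.

```python
import re

# Patterns for stripping HTML comments and <nowiki> sections, and for
# finding HTTP/HTTPS URLs. These are illustrative, not the script's own.
COMMENT_RE = re.compile(r"<!--.*?-->", re.DOTALL)
NOWIKI_RE = re.compile(r"<nowiki>.*?</nowiki>", re.DOTALL | re.IGNORECASE)
URL_RE = re.compile(r"https?://[^\s<>\[\]\"']+")

def extract_links(wikitext):
    """Collect HTTP/HTTPS URLs, skipping comments and nowiki sections."""
    cleaned = COMMENT_RE.sub("", wikitext)
    cleaned = NOWIKI_RE.sub("", cleaned)
    return URL_RE.findall(cleaned)

ONE_WEEK = 7 * 24 * 3600  # default waiting period, in seconds

class DeadLinkTracker:
    """Report a URL only after two failures at least one week apart."""

    def __init__(self):
        self.first_failure = {}  # url -> timestamp of first failed check

    def record_failure(self, url, now):
        """Record a failed check; return True if the link should be reported."""
        first = self.first_failure.setdefault(url, now)
        return (now - first) >= ONE_WEEK

    def record_success(self, url):
        """A successful check clears any pending failure."""
        self.first_failure.pop(url, None)
```

For example, a URL that fails today and again next week would be reported, while one that fails once and then responds again would not.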