Get all URLs from a page, including images, then check whether each one is working.
npm install urls-checker
const { urlsChecker } = require('urls-checker');

// The second argument (the main domain) is optional.
// The protocol (http/https) must be included in the URL.
urlsChecker('https://example.com', 'main-domain')
  .then(res => {
    console.log(res);
  })
  .catch(err => console.log(err));
The result is an object of `ok`, `fail` and `error` URLs:

{
  ok: ['list-of-working-urls'],        // status code: 200
  fail: [['url', 'status-code'], [...]], // status code is not 200
  error: [['url', 'message'], [...]],  // could be a certificate / authentication error
}
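For example, a result of this shape can be turned into a short report. This is a hedged sketch: the `summarize` helper and the sample `res` object below are hypothetical illustrations of the documented shape, not part of the library or real output.

```javascript
// Summarize a result object of the shape documented above.
function summarize(res) {
  const lines = [];
  lines.push(`${res.ok.length} URL(s) OK`);
  for (const [url, status] of res.fail) {
    lines.push(`FAIL ${status}: ${url}`);
  }
  for (const [url, message] of res.error) {
    lines.push(`ERROR: ${url} (${message})`);
  }
  return lines.join('\n');
}

// Hypothetical example data matching the documented shape.
const res = {
  ok: ['https://example.com/'],
  fail: [['https://example.com/missing', 404]],
  error: [['https://expired.example', 'certificate has expired']],
};

console.log(summarize(res));
```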
This method may not find every image, because some images are loaded asynchronously.
const { imagesChecker } = require('urls-checker');

// The second argument (the main domain) is optional.
// The protocol (http/https) must be included in the URL.
imagesChecker('https://example.com', 'main-domain')
  .then(res => {
    console.log(res);
  })
  .catch(err => console.log(err));
The result has the same shape as the `urlsChecker` result: an object of `ok`, `fail` and `error` URLs.
You could also collect all pages from sitemap.xml and then loop through all of their links. Having a sitemap.xml on your website is good for SEO anyway.
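A hedged sketch of that idea: extract the `<loc>` entries from a sitemap and feed each page URL to `urlsChecker` in turn. The `sitemapUrls` helper and the inline sitemap string are hypothetical; a real script would first fetch `sitemap.xml` from your site.

```javascript
// Extract page URLs from a sitemap.xml document.
function sitemapUrls(xml) {
  const urls = [];
  const re = /<loc>\s*([^<]+?)\s*<\/loc>/g;
  let m;
  while ((m = re.exec(xml)) !== null) {
    urls.push(m[1]);
  }
  return urls;
}

// Hypothetical sitemap content; in practice, fetch it from your site.
const sitemap = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
</urlset>`;

const pages = sitemapUrls(sitemap);
console.log(pages);
// Each entry in `pages` could then be passed to urlsChecker(page).
```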
This project is based on urls-checker. Feel free to report bugs and request features in the issue tracker, or fork the project and create pull requests!