When you create a new application in mabl, mabl creates a plan called "Check all pages for broken links." This plan runs a test called "Visit all linked pages within the app", also known as the link crawler, once per week free of charge.
The link crawler test informs you of any potentially broken links in your application. The output of link crawler tests also serves as the basis for calculating test coverage for your application. See the article on page coverage for more details.
How it works
The link crawler test starts by loading the application URL, taking a DOM snapshot, and collecting the hyperlink from each anchor element on the page.
The link crawler validates each link with a GET request and checks for a non-error HTTP status. A link is considered broken if it belongs to one of the following categories:
- 4xx or 5xx status response
- Too many redirects
- A malformed URL identified as a link
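The three broken-link categories above can be sketched as a simple classifier. This is an illustrative example, not mabl's implementation: the redirect limit is an assumed value, and the malformed-URL check here is just "the URL parses without a scheme or host."

```python
from urllib.parse import urlparse

MAX_REDIRECTS = 20  # assumed threshold; mabl's actual limit is not documented here

def is_broken(status_code: int, redirect_count: int, url: str) -> bool:
    """Return True if a link falls into one of the broken-link categories."""
    # Malformed URL: parsing yields no scheme or no host
    parsed = urlparse(url)
    if not parsed.scheme or not parsed.netloc:
        return True
    # Too many redirects before a final response
    if redirect_count > MAX_REDIRECTS:
        return True
    # 4xx or 5xx final status response
    return 400 <= status_code <= 599
```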
Checking for broken links on a page
If the link matches the domain of the application URL and has a valid content type, the link is added to a page queue.
For example, if the application URL is https://app.example.com, the link crawler visits https://app.example.com/support, but it does not visit https://example.com/blog because the domain doesn't match the application URL's domain.
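The domain-matching rule can be sketched as a host comparison. This is a simplified illustration, assuming an exact match on the URL's host; it does not cover the content-type check or any subdomain handling mabl may apply.

```python
from urllib.parse import urlparse

def should_enqueue(app_url: str, link: str) -> bool:
    """Queue a link for a visit only if its host matches the application URL's host."""
    return urlparse(link).netloc == urlparse(app_url).netloc

# should_enqueue("https://app.example.com", "https://app.example.com/support") -> True
# should_enqueue("https://app.example.com", "https://example.com/blog") -> False
```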
Valid content types include:
Visiting pages in the queue
When the link crawler finishes checking the links on the current page, it visits the next page in the page queue and checks its links, repeating this process until it has visited every page in the queue.
The link crawler test can visit a maximum of 500 pages and check a maximum of 10,000 unique links.
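The queue-driven traversal described above amounts to a breadth-first crawl with a page cap and a link cap. The sketch below illustrates that shape under stated assumptions: `extract_links` is a hypothetical stand-in for loading a page and reading each anchor element's href, and the same-domain test is an exact host match.

```python
from collections import deque
from urllib.parse import urlparse

MAX_PAGES = 500      # maximum pages the crawler visits
MAX_LINKS = 10_000   # maximum unique links the crawler checks

def crawl(app_url, extract_links):
    """Breadth-first visit of same-domain pages, collecting unique links.

    extract_links(page_url) is a placeholder for loading a page and
    gathering its anchor hrefs; it is not a mabl API.
    """
    app_host = urlparse(app_url).netloc
    queue = deque([app_url])
    enqueued = {app_url}   # pages already queued for a visit
    checked = set()        # unique links already collected for checking
    while queue:
        page = queue.popleft()
        for link in extract_links(page):
            if link in checked or len(checked) >= MAX_LINKS:
                continue
            checked.add(link)  # each unique link is checked only once
            # Same-domain pages join the queue until the page cap is reached
            if (urlparse(link).netloc == app_host
                    and link not in enqueued
                    and len(enqueued) < MAX_PAGES):
                enqueued.add(link)
                queue.append(link)
    return checked
```

In this sketch, off-domain links are still checked once but never visited, mirroring the behavior described above.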
Generating the broken links report
At the end of the test, the link crawler generates a broken link report with a list of links that returned an error response. Click on the download button to download the broken links report as a CSV.
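As an illustration of what a broken-links CSV might contain, the sketch below writes one row per failing link. The column names are assumptions for this example; mabl's actual report schema may differ.

```python
import csv
import io

def write_broken_links_report(broken_links, out):
    """Write broken links as CSV rows; columns here are illustrative only."""
    writer = csv.writer(out)
    writer.writerow(["url", "status", "found_on"])
    for link in broken_links:
        writer.writerow([link["url"], link["status"], link["found_on"]])

buf = io.StringIO()
write_broken_links_report(
    [{"url": "https://app.example.com/old-page",
      "status": 404,
      "found_on": "https://app.example.com"}],
    buf,
)
```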