A lightweight and fast tool to help you keep your Markdown files free of broken links.
This handy tool helps you maintain the integrity of your Markdown files by identifying broken links. It scans your files and detects:

- Web links that return error responses (e.g. `404: Not Found`)
- Relative paths that point to files which do not exist
- Fragments that are missing from the target file (e.g. `README.md#no-fragment`)
Example of output for `fail.md`:

```
File: tests/test_md_files/fail.md:3 • Link: https://github.com/AlexanderDokuchaev/FAILED • Error: 404: Not Found
File: tests/test_md_files/fail.md:4 • Link: https://not_exist_github.githubcom/ • Error: 500: Internal Server Error
File: tests/test_md_files/fail.md:8 • Link: /test/fail.md1 • Error: Path not found
File: tests/test_md_files/fail.md:9 • Link: fail.md1 • Error: Path not found
File: tests/test_md_files/fail.md:13 • Link: /tests/test_md_files/fail.md#fail • Error: Fragment not found
File: tests/test_md_files/fail.md:15 • Link: not_exist_dir • Error: Path not found
❌ Found 6 dead links 🙀
```
> [!NOTE]
> By default, only error codes like 404 (Not Found), 410 (Gone), and 500 (Internal Server Error), as well as links whose targets do not exist, are considered "dead links". Other error codes typically indicate temporary issues with the host server, or a server that does not support HEAD requests.
Add a GitHub Action config to `.github/workflows/`:
```yaml
jobs:
  md-dead-link-check:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - uses: AlexanderDokuchaev/[email protected]
```
To integrate with the pre-commit tool, add the following to your `.pre-commit-config.yaml`:

```yaml
  - repo: https://github.com/AlexanderDokuchaev/md-dead-link-check
    rev: "v0.9"
    hooks:
      - id: md-dead-link-check
```
> [!NOTE]
> For the `pull_request` event type, the action only checks external links in files that have been modified. To scan all links, consider using a separate action that runs periodically on target branches. This approach prevents pull request merges from being blocked by broken links unrelated to the files modified in the pull request.
```yaml
# .github/workflows/nightly.yaml
name: nightly
on:
  workflow_dispatch:
  schedule:
    - cron: '0 0 * * *'
jobs:
  md-dead-link-check:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - uses: AlexanderDokuchaev/[email protected]
```
```yaml
# .github/workflows/pull_request.yaml
name: pull_request
on:
  pull_request:
    types:
      - opened
      - reopened
      - synchronize
jobs:
  md-dead-link-check:
    runs-on: ubuntu-22.04
    steps:
      - uses: actions/checkout@v4
      - uses: AlexanderDokuchaev/[email protected]
```
For direct use, install with pip and run:

```bash
pip install md-dead-link-check
md-dead-link-check
```
This tool utilizes asynchronous API calls and avoids downloading full web pages, enabling it to process thousands of links in a few seconds.
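The speedup from concurrent checking can be sketched with a small stand-in example. The `check_link` coroutine below only simulates network latency; it is not the tool's actual implementation:

```python
import asyncio
import time

async def check_link(url: str) -> tuple[str, int]:
    # Stand-in for a real HEAD request; simulates ~100 ms of network latency.
    await asyncio.sleep(0.1)
    return url, 200

async def check_all(urls: list[str]) -> list[tuple[str, int]]:
    # All requests run concurrently, so total time is close to a single request.
    return await asyncio.gather(*(check_link(u) for u in urls))

urls = [f"https://example.com/page{i}" for i in range(100)]
start = time.perf_counter()
results = asyncio.run(check_all(urls))
elapsed = time.perf_counter() - start
print(f"checked {len(results)} links in {elapsed:.2f}s")
```

Run sequentially, these 100 simulated checks would take about 10 seconds; run concurrently, they finish in roughly the latency of one request.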
This tool leverages your system's existing HTTP and HTTPS proxy configuration by trusting the environment variables your operating system uses to define proxy settings. This behavior is enabled via the `aiohttp.ClientSession(trust_env=True)` option; for further technical details, refer to the aiohttp documentation.
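These are the same environment variables the Python standard library reads, so the proxy settings that `trust_env=True` will pick up can be inspected with `urllib` (the proxy address below is hypothetical):

```python
import os
import urllib.request

# Hypothetical proxy address, set only for this demonstration.
os.environ["https_proxy"] = "http://proxy.example.com:8080"

# urllib reads the same environment variables that aiohttp trusts
# when ClientSession is created with trust_env=True.
proxies = urllib.request.getproxies()
print(proxies.get("https"))
```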
> [!WARNING]
> Without a proxy configuration in the environment, link failures may not be reported. If your environment lacks proxy settings (variables like `http_proxy` and `https_proxy`), link retrieval attempts may time out without indicating a failure. To help diagnose this issue, use the `--warn` argument to log all processed links.
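In practice, this means exporting the proxy variables before running the checker. A minimal sketch, assuming a hypothetical proxy address:

```shell
# Hypothetical proxy address; substitute the one for your network.
export http_proxy="http://proxy.example.com:8080"
export https_proxy="http://proxy.example.com:8080"
echo "proxy configured: $https_proxy"
# Then run the checker with --warn to log every processed link:
# md-dead-link-check --warn
```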
This tool seamlessly integrates with your project's `pyproject.toml` file for configuration. To use a different file, pass the `--config` option during execution.
- `timeout`: timeout for web link requests, in seconds. Default: `5`
- `catch_response_codes`: response codes treated as dead links. Default: `[404, 410, 500]`
- `exclude_links`: links to skip during checks. Default: `[]`
- `exclude_files`: files to skip during checks. Default: `[]`
- `force_get_requests_for_links`: links to check with `GET` requests during checks. Default: `[]`
- `check_web_links`: whether to check web links. Default: `true`
- `validate_ssl`: whether to validate SSL certificates. Default: `true`
> [!TIP]
> Leverage wildcard patterns (fnmatch syntax) for the `exclude_links`, `exclude_files`, and `force_get_requests_for_links` parameters.
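Since these patterns follow Python's `fnmatch` syntax, their behavior can be previewed with the standard library. The paths below echo the example configuration values:

```python
from fnmatch import fnmatch

# Patterns as they might appear in exclude_files / exclude_links.
print(fnmatch("tests/test_md_files/fail.md", "tests/*"))                # True
print(fnmatch("docs/guide.md", "tests/*"))                              # False
print(fnmatch("https://github.com/user/repo", "https://github.com/*"))  # True
```

Note that `*` in fnmatch matches any characters, including path separators.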
```toml
[tool.md_dead_link_check]
timeout = 5
exclude_links = ["https://github.com/", "https://github.com/*"]
exclude_files = ["tests/test_md_files/fail.md", "tests/*"]
check_web_links = true
catch_response_codes = [404, 410, 500]
force_get_requests_for_links = []
validate_ssl = true
```