- Increase the timeout for requests that check whether web URLs are alive, defaulting to 15 seconds.
- Consider status codes in the 2xx and 3xx ranges as valid URLs.
- Add Accept and User-Agent headers to the requests, which some websites require.
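A minimal sketch of the liveness check these entries describe, using the standard library; the helper names, header values, and HEAD-only probe here are illustrative assumptions, not the package's actual API:

```python
import urllib.error
import urllib.request

# Browser-like headers, since some websites reject requests that
# lack Accept or User-Agent (values here are placeholders).
HEADERS = {
    "Accept": "text/html,application/xhtml+xml,*/*",
    "User-Agent": "Mozilla/5.0 (compatible; markdown-checker)",
}

def is_status_valid(status_code: int) -> bool:
    # 2xx and 3xx responses count as a live URL.
    return 200 <= status_code < 400

def check_url(url: str, timeout: float = 15.0) -> bool:
    # 15-second default timeout, matching the changelog entry.
    request = urllib.request.Request(url, headers=HEADERS, method="HEAD")
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return is_status_valid(response.status)
    except urllib.error.URLError:
        return False
```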
0.2.3
- Skip another domain (contributed by IsuminI).
0.2.2
- Change broken-URL flagging to always try both HEAD and GET requests on any URL before flagging it as broken.
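A sketch of this HEAD-then-GET fallback under stated assumptions: the function names are hypothetical, and the injectable `check` parameter is added here purely so the logic can be exercised without the network.

```python
import urllib.error
import urllib.request

def _request_ok(url: str, method: str, timeout: float = 15.0) -> bool:
    # Issue a single request and report whether it succeeded.
    request = urllib.request.Request(url, method=method)
    try:
        with urllib.request.urlopen(request, timeout=timeout) as response:
            return 200 <= response.status < 400
    except (urllib.error.URLError, ValueError):
        return False

def is_broken(url: str, check=_request_ok) -> bool:
    # Try the cheap HEAD first; some servers reject HEAD, so fall
    # back to a full GET before flagging the URL as broken.
    if check(url, "HEAD"):
        return False
    return not check(url, "GET")
```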
0.2.1
- Fix a command-line `list[str]` type issue and use `click.IntRange` for the retries and timeout options.
0.2.0
- Redesign the package.
- Port to Click instead of arg_parser.
- Expose options to external users to allow for more customization.
- Increase path coverage by including paths that start with `/` or with nothing.
- Add retries for URLs before flagging them as broken.
- Perform a HEAD request on each URL, falling back to GET; if both fail, flag the URL as broken once the retry count is exhausted.
- Analyze all web URLs except the ones in the skip_domains list.
- Change the syntax of terminal comments to improve readability.
- Add a spinner to indicate that the tool is working (not compatible with all terminals).
- Add documentation for the new features.
- Use multiprocessing for checking broken URLs, reducing the execution time by 50%.
- Add support for GitHub automatic annotations.
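The retry and multiprocessing entries above could be sketched as follows; the function names, pool size, and retry count are illustrative assumptions, and the per-URL `check` callable stands in for whatever probe the tool actually runs:

```python
import multiprocessing

def check_with_retries(url, check, retries: int = 3) -> bool:
    # Repeat the per-URL probe up to `retries` times; a single
    # success is enough to consider the URL alive.
    for _ in range(retries):
        if check(url):
            return True
    return False

def find_broken_urls(urls, check, processes: int = 4):
    # Fan the per-URL checks out across a process pool; the
    # changelog reports this cut execution time by roughly 50%.
    with multiprocessing.Pool(processes) as pool:
        results = pool.map(check, urls)
    return [url for url, alive in zip(urls, results) if not alive]
```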
0.1.5
- Increase the timeout for requests that check whether web URLs are alive. https://github.com/john0isaac/markdown-checker/pull/52