Scrapydweb

Latest version: v1.5.7

1.5.5

------------------
- Fix dialog shown on AJAX request error
- Fix 500 error when the jobs list is empty
- Default tables to 25 items per page
- Cap jobs at 3000 by default until pagination is implemented

1.5.4

------------------
- Add dynamic jobs page updates without reloading the page
- Don't modify the timer task name when editing a timer task
- Select `Add` by default when creating or editing a timer task
- Load the latest version of the spider by default when creating or editing a timer task

1.5.3

------------------
- Use `max_instances` to limit the number of instances of the same spider running on the same node
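
The `max_instances` cap follows the same idea as APScheduler's parameter of that name: if a scheduled run fires while the allowed number of runs of that spider is still active, the new run is skipped rather than stacked. A minimal pure-Python sketch of the idea (the class and method names here are illustrative, not scrapydweb internals):

```python
import threading

class InstanceLimiter:
    """Skip a new run when `max_instances` runs of the same spider are active."""

    def __init__(self, max_instances=1):
        self._sem = threading.BoundedSemaphore(max_instances)

    def try_run(self, spider_fn):
        # Non-blocking acquire: if the limit is already reached, skip this firing.
        if not self._sem.acquire(blocking=False):
            return False  # run skipped
        try:
            spider_fn()
            return True
        finally:
            self._sem.release()

limiter = InstanceLimiter(max_instances=1)
ran = limiter.try_run(lambda: None)  # → True: no other instance was active
```

With `max_instances=1`, a long-running spider simply causes subsequent timer firings to be skipped until it finishes, instead of piling up duplicate crawls on the node.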

1.5.0

------------------
- Update dependencies, e.g. Flask==2.0.0
- Support Python 3.6 to 3.12

1.4.1

1.4.0

------------------
- New Features
  - Add API for sending text or alert via Slack, Telegram, or Email
- Improvements
  - UI improvements on sidebar and multinode buttons
- Others
  - Update config file to scrapydweb_settings_v10.py
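
The send-text API is a plain HTTP endpoint, so any HTTP client can trigger a Slack, Telegram, or Email message. The sketch below builds such a request URL with only the standard library; note that the `/1/api/sendtext/<channel>/<text>` route is an assumption for illustration, so check your scrapydweb instance for the actual path:

```python
from urllib import parse

def build_sendtext_url(base, channel, text):
    """Build a URL for scrapydweb's send-text API.

    NOTE: the '/1/api/sendtext/<channel>/<text>' route is a placeholder --
    consult the scrapydweb docs for the real one. `channel` would be one of
    'slack', 'telegram', or 'email'.
    """
    return "{}/1/api/sendtext/{}/{}".format(
        base.rstrip("/"), channel, parse.quote(text, safe=""))

url = build_sendtext_url("http://127.0.0.1:5000", "slack", "job finished")
# Actually sending would be e.g. urllib.request.urlopen(url)
# against a running scrapydweb instance.
```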


[1.3.0](https://github.com/my8100/scrapydweb/issues?q=is%3Aclosed+milestone%3A1.3.0) (2019-08-04)
------------------
- New Features
  - Add new pages Node Reports and Cluster Reports for aggregating jobs stats [(issue 72)](https://github.com/my8100/scrapydweb/issues/72)
- Improvements
  - Adapt to [:link: *LogParser*](https://github.com/my8100/logparser) v0.8.2
  - Add DATA_PATH option for customizing the path to save program data [(issue 40)](https://github.com/my8100/scrapydweb/issues/40)
  - Add DATABASE_URL option to support MySQL or PostgreSQL backend [(issue 42)](https://github.com/my8100/scrapydweb/issues/42)
  - Support specifying the latest version of a Scrapy project in the Run Spider page [(issue 4)](https://github.com/my8100/scrapydweb/issues/4#issuecomment-475145676)
  - Support specifying default values of settings & arguments in the Run Spider page [(issue 55)](https://github.com/my8100/scrapydweb/issues/55)
- Others
  - Update config file to scrapydweb_settings_v9.py
  - Support continuous integration (CI) on [CircleCI](https://circleci.com/)
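
The DATA_PATH and DATABASE_URL options introduced in 1.3.0 go in the versioned settings file (scrapydweb_settings_v9.py for this release). A sketch of what those entries might look like; the paths and credentials below are illustrative examples, not defaults:

```python
# scrapydweb_settings_v9.py -- illustrative values, not defaults.

# Customize where scrapydweb stores its program data (issue 40).
DATA_PATH = '/var/lib/scrapydweb'

# Point scrapydweb at a MySQL or PostgreSQL backend instead of the
# default SQLite files (issue 42). The URL scheme selects the backend,
# e.g. 'mysql://user:passwd@127.0.0.1:3306'
#   or 'postgres://user:passwd@127.0.0.1:5432'
DATABASE_URL = 'postgres://user:passwd@127.0.0.1:5432'
```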
