Scrapyd

Latest version: v1.4.3


1.4.1

Fixed

- Encode the `FEEDS` command-line argument as JSON.
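For context, Scrapy 2.1+ expects the `FEEDS` setting to be a mapping of feed URI to feed options, so the value passed on the command line must be valid JSON. A minimal illustrative sketch of such encoding (helper name and the `jsonlines` format are assumptions, not Scrapyd's actual code):

```python
import json

def feeds_argument(feed_uri, feed_format="jsonlines"):
    # Scrapy 2.1+ parses FEEDS as a JSON object mapping URI -> options,
    # so the value passed via `-s FEEDS=...` must be JSON-encoded.
    # Illustrative sketch only, not Scrapyd's actual implementation.
    return json.dumps({feed_uri: {"format": feed_format}})
```

For example, `feeds_argument("items.jl")` yields a JSON object with `items.jl` as the key, which Scrapy can parse back into its `FEEDS` dict.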

1.4.0

Added

- Add `item_url` and `log_url` to the response from the listjobs.json webservice. (mxdev88)
- Scrapy 2.8 support. Scrapyd sets the `LOG_FILE` and `FEEDS` command-line arguments instead of the `SCRAPY_LOG_FILE` and `SCRAPY_FEED_URI` environment variables.
- Python 3.11 support.
- Python 3.12 support. Use `packaging.version.Version` instead of `distutils.LooseVersion`. (pawelmhm)
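An illustrative shape of a listjobs.json response including the new fields (project, spider, and job ID values here are hypothetical; fields other than `item_url` and `log_url` are shown only for context):

```json
{
  "status": "ok",
  "finished": [
    {
      "id": "6487ec79947edab326d6db28a2d86511",
      "project": "myproject",
      "spider": "spider1",
      "log_url": "/logs/myproject/spider1/6487ec79947edab326d6db28a2d86511.log",
      "item_url": "/items/myproject/spider1/6487ec79947edab326d6db28a2d86511.jl"
    }
  ]
}
```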

Changed

- Rename environment variables to avoid spurious Scrapy deprecation warnings.

- `SCRAPY_EGG_VERSION` to `SCRAPYD_EGG_VERSION`
- `SCRAPY_FEED_URI` to `SCRAPYD_FEED_URI`
- `SCRAPY_JOB` to `SCRAPYD_JOB`
- `SCRAPY_LOG_FILE` to `SCRAPYD_LOG_FILE`
- `SCRAPY_SLOT` to `SCRAPYD_SLOT`
- `SCRAPY_SPIDER` to `SCRAPYD_SPIDER`

::: attention
::: title
Attention
:::

These are undocumented and unused, and may be removed in future versions. If you use these environment variables, please [report your use in an issue](https://github.com/scrapy/scrapyd/issues).
:::
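For code that currently reads these variables, a hedged sketch of supporting both the new and old names during migration (function name is an assumption for illustration):

```python
import os

def scrapyd_job_id(environ=os.environ):
    # Prefer the new SCRAPYD_JOB name (Scrapyd >= 1.4), falling back to
    # the old SCRAPY_JOB name for older versions. Illustrative sketch only.
    return environ.get("SCRAPYD_JOB") or environ.get("SCRAPY_JOB")
```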

Removed

- Scrapy 1.x support.
- Python 3.6 support.
- Unmaintained files (Debian packaging) and unused code (`scrapyd/script.py`).

Fixed

- Print Scrapyd's version instead of Twisted's version with the `--version` (`-v`) flag. (niuguy)
- Override Scrapy's `LOG_STDOUT` setting to `False` to suppress logging output for the listspiders.json webservice. (Lucioric2000)

1.3.0

Added

- Support for HTTP authentication in the Scrapyd server.
- Jobs website shortcut to cancel a job using the cancel.json webservice.
- Make the project argument to listjobs.json optional,
  so that all jobs can easily be queried.
- Python 3.7, 3.8, 3.9 and 3.10 support.
- Configuration option for the job storage class.
- Configuration option for the egg storage class.
- Improved HTTP headers in the webservice.
- Improved test coverage.
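The HTTP authentication added here is configured in scrapyd.conf; a minimal fragment (the credential values are placeholders):

```ini
[scrapyd]
username = myuser
password = mypass
```

With these set, clients must supply HTTP basic auth credentials when calling the webservices.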

Removed

- Python 2 support.
- Python 3.3 support (although never officially supported).
- Python 3.4 support.
- Python 3.5 support.
- PyPy 2 support.
- Documentation for Ubuntu installs, since Zyte no longer maintains the Ubuntu repository.

Fixed

- Scrapyd now respects Scrapy's `TWISTED_REACTOR` setting.
- Replaced the deprecated `SafeConfigParser` with `ConfigParser`.

1.2.1

Fixed

- HTTP header types were breaking newer Twisted versions.
- `DeferredQueue` was hiding a pending job when reaching `max_proc`.
- The string types of AddVersion's arguments were breaking the environment on Windows.
- Tests: updated binary eggs to be Scrapy 1.x compatible.

1.2.0

The highlight of this release is the long-awaited Python 3 support.

The new Scrapy requirement is version 1.0 or higher.
Python 2.6 is no longer supported by Scrapyd.

Some unused SQLite utilities are now deprecated
and will be removed from a later Scrapyd release.
Instantiating them or subclassing from them
will trigger a deprecation warning.

These are located under `scrapyd.sqlite`:

- SqliteDict
- SqlitePickleDict
- SqlitePriorityQueue
- PickleSqlitePriorityQueue
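The deprecation follows the usual warn-on-instantiation pattern; a simplified stand-in sketch of how such a shim works (this is not Scrapyd's actual code):

```python
import warnings

class SqliteDict:
    """Simplified stand-in illustrating the deprecation pattern."""

    def __init__(self, *args, **kwargs):
        # Emitting DeprecationWarning at instantiation time lets callers
        # keep working while being nudged to migrate.
        warnings.warn(
            "SqliteDict is deprecated and will be removed in a later release",
            DeprecationWarning,
            stacklevel=2,
        )
```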

Added

- Include the run's PID in the listjobs webservice.
- Include full tracebacks from Scrapy when failing to get the spider list.
  This will lead to noisier webservice output
  but will make debugging deployment problems much easier.
- Include start/finish time in the daemon's joblist page.
- Twisted 16 compatibility.
- Python 3 compatibility.
- Make the console script executable.
- Project version argument in the schedule webservice.
- Configuration option for the website root class.
- Optional jobid argument to the schedule webservice.
- Contribution documentation.
- Daemon status webservice.
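The version and jobid arguments above are passed as `_version` and `jobid` form fields to schedule.json; a hedged sketch of building such a request body (the helper name is an assumption for illustration):

```python
from urllib.parse import urlencode

def schedule_body(project, spider, version=None, jobid=None):
    # Build the form body for POST /schedule.json.
    # `_version` and `jobid` are optional fields in the Scrapyd API.
    params = {"project": project, "spider": spider}
    if version is not None:
        params["_version"] = version
    if jobid is not None:
        params["jobid"] = jobid
    return urlencode(params)
```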

Removed

- Scrapyd's bind_address now defaults to 127.0.0.1, instead of 0.0.0.0,
  to listen only for connections from the local host.
- Scrapy < 1.0 compatibility.
- Python < 2.7 compatibility.
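To restore the old behavior of listening on all interfaces, `bind_address` can be set explicitly in scrapyd.conf:

```ini
[scrapyd]
bind_address = 0.0.0.0
```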

Fixed

- Poller race condition for concurrently accessed queues

1.1.1

Removed

- Disabled the bdist_wheel command in setup so that dynamic requirements work
  despite the pip 7 wheel caching bug.

Fixed

- Use the correct type adapter for sqlite3 blobs.
  On some systems, a wrong type adapter leads to incorrect buffer reads/writes.
- `FEED_URI` was always overridden by Scrapyd.
- Specified maximum versions for requirements that became incompatible.
- Marked the package as zip-unsafe because twistd requires a plain `txapp.py`.
- Don't install zipped Scrapy in the Python 2.6 CI environment
  because its setup doesn't include the `scrapy/VERSION` file.

Added

- Enabled some missing tests for the SQLite queues.
- Enabled CI tests for Python 2.6, because it was supported by the 1.1 release.
- Documented missing config options and included them in default_scrapyd.conf.
- Noted the spider queue's `priority` argument in the scheduler's documentation.
