RQ

Latest version: v2.3.1


0.3.1

Not secure
(August 30th, 2012)

- `.enqueue()` now takes a `result_ttl` keyword argument that can be used to
change the expiration time of results.

- Queue constructor now takes an optional `async=False` argument to bypass the
worker (for testing purposes).

- Jobs now carry status information. To get job status information, like
whether a job is queued, finished, or failed, use the property `status`, or
one of the new boolean accessor properties `is_queued`, `is_finished` or
`is_failed`.

- Job return values are always stored explicitly, even if the job has no
explicit return value or returns `None` (with the given TTL, of course). This
makes it possible to distinguish between a job that explicitly returned
`None` and a job that isn't finished yet (see `status` property).
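As a minimal sketch (not RQ's actual implementation), this distinction works because an internal sentinel, rather than `None`, marks a missing result:

```python
# Sketch of why results must be stored explicitly: a sentinel marks
# "no result yet", so a stored None is unambiguously a real return value.
UNSET = object()  # sentinel distinct from any real return value

class Job:
    def __init__(self):
        self._result = UNSET
        self.status = 'queued'

    def finish(self, value):
        # store the return value even when it is None
        self._result = value
        self.status = 'finished'

    @property
    def is_finished(self):
        return self.status == 'finished'

job = Job()
assert not job.is_finished and job._result is UNSET  # not run yet
job.finish(None)
assert job.is_finished and job._result is None       # really returned None
```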

- Custom exception handlers can now be configured in addition to, or to fully
replace, moving failed jobs to the failed queue. Relevant documentation
[here](http://python-rq.org/docs/exceptions/) and
[here](http://python-rq.org/patterns/sentry/).

- `rqworker` now supports passing in configuration files instead of the
many command line options: `rqworker -c settings` will source
`settings.py`.

- `rqworker` now supports one-flag setup to enable Sentry as its exception
handler: `rqworker --sentry-dsn="http://public:secret@example.com/1"`.
Alternatively, you can use a settings file and configure `SENTRY_DSN
= 'http://public:secret@example.com/1'` instead.
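For reference, a hypothetical `settings.py` combining the two options above might look like this (the exact key names are assumptions; check the RQ docs for your version):

```python
# settings.py -- hypothetical config module for `rqworker -c settings`
REDIS_HOST = 'localhost'
REDIS_PORT = 6379
QUEUES = ['high', 'default', 'low']
SENTRY_DSN = 'http://public:secret@example.com/1'
```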

0.3

- Added the capability to create periodic (cron) and repeated jobs using `scheduler.enqueue`

0.3.0

Not secure
(August 5th, 2012)

- Reliability improvements

- Warm shutdown now exits immediately when Ctrl+C is pressed while the worker is idle
- Worker does not leak worker registrations anymore when stopped gracefully

- `.enqueue()` does not consume the `timeout` kwarg anymore. Instead, to pass
RQ a timeout value while enqueueing a function, use the explicit invocation:

```python
q.enqueue(do_something, args=(1, 2), kwargs={'a': 1}, timeout=30)
```


- Add a `job` decorator, which can be used to do Celery-style delayed
invocations:

```python
from redis import StrictRedis
from rq.decorators import job

# Connect to Redis
redis = StrictRedis()

@job('high', timeout=10, connection=redis)
def some_work(x, y):
    return x + y
```

Then, in another module, you can call `some_work`:

```python
from foo.bar import some_work

some_work.delay(2, 3)
```

0.2.2

Not secure
(August 1st, 2012)

- Fix bug where return values that couldn't be pickled crashed the worker

0.2.1

Not secure
(July 20th, 2012)

- Fix important bug where result data wasn't restored from Redis correctly
(affected non-string results only).

0.2.0

Not secure
(July 18th, 2012)

- `q.enqueue()` accepts instance methods now, too. Objects will be pickle'd
along with the instance method, so beware.
- `q.enqueue()` accepts string specification of functions now, too. Example:
`q.enqueue("my.math.lib.fibonacci", 5)`. Useful if the worker and the
submitter of work don't share code bases.
- Job can be assigned custom attrs and they will be pickle'd along with the
rest of the job's attrs. Can be used when writing RQ extensions.
- Workers can now accept explicit connections, like Queues.
- Various bug fixes.
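The string-specification feature above implies that the worker resolves the dotted path to a callable at run time. A minimal sketch of such a lookup, using a hypothetical `resolve()` helper (not RQ's actual internals):

```python
import importlib

def resolve(path):
    # split "pkg.mod.func" into a module path and an attribute name
    module_name, _, attr_name = path.rpartition('.')
    module = importlib.import_module(module_name)
    return getattr(module, attr_name)

# resolving a stdlib function the same way a worker might resolve
# a path like "my.math.lib.fibonacci"
sqrt = resolve('math.sqrt')
print(sqrt(9))  # 3.0
```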

