Support bulk enqueueing of jobs
Adds a new API `task.bulk_enqueue` which supports enqueueing jobs in bulk. It is up to each backend to support this -- the default behaviour falls back to enqueueing the jobs individually in a loop. Currently this is only more efficient on the Redis backends, where each batch becomes a single `LPUSH` operation.
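
For backends without native support, the fallback can be as simple as collecting each call inside the context manager and replaying it in a loop on exit. A minimal sketch of that shape, assuming hypothetical internal names (`Task.enqueue`, the `calls` list) rather than the actual implementation:

```python
from contextlib import contextmanager


class Task:
    def enqueue(self, **kwargs):
        ...  # hand a single job to the backend

    @contextmanager
    def bulk_enqueue(self):
        calls = []
        # Collect the keyword arguments for each job while the block runs.
        yield lambda **kwargs: calls.append(kwargs)
        # Default behaviour: enqueue each collected job individually.
        for kwargs in calls:
            self.enqueue(**kwargs)
```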
Usage looks like this:
```python
with my_task.bulk_enqueue() as enqueue:
    enqueue(the_ids=[42, 43])
    enqueue(the_ids=[45, 46])
```
This is equivalent to:
```python
my_task(the_ids=[42, 43])
my_task(the_ids=[45, 46])
```
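
On Redis, the win comes from `LPUSH` accepting multiple values, so the whole batch costs one round trip instead of one per job. A rough sketch of what a batched push might look like with redis-py -- the queue name and JSON serialisation here are placeholders, not the library's actual wire format:

```python
import json

import redis

client = redis.Redis()


def bulk_push(queue_name, jobs):
    # Serialise every job up front, then push the whole batch to the
    # queue's list with a single LPUSH call.
    payloads = [json.dumps(job) for job in jobs]
    client.lpush(queue_name, *payloads)
```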