Added
- whole-`Cluster` file upload/download methods (`read()`, `get()`, `put()`,
`read_csv()`); see the sketch after this list
- a default S3 `TransferConfig` can now be set in `hostess.config`
- `RefAlarm` tool for debugging reference leaks / drops
- `Cluster` and `Instance` now have `install_conda()` methods that perform
managed installation(s) of a Conda distribution of Python (by default Miniforge)
- `ServerPool` object for managed asynchronous dispatch of arbitrary numbers
of tasks to remote hosts
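
A minimal sketch of the new whole-`Cluster` transfer methods and `install_conda()`.
Only the method names come from this changelog; the import path, the way the
`Cluster` is constructed, and all arguments below are assumptions for illustration.

```python
# sketch only: import path, construction, and all arguments are assumptions
from hostess.aws.ec2 import Cluster

cluster = Cluster.launch(count=4, instance_type="t3.small")  # hypothetical kwargs
cluster.connect()  # block until every instance is reachable

# managed Conda installation on every instance (Miniforge by default)
cluster.install_conda()

# whole-Cluster file transfer, presumably fanned out across all instances
cluster.put("settings.toml", "settings.toml")     # upload a local file to each instance
cluster.get("results/output.json", "downloads/")  # download a remote file from each instance
texts = cluster.read("logs/run.log")              # read a remote file's contents
tables = cluster.read_csv("results/metrics.csv")  # read and parse a remote CSV
```
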
Changed
- `Cluster` can now be indexed by `Instance` names, instance IDs, and IP
addresses, as well as by their positions within the `Cluster`'s `instances` list
(see the indexing sketch after this list)
- `Cluster`'s wait-until-connected behavior is now available as a standalone
method (`connect()`), not just from `launch()`
- various `Cluster` methods can now be run in 'permissive' mode, returning
exceptions rather than raising them, so that instructions can still execute on
some instances even if attempts to execute them on other instances fail
- backed by `ServerPool`, `Cluster`'s `commandmap()` and `pythonmap()` now accept
arbitrary numbers of tasks, dispatching them to instances as those instances free
up (see the mapping sketch after this list)
- `hostess.caller.generic_python_endpoint()` now permits returning objects in
pickled and/or gzipped form (preparation for high-level features in a later version)
- `dict`s returned by the `price_per_hour()` methods of `Cluster` and `Instance`
now order their keys the same way
- assorted updates to documentation and examples
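
A sketch of the new indexing and standalone connection behavior; `cluster` is
assumed to be an already-launched `Cluster`, and the name, ID, and address values
are placeholders.

```python
cluster.connect()  # wait until instances are connected, independent of launch()

first = cluster[0]                       # positional index into cluster.instances
by_name = cluster["worker-0"]            # by Instance name
by_id = cluster["i-0123456789abcdef0"]   # by instance ID
by_ip = cluster["203.0.113.10"]          # by IP address
```
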
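A sketch of `ServerPool`-backed mapping and 'permissive' execution. `argseq` is
the parameter named in this changelog; everything else about these signatures,
including the `permissive` keyword, is an assumption.

```python
import statistics

# ServerPool hands each task to the next free instance, so the number of tasks
# can exceed the number of instances
results = cluster.commandmap(
    "gzip",                                                # assumed positional command
    argseq=[(f"data/part-{i}.csv",) for i in range(100)],  # 100 tasks
    permissive=True,  # assumed keyword: return exceptions rather than raising them
)

# pythonmap() does the same for Python callables (exact signature assumed)
means = cluster.pythonmap(statistics.mean, argseq=[([1, 2, 3],), ([4, 5, 6],)])
```
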
Fixed
- `Bucket.put_stream()` updated to work with new `Bucket` internals
- `console_stream_handlers()` no longer creates hidden duplicate stdout/stderr
caches
- edge case in `find_conda_env()` that sometimes prevented finding the most
recently created env
- edge case in `caller` JSON serialization for some nested objects containing
strings
- `Instance.conda_env()` now autoconnects as intended
Removed
- Because they now accept arbitrary numbers of tasks, `Cluster.commandmap()`
and `pythonmap()` no longer interpret sequences of bare strings passed to
`argseq` as individual tasks (due to both ambiguity and partial incompatibility
with `pythonmap()`'s signature). They will always interpret a sequence of bare
strings as a `tuple` of args to be included in all tasks (see the sketch below).
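
A sketch of the difference, assuming `cluster` is a connected `Cluster`; apart
from `argseq`, which is named above, the call layout is an assumption.

```python
# a flat sequence of bare strings is now always a single tuple of shared args
cluster.commandmap("echo", argseq=["alpha", "beta"])
# old behavior: two tasks ("echo alpha", "echo beta")
# new behavior: both "alpha" and "beta" are included in every task

# to dispatch distinct tasks, presumably give each task its own sequence of args
cluster.commandmap("echo", argseq=[("alpha",), ("beta",)])
```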