Dstack

0.15.0

Resources

It is now possible to configure resources in the YAML configuration file:

```yaml
type: dev-environment
```
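
For illustration, a configuration with a `resources` section might look like the following sketch (the property names, such as `memory` and `gpu`, are assumptions based on the feature description rather than a documented schema):

```yaml
type: dev-environment
ide: vscode

# Request hardware for the run (sketch; exact property names may differ)
resources:
  memory: 64GB
  gpu: 24GB
```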

0.14.0

OpenAI-compatible endpoints

With the latest update, we are extending the service configuration in dstack to enable you to optionally map your custom LLM to an OpenAI-compatible endpoint.

To learn more about how the new feature works, read our [blog post](https://dstack.ai/blog/2024/01/19/openai-endpoints-preview/) on it.
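
As a rough sketch, a service configuration with such a mapping could look like the following (the `model` section and its properties are illustrative assumptions based on the feature description, and the TGI image and commands are just one possible setup):

```yaml
type: service

image: ghcr.io/huggingface/text-generation-inference:latest
env:
  - MODEL_ID=mistralai/Mistral-7B-Instruct-v0.1
commands:
  - text-generation-launcher --port 8000 --trust-remote-code
port: 8000

# Map the service to an OpenAI-compatible chat endpoint (sketch)
model:
  type: chat
  name: mistralai/Mistral-7B-Instruct-v0.1
  format: tgi
```

Once such a service is deployed, it could be queried with any OpenAI-compatible client pointed at the gateway's endpoint.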

What's changed

* Make gateway active by Egor-S in https://github.com/dstackai/dstack/pull/829
* Implement OpenAI streaming for TGI by Egor-S in https://github.com/dstackai/dstack/pull/833
* Make get_latest_runner_build robuster for editable installs by Egor-S in https://github.com/dstackai/dstack/pull/834
* Fix descending logs by r4victor in https://github.com/dstackai/dstack/pull/839
* Reraise Jinja2 TemplateError by Egor-S in https://github.com/dstackai/dstack/pull/840


**Full Changelog**: https://github.com/dstackai/dstack/compare/0.13.1...0.14.0

0.13.1

Mounting repos via Python API

If you submit a task or a service via the Python API, you can now pass the `repo` argument to the `Client.runs.submit` method.

This argument accepts an instance of `dstack.api.LocalRepo` (which allows you to mount additional files to the run from a local folder), `dstack.api.RemoteRepo` (which allows you to mount additional files to the run from a remote Git repo), or `dstack.api.VirtualRepo` (which allows you to mount additional files to the run programmatically).

Here's an example:

```python
from dstack.api import Client, RemoteRepo

# Connect to the dstack server (assumes the default project configuration)
client = Client.from_config()

# Mount additional files from a remote Git repo
repo = RemoteRepo.from_url(
    repo_url="https://github.com/dstackai/dstack-examples",
    repo_branch="main",
)
client.repos.init(repo)

run = client.runs.submit(
    configuration=...,
    repo=repo,
)
```


This allows you to access the additional files in your run from the mounted repo.

More examples are now available in the [API documentation](https://dstack.ai/docs/reference/api/python/).

Note that the Python API is just one way to manage runs; another is the CLI, which automatically mounts the repo in the current folder.

Bug-fixes

Among other improvements, the update fixes an issue that previously prevented passing custom arguments to the run via `${{ run.args }}` in the YAML configuration.

Here's an example:

```yaml
type: task

python: "3.11" # (Optional) If not specified, your local version is used

commands:
  - pip install -r requirements.txt
  - python train.py ${{ run.args }}
```

Now, you can pass custom arguments to the run via `dstack run`:

```shell
dstack run . -f train.dstack.yml --gpu A100 --train_batch_size=1 --num_train_epochs=100
```


In this case `--train_batch_size=1 --num_train_epochs=100` will be passed to `python train.py`.

Contribution guide

Last but not least, we've extended our contribution guide with a [new wiki page](https://github.com/dstackai/dstack/wiki/How-to-add-a-backend) that guides you through the steps of adding a custom backend. This can be helpful if you decide to extend dstack with support for a custom backend (cloud provider).

Feel free to check it out and share your feedback. If you need help adding support for a custom backend, you can always ask our team for assistance.

0.13.0

Disk size

Previously, `dstack` set the disk size to `100GB` regardless of the cloud provider. Now, to accommodate larger language
models and datasets, `dstack` enables setting a custom disk size using `--disk` in `dstack run` or via the `disk`
property in `.dstack/profiles.yml`.
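
For example, a profile with a custom disk size might look like the following minimal sketch (the surrounding profile fields and the exact value format are assumptions):

```yaml
# .dstack/profiles.yml (sketch)
profiles:
  - name: large-disk
    disk: 200GB
    default: true
```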

0.12.4.post1

Bug-fixes

- Resolves issues related to TensorDock.
- Improves error handling: previously, server errors were only visible when the debug log level was set; now they appear regardless of the log level.
- Fixes a failure in `dstack.FineTuningTask` caused by a missing file.
- Lastly, if you're using dstack Cloud, make sure to update to this version for compatibility.

0.12.3

Vast.ai

With dstack `0.12.3`, you can now use `dstack` with Vast.ai, a marketplace providing GPUs from independent hosts at notably lower prices.

Configuring Vast.ai is very easy. Log into your [Vast AI](https://cloud.vast.ai/) account, click Account in the sidebar, and copy your
API Key.

Then, go ahead and configure the backend via `~/.dstack/server/config.yml`:

```yaml
projects:
  - name: main
    backends:
      - type: vastai
        creds:
          type: api_key
          api_key: d75789f22f1908e0527c78a283b523dd73051c8c7d05456516fc91e9d4efd8c5
```

Now you can restart the server and proceed to using the CLI or API for running development environments, tasks, and services.
