Huggingface-hub

Latest version: v0.29.1

```
You are about to delete tag v1.0 on model Wauplin/my-cool-model
Proceed? [Y/n] y
Tag v1.0 deleted on Wauplin/my-cool-model
```


For more details, check out the [CLI guide](https://huggingface.co/docs/huggingface_hub/main/en/guides/cli#huggingface-cli-tag).

* CLI Tag Functionality by bilgehanertan in 2172
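
For scripted workflows, the same operations are also available through the Python API. A minimal sketch, reusing the repo and tag names from the CLI example above:

```py
from huggingface_hub import HfApi

api = HfApi()

# Create then delete a tag on the example repo (names taken from the CLI example above).
api.create_tag("Wauplin/my-cool-model", tag="v1.0", tag_message="Release v1.0")
api.delete_tag("Wauplin/my-cool-model", tag="v1.0")
```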

🧩 ModelHubMixin

The `ModelHubMixin` got a set of nice improvements to generate model cards and to handle custom data types in the `config.json` file. More info in the [integration guide](https://huggingface.co/docs/huggingface_hub/main/en/guides/integrations#advanced-usage). A minimal sketch is shown after the list below.

* `ModelHubMixin`: more metadata + arbitrary config types + proper guide by Wauplin in 2230
* Fix ModelHubMixin when class is a dataclass by Wauplin in 2159
* Do not document private attributes of ModelHubMixin by Wauplin in 2216
* Add support for pipeline_tag in ModelHubMixin by Wauplin in 2228
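
As a rough sketch of the new metadata support (class name, library name and pipeline tag below are purely illustrative; `PyTorchModelHubMixin` is used as the concrete mixin):

```py
import torch.nn as nn
from huggingface_hub import PyTorchModelHubMixin

# Class-level kwargs end up in the generated model card / config.
# `library_name` and `pipeline_tag` values here are illustrative.
class MyModel(
    nn.Module,
    PyTorchModelHubMixin,
    library_name="my-cool-lib",
    pipeline_tag="text-classification",
):
    def __init__(self, hidden_size: int = 16):
        super().__init__()
        self.linear = nn.Linear(hidden_size, 2)

    def forward(self, x):
        return self.linear(x)

# model = MyModel(hidden_size=32)
# model.push_to_hub("username/my-cool-model")  # hypothetical repo id
```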

⚙️ Other

In a shared environment, it is now possible to set a custom token path via the `HF_TOKEN_PATH` environment variable, so that each user of the cluster has their own access token (see the sketch below).

* Support `HF_TOKEN_PATH` as environment variable by Wauplin in 2185
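
A minimal sketch of what this could look like (the path below is purely illustrative):

```py
import os

# Hypothetical per-user token file on a shared cluster.
os.environ["HF_TOKEN_PATH"] = "/shared/hf_tokens/alice_token"

# Import after setting the variable so that huggingface_hub picks it up.
from huggingface_hub import whoami

print(whoami())  # authenticated calls now read the token from the custom path
```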

Thanks to Y4suyuki and lappemic, most custom errors defined in `huggingface_hub` are now aggregated in the same module. This makes it very easy to import them with `from huggingface_hub.errors import ...` (see the sketch after the list below).

* Define errors in errors.py by Y4suyuki in 2170
* Define errors in errors file by lappemic in 2202
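
For instance, catching a Hub error could look like the sketch below (this assumes `HfHubHTTPError` is exposed in `huggingface_hub.errors`; the exact set of classes available there depends on your installed version):

```py
from huggingface_hub import hf_hub_download
from huggingface_hub.errors import HfHubHTTPError  # availability may depend on the version

try:
    hf_hub_download(repo_id="some-user/some-private-model", filename="config.json")
except HfHubHTTPError as err:
    print(f"Hub request failed: {err}")
```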

Fixed `HFSummaryWriter` (a class to seamlessly log tensorboard events to the Hub) to work with either the `tensorboardX` or the `torch.utils.tensorboard` implementation, depending on the user's setup. A usage sketch follows below.

* Import SummaryWriter from either tensorboardX or torch.utils by Wauplin in 2205
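
A usage sketch (the repo id is hypothetical; the writer behaves the same whichever backend is installed):

```py
from huggingface_hub import HFSummaryWriter

# Uses `tensorboardX` or `torch.utils.tensorboard` under the hood, whichever is available.
logger = HFSummaryWriter(repo_id="username/my-trainings", commit_every=5)  # hypothetical repo
logger.add_scalar("train/loss", 0.42, global_step=1)  # regular SummaryWriter API
```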

Listing files with `HfFileSystem` is now drastically faster, thanks to awgr. The values returned from the cache are no longer deep-copied, which was unfortunately the most time-consuming part of the process. If you want to modify values returned by `HfFileSystem`, you now need to copy them beforehand (see the sketch after the list below). This is expected to be a very limited drawback.

* fix: performance of _ls_tree by awgr in 2103
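
A minimal sketch of the copy-before-modify pattern (the repo path is hypothetical):

```py
from copy import deepcopy

from huggingface_hub import HfFileSystem

fs = HfFileSystem()
entries = fs.ls("datasets/some-user/some-dataset", detail=True)  # hypothetical repo path

# Returned values may come straight from the cache, so copy before mutating them.
my_entries = deepcopy(entries)
my_entries[0]["note"] = "safe to edit the copy, not the cached original"
```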

Progress bars in `huggingface_hub` got some flexibility!
It is now possible to provide a name to a tqdm bar (similar to `logging.getLogger`) and to enable/disable only some progress bars. More details in [this guide](https://huggingface.co/docs/huggingface_hub/main/en/package_reference/utilities#configure-progress-bars).

```py
>>> from huggingface_hub.utils import tqdm, disable_progress_bars
>>> disable_progress_bars("peft.foo")

# No progress bars for `peft.foo.bar`
>>> for _ in tqdm(range(5), name="peft.foo.bar"):
...     pass

# But for `peft` yes
>>> for _ in tqdm(range(5), name="peft"):
...     pass
100%|█████████████████| 5/5 [00:00<00:00, 117817.53it/s]
```


* Implement hierarchical progress bar control in huggingface_hub by lappemic in 2217

💔 Breaking changes

`--local-dir-use-symlinks` and `--resume-download`

As part of the download process revamp, some breaking changes have been introduced. However, we believe that the benefits outweigh the change cost. Breaking changes include:
- a `.cache/huggingface/` folder is now present at the root of the local dir. It only contains file locks, metadata and partially downloaded files. If needed, you can safely delete this folder without corrupting the data inside the root folder; however, you should expect a longer recovery time if you re-run your download command.
- `--local-dir-use-symlinks` is not used anymore and will be ignored. It is no longer possible to symlink your local dir to the cache directory. Thanks to the `.cache/huggingface/` folder, it shouldn't be needed anyway.
- `--resume-download` has been deprecated and will be ignored. Resuming failed downloads is now activated by default all the time. If you need to force a fresh download, use `--force-download` (or `force_download=True` in the Python API, as sketched below).
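
A minimal Python sketch of forcing a fresh download (the repo id reuses the example from this release note; the filename is hypothetical):

```py
from huggingface_hub import hf_hub_download

# Re-downloads the file even if a (possibly partial) copy already exists locally.
path = hf_hub_download(
    repo_id="Wauplin/my-cool-model",  # example repo from this section
    filename="config.json",           # hypothetical filename
    local_dir="./my-cool-model",
    force_download=True,
)
print(path)
```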

Inference Types

As part of 2237 (Grammar and Tools support), we've updated the return values of `InferenceClient.chat_completion` and `InferenceClient.text_generation` to match the TGI output exactly. The attributes of the returned objects did not change, but the class definitions themselves did. Expect errors if you previously had `from huggingface_hub import TextGenerationOutput` in your code. This is however not a common usage, since those objects are instantiated by `huggingface_hub` directly.
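
In other words, code that only accesses attributes on the returned objects keeps working. A sketch (the model id is just an example and must support chat completion):

```py
from huggingface_hub import InferenceClient

client = InferenceClient(model="HuggingFaceH4/zephyr-7b-beta")  # example model id
resp = client.chat_completion(
    messages=[{"role": "user", "content": "Hello!"}],
    max_tokens=16,
)
# Attribute access is unchanged; only the underlying class definitions moved.
print(resp.choices[0].message.content)
```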

Expected breaking changes

Some other breaking changes were expected (and announced since 0.19.x); a short migration sketch is given below:
- `list_files_info` is definitively removed in favor of `get_paths_info` and `list_repo_tree`
- `WebhookServer.run` is definitively removed in favor of `WebhookServer.launch`
- the `api_endpoint` argument of `ModelHubMixin`'s `push_to_hub` method is definitively removed in favor of the `HF_ENDPOINT` environment variable

Check 2156 for more details.
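
A short migration sketch for the removed `list_files_info` (the repo id is an example):

```py
from huggingface_hub import HfApi

api = HfApi()
# `list_repo_tree` (or `get_paths_info`) replaces the removed `list_files_info`.
for entry in api.list_repo_tree("Wauplin/my-cool-model"):
    print(entry.path)
```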

Small fixes and maintenance

⚙️ CI optimization

⚙️ fixes
* Fix HF_ENDPOINT not handled correctly by Wauplin in 2155
* Fix proxy if dynamic endpoint by Wauplin (direct commit on main)
* Update the note message when logging in to make it easier to understand and clearer by lh0x00 in 2163
* Fix URL when uploading to proxy by Wauplin in 2167
* Fix SafeTensorsInfo initialization by Wauplin in 2190
* Doc cli download timeout by zioalex in 2198
* Fix Typos in CONTRIBUTION.md and Formatting in README.md by lappemic in 2201
* change default model card by Wauplin (direct commit on main)
* Add returns documentation for save_pretrained by alexander-soare in 2226
* Update cli.md by QuinnPiers in 2242
* add warning tip that list_deployed_models only searches over cache by MoritzLaurer in 2241
* Respect default timeouts in `hf_file_system` by Wauplin in 2253
* Update harmonized token param desc and type def by lappemic in 2252
* Better document download attribute by Wauplin in 2250
* Correctly check inference endpoint is ready by Wauplin in 2229
* Add support for `updatedRefs` in WebhookPayload by Wauplin in 2169

⚙️ internal
* prepare for 0.23 by Wauplin in 2156
* lint by Wauplin (direct commit on main)
* quick fix by Wauplin (direct commit on main)
* Fix CI (inference tests, dataset viewer user, mypy) by Wauplin in 2208
* link by Wauplin (direct commit on main)
* Fix circular imports in eager mode? by Wauplin in 2211
* Drop generic from InferenceAPI framework list by Wauplin in 2240
* Remove test sort by acsending likes by Wauplin in 2243
* Delete legacy tests in `TestHfHubDownloadRelativePaths` + implicit delete folder is ok by Wauplin in 2259
* small doc clarification by julien-c [2261](https://github.com/huggingface/huggingface_hub/pull/2261)

Significant community contributions

The following contributors have made significant changes to the library over the last release:

* lappemic
* Fix Typos in CONTRIBUTION.md and Formatting in README.md ([2201](https://github.com/huggingface/huggingface_hub/pull/2201))
* Define errors in errors file ([2202](https://github.com/huggingface/huggingface_hub/pull/2202))
* [wip] Implement hierarchical progress bar control in huggingface_hub ([2217](https://github.com/huggingface/huggingface_hub/pull/2217))
* Update harmonized token param desc and type def ([2252](https://github.com/huggingface/huggingface_hub/pull/2252))
* bilgehanertan
* User API endpoints ([2147](https://github.com/huggingface/huggingface_hub/pull/2147))
* CLI Tag Functionality ([2172](https://github.com/huggingface/huggingface_hub/pull/2172))
* cjfghk5697
* 🌐 [i18n-KO] Translated `guides/repository.md` to Korean ([2124](https://github.com/huggingface/huggingface_hub/pull/2124))
* 🌐 [i18n-KO] Translated `package_reference/inference_client.md` to Korean ([2178](https://github.com/huggingface/huggingface_hub/pull/2178))
* 🌐 [i18n-KO] Translated `package_reference/utilities.md` to Korean ([2196](https://github.com/huggingface/huggingface_hub/pull/2196))
* SeungAhSon
* 🌐 [i18n-KO] Translated `guides/model_cards.md` to Korean ([2128](https://github.com/huggingface/huggingface_hub/pull/2128))
* 🌐 [i18n-KO] Translated `reference/login.md` to Korean ([2151](https://github.com/huggingface/huggingface_hub/pull/2151))
* 🌐 [i18n-KO] Translated package_reference/hf_file_system.md to Korean ([2174](https://github.com/huggingface/huggingface_hub/pull/2174))
* seoulsky-field
* 🌐 [i18n-KO] Translated `guides/community.md` to Korean ([2126](https://github.com/huggingface/huggingface_hub/pull/2126))
* Y4suyuki
* Define errors in errors.py ([2170](https://github.com/huggingface/huggingface_hub/pull/2170))
* harheem
* 🌐 [i18n-KO] Translated `guides/cli.md` to Korean ([2131](https://github.com/huggingface/huggingface_hub/pull/2131))
* 🌐 [i18n-KO] Translated `reference/inference_endpoints.md` to Korean ([2180](https://github.com/huggingface/huggingface_hub/pull/2180))
* seoyoung-3060
* 🌐 [i18n-KO] Translated `guides/search.md` to Korean ([2134](https://github.com/huggingface/huggingface_hub/pull/2134))
* 🌐 [i18n-KO] Translated `package_reference/file_download.md` to Korean ([2184](https://github.com/huggingface/huggingface_hub/pull/2184))
* 🌐 [i18n-KO] Translated package_reference/serialization.md to Korean ([2233](https://github.com/huggingface/huggingface_hub/pull/2233))
* boyunJang
* 🌐 [i18n-KO] Translated `guides/inference.md` to Korean ([2130](https://github.com/huggingface/huggingface_hub/pull/2130))
* 🌐 [i18n-KO] Translated `package_reference/collections.md` to Korean ([2214](https://github.com/huggingface/huggingface_hub/pull/2214))
* 🌐 [i18n-KO] Translated `package_reference/space_runtime.md` to Korean ([2213](https://github.com/huggingface/huggingface_hub/pull/2213))
* 🌐 [i18n-KO] Translated `guides/manage-spaces.md` to Korean ([2220](https://github.com/huggingface/huggingface_hub/pull/2220))
* nuatmochoi
* 🌐 [i18n-KO] Translated `guides/webhooks_server.md` to Korean ([2145](https://github.com/huggingface/huggingface_hub/pull/2145))
* 🌐 [i18n-KO] Translated `package_reference/cache.md` to Korean ([2191](https://github.com/huggingface/huggingface_hub/pull/2191))
* fabxoe
* 🌐 [i18n-KO] Translated `package_reference/tensorboard.md` to Korean ([2173](https://github.com/huggingface/huggingface_hub/pull/2173))
* 🌐 [i18n-KO] Translated `package_reference/inference_types.md` to Korean ([2171](https://github.com/huggingface/huggingface_hub/pull/2171))
* 🌐 [i18n-KO] Translated `package_reference/hf_api.md` to Korean ([2165](https://github.com/huggingface/huggingface_hub/pull/2165))
* 🌐 [i18n-KO] Translated `package_reference/mixins.md` to Korean ([2166](https://github.com/huggingface/huggingface_hub/pull/2166))
* junejae
* 🌐 [i18n-KO] Translated `guides/upload.md` to Korean ([2139](https://github.com/huggingface/huggingface_hub/pull/2139))
* 🌐 [i18n-KO] Translated `reference/repository.md` to Korean ([2189](https://github.com/huggingface/huggingface_hub/pull/2189))
* heuristicwave
* 🌐 [i18n-KO] Translating `guides/hf_file_system.md` to Korean ([2146](https://github.com/huggingface/huggingface_hub/pull/2146))
* usr-bin-ksh
* 🌐 [i18n-KO] Translated `guides/inference_endpoints.md` to Korean ([2164](https://github.com/huggingface/huggingface_hub/pull/2164))

```py
_update_metadata_model_index(existing_results, new_results, overwrite=True)
```
```
[{'dataset': {'name': 'IMDb', 'type': 'imdb'},
  'metrics': [{'name': 'Accuracy', 'type': 'accuracy', 'value': 0.999}],
  'task': {'name': 'Text Classification', 'type': 'text-classification'}}]
```


2. Add new metric to existing result

```py
new_results = deepcopy(existing_results)
new_results[0]["metrics"][0]["name"] = "Recall"
new_results[0]["metrics"][0]["type"] = "recall"
```
```
[{'dataset': {'name': 'IMDb', 'type': 'imdb'},
  'metrics': [{'name': 'Accuracy', 'type': 'accuracy', 'value': 0.995},
              {'name': 'Recall', 'type': 'recall', 'value': 0.995}],
  'task': {'name': 'Text Classification', 'type': 'text-classification'}}]
```


3. Add new result
```py
new_results = deepcopy(existing_results)
new_results[0]["dataset"] = {'name': 'IMDb-2', 'type': 'imdb_2'}
```
```
[{'dataset': {'name': 'IMDb', 'type': 'imdb'},
  'metrics': [{'name': 'Accuracy', 'type': 'accuracy', 'value': 0.995}],
  'task': {'name': 'Text Classification', 'type': 'text-classification'}},
 {'dataset': {'name': 'IMDb-2', 'type': 'imdb_2'},
  'metrics': [{'name': 'Accuracy', 'type': 'accuracy', 'value': 0.995}],
  'task': {'name': 'Text Classification', 'type': 'text-classification'}}]
```


* ENH Add update metadata to repocard by lvwerra in 844

Improvements and bug fixes

* Keras: Saving history in a JSON file by merveenoyan in 861
* space after uri by leondz in 866

0.29.1

This patch release includes two fixes:

- Fix revision bug in _upload_large_folder.py 2879
- bug fix in inference_endpoint wait function for proper waiting on update 2867

**Full Changelog**: https://github.com/huggingface/huggingface_hub/compare/v0.29.0...v0.29.1

0.29.0

We’re thrilled to announce the addition of four more outstanding serverless Inference Providers to the Hugging Face Hub: [Fireworks AI](https://fireworks.ai/), [Hyperbolic](https://hyperbolic.xyz/), [Nebius AI Studio](https://nebius.com/), and [Novita](https://novita.ai/). These providers join our growing ecosystem, enhancing the breadth and capabilities of serverless inference directly on the Hub’s model pages. This release adds official support for these four providers, making it super easy to use a wide variety of models with your preferred providers (a short usage sketch is given after the list below).

See our announcement blog for more details: https://huggingface.co/blog/new-inference-providers.

* Add Fireworks AI provider + instructions for new provider by Wauplin in 2848
* Add Hyperbolic provider by hanouticelina in 2863
* Add Novita provider by hanouticelina in 2865
* Nebius AI Studio provider added by Aktsvigun in 2866
* Add Black Forest Labs provider by hanouticelina in 2864

Note that Black Forest Labs is not yet supported on the Hub. Once we announce it, `huggingface_hub 0.29.0` will automatically support it.
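
As a rough usage sketch, assuming `"nebius"` is the provider identifier for Nebius AI Studio (the model id is an example; availability depends on the provider, so check the inference guide for exact values):

```python
from huggingface_hub import InferenceClient

# "nebius" as provider identifier is an assumption based on this release's provider list.
client = InferenceClient(provider="nebius", api_key="<provider_or_hf_token>")
completion = client.chat_completion(
    model="deepseek-ai/DeepSeek-R1",  # example model id
    messages=[{"role": "user", "content": "Say hello"}],
)
print(completion.choices[0].message.content)
```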

⚡ Other Inference updates

* Default to `base_url` if provided by Wauplin in 2805
* update supported models by hanouticelina in 2813
* [InferenceClient] Better handling of task parameters by hanouticelina in 2812
* Add YuE (music gen) from fal.ai by Wauplin in 2801
* [InferenceClient] Renaming `extra_parameters` to `extra_body` by hanouticelina in 2821
* fix automatic-speech-recognition output parsing by hanouticelina in 2826
* [Bot] Update inference types by HuggingFaceInfra in 2791
* Support inferenceProviderMapping as expand property by Wauplin in 2841
* Handle extra fields in inference types by Wauplin in 2839
* [InferenceClient] Add dynamic inference providers mapping by hanouticelina in 2836
* (misc) Deprecate some hf-inference specific features (wait-for-model header, can't override model's task, get_model_status, list_deployed_models) by Wauplin in 2851
* Partial revert 2851: allow task override on sentence-similarity by Wauplin in 2861
* Fix Inference Client VCR tests by hanouticelina in 2858
* update new provider doc by hanouticelina in 2870


💔 Breaking changes

None.

🛠️ Small fixes and maintenance

😌 QoL improvements
* dev(narugo): add resume for ranged headers of http_get function by narugo1992 in 2823

πŸ› Bug and typo fixes
* [Docs] Fix broken link in CLI guide documentation by hanouticelina in 2799
* fix: Replace urljoin for HF_ENDPOINT paths by anael-l in 2806
* InferenceClient some minor docstrings thingies by julien-c in 2810
* Do not send staging token to production by Wauplin in 2811
* Add `HF_DEBUG` environment variable for debugging/reproducibility by Wauplin in 2819
* Fix curlify by Wauplin in 2828
* Improve whoami() error messages by specifying token source by aniketqw in 2814
* Fix error message if invalid token on file download by Wauplin in 2847
* Fix test_dataset_info (missing dummy dataset) by Wauplin in 2850
* Fix is_jsonable if integer key in dict by Wauplin in 2857

πŸ—οΈ internal

* another test by Wauplin (direct commit on main)
* feat(ci): ignore unverified trufflehog results by Wauplin in 2837
* Add datasets and diffusers to prerelease tests by Wauplin in 2834
* Always proxy hf-inference calls + update tests by Wauplin in 2798
* Skip list_models(inference=...) tests in CI by Wauplin in 2852
* Deterministic test_export_folder (dduf tests) by Wauplin in 2854
* [cleanup] Unique constants in tests + env variable for inference tests by Wauplin in 2855
* feat: Adds a new environment variable HF_HUB_USER_AGENT_ORIGIN to set origin of calls in user-agent by Hugoch in 2869


Significant community contributions

The following contributors have made significant changes to the library over the last release:

* narugo1992
* dev(narugo): add resume for ranged headers of http_get function (2823)
* Aktsvigun
* Nebius AI Studio provider added (2866)

0.28.1

Release 0.28.0 introduced a bug making it impossible to set an `HF_ENDPOINT` environment variable whose value contains a subpath. This has been fixed in https://github.com/huggingface/huggingface_hub/pull/2807.

**Full Changelog**: https://github.com/huggingface/huggingface_hub/compare/v0.28.0...v0.28.1

0.28.0

⚡️ Unified Inference Across Multiple Inference Providers
<img width="1406" alt="Screenshot 2025-01-28 at 12 05 42" src="https://github.com/user-attachments/assets/5d0e8515-c895-46ee-8fba-96d31e40c2f3" />


The `InferenceClient` now supports third-party providers, offering a unified interface to run inference across multiple services while leveraging models from the Hugging Face Hub. This update enables developers to:
- **🌐 Switch providers seamlessly** - Transition between inference providers with a single interface.
- **🔗 Unified model IDs** - Always reference Hugging Face Hub model IDs, even when using external providers.
- **🔑 Simplified billing and access management** - You can use your Hugging Face token for routing to third-party providers (billed through your HF account).

A list of supported third-party providers can be found [here](https://huggingface.co/docs/huggingface_hub/main/en/guides/inference#supported-providers-and-tasks).


Example of text-to-image inference with [Replicate](https://replicate.com/):
```python
>>> from huggingface_hub import InferenceClient

>>> replicate_client = InferenceClient(
...     provider="replicate",
...     api_key="my_replicate_api_key",  # Using your personal Replicate key
... )
>>> image = replicate_client.text_to_image(
...     "A cyberpunk cat hacking neural networks",
...     model="black-forest-labs/FLUX.1-schnell",
... )
>>> image.save("cybercat.png")
```

Another example of chat completion with [Together AI](https://www.together.ai/):
```python
>>> from huggingface_hub import InferenceClient
>>> client = InferenceClient(
...     provider="together",  # Use Together AI provider
...     api_key="<together_api_key>",  # Pass your Together API key directly
... )
>>> client.chat_completion(
...     model="deepseek-ai/DeepSeek-R1",
...     messages=[{"role": "user", "content": "How many r's are there in strawberry?"}],
... )
```

When using external providers, you can choose between two access modes: either use the provider's native API key, as shown in the examples above, or route calls through Hugging Face infrastructure (billed to your HF account):
```python
>>> from huggingface_hub import InferenceClient
>>> client = InferenceClient(
...     provider="fal-ai",
...     token="hf_****",  # Your Hugging Face token
... )
```

⚠️ Parameter availability may vary between providers - check provider documentation.
🔜 New providers/models/tasks will be added iteratively in the future.
👉 You can find a list of supported tasks per provider and more details [here](https://huggingface.co/docs/huggingface_hub/main/en/guides/inference).



> - [InferenceClient] Add third-party providers support by hanouticelina in 2757
> - Unified `prepare_request` method + class-based providers by Wauplin in 2777
> - [InferenceClient] Support proxy calls for 3rd party providers by hanouticelina in 2781
> - [InferenceClient] Add `text-to-video` task and update supported tasks and models by hanouticelina in 2786
> - Add type hints for providers by Wauplin in 2788
> - [InferenceClient] Update inference documentation by hanouticelina in 2776
> - Add text-to-video to supported tasks by Wauplin in 2790

✨ HfApi
The following change aligns the client with server-side updates by adding new repository properties: `usedStorage` and `resourceGroup`. A usage sketch follows the quoted PR below.
> [HfApi] update list of repository properties following server side updates by hanouticelina in 2728
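
A hedged sketch of fetching one of the new properties (the expand value follows the PR above; the exact attribute names on the returned object may vary by version, and the repo id is an example):

```python
from huggingface_hub import HfApi

api = HfApi()
# "usedStorage" / "resourceGroup" are the newly supported expandable properties.
info = api.model_info("Wauplin/my-cool-model", expand=["usedStorage"])  # example repo id
print(info)
```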

Extends empty commit prevention to file copy operations, preserving clean version histories when no changes are made.
> [HfApi] prevent empty commits when copying files by hanouticelina in 2730

🌐 📚 Documentation

Thanks to WizKnight, the Hindi translation is much better!
> Improved Hindi Translation in Documentation 📝 by WizKnight in 2697

💔 Breaking changes

The `like` endpoint has been removed to prevent misuse. You can still remove existing likes using the `unlike` endpoint, as sketched below.
> [HfApi] remove `like` endpoint by hanouticelina in 2739
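
A minimal sketch of removing an existing like (the repo id is an example):

```python
from huggingface_hub import HfApi

api = HfApi()
# `HfApi.like()` is gone; `unlike()` still lets you remove an existing like.
api.unlike("Wauplin/my-cool-model")  # example repo id
```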

🛠️ Small fixes and maintenance

😌 QoL improvements

- [InferenceClient] flag `chat_completion()`'s `logit_bias` as UNUSED by hanouticelina in 2724
- Remove unused parameters from method's docstring by hanouticelina in 2738
- Add optional rejection_reason when rejecting a user access token by Wauplin in 2758
- Add `py.typed` to be compliant with PEP-561 again by hanouticelina in 2752
πŸ› Bug and typo fixes
- Fix super_squash_history revision not urlencoded by [Wauplin](https://huggingface.co/Wauplin) in #2795
- Replace model repo with repo in docstrings by albertvillanova in 2715
- [BUG] Fix 404 NOT FOUND issue caused by endpoint tail slash by Mingqi2 in 2721
- Fix `typing.get_type_hints` call on a `ModelHubMixin` by aliberts in 2729
- fix typo by qwertyforce in 2762
- rejection reason docstring by Wauplin in 2764
- Add timeout to WeakFileLock by Wauplin in 2751
- Fix `CardData.get()` to respect default values when `None` by hanouticelina in 2770
- Fix RepoCard.load when passing a repo_id that is also a dir path by Wauplin in 2771
- Fix filename too long when downloading to local folder by Wauplin in 2789
πŸ—οΈ internal
- Migrate to new Ruff "2025 style guide" formatter by hanouticelina in 2749
- remove org tokens tests by hanouticelina in 2759
- Fix `RepoCard` test on Windows by hanouticelina in 2774
- [Bot] Update inference types by HuggingFaceInfra in 2712
