spacy-transformers

Latest version: v1.3.5


1.2.5

* Extend support for `transformers` up to v4.30.x.

1.2.4

* Extend support for `transformers` up to v4.29.x.

1.2.3

* Extend support for `transformers` up to v4.28.x.
* Implement coalesced pooling over entire batches (368).

1.2.2

* `Transformer.predict`: do not broadcast to listeners, requires `spacy>=3.5.0` (345).
* Correct and clarify the handling of empty/zero-length `Doc`s during training and inference (365).
* Remove superfluous datatype and device conversions, requires `torch>=1.8.0` (369).
* Fix memory leak in offsets mapping alignment for fast tokenizers (373).
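
As a minimal sketch of the empty/zero-length `Doc` behavior noted above: a zero-length text should pass through the transformer component at inference time without errors. This assumes a trained transformer pipeline such as `en_core_web_trf` is installed; the pipeline name is only an example, not part of the release notes.

```python
import spacy

# Minimal check of the empty-Doc handling described in v1.2.2:
# a zero-length Doc should run through the transformer component cleanly.
# Assumes the trained `en_core_web_trf` pipeline is installed (example choice).
nlp = spacy.load("en_core_web_trf")

doc = nlp("")           # zero-length Doc
print(len(doc))         # 0 tokens
print(doc._.trf_data)   # transformer output is still defined for the empty Doc
```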

1.2.1

* Extend support for `transformers` up to v4.26.x.

1.2.0

* For fast tokenizers, use the offset mapping provided by the tokenizer (338).

Using the offset mapping instead of the heuristic alignment from `spacy-alignments` resolves unexpected and missing alignments such as those discussed in https://github.com/explosion/spaCy/discussions/6563, https://github.com/explosion/spaCy/discussions/10794 and https://github.com/explosion/spaCy/discussions/12023.

> :warning: Slow and fast tokenizers will no longer give identical results due to potential differences in the alignments between transformer tokens and spaCy tokens. We recommend retraining all models with fast tokenizers for use with `spacy-transformers` v1.2.
* Serialize the tokenizer `use_fast` setting (339).
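
For reference, a minimal sketch of where the `use_fast` setting lives when building a pipeline in code; the model name `roberta-base` is only an example, and the defaults for the rest of the transformer config are assumed to be filled in by spaCy's config merging:

```python
import spacy

# Sketch: `use_fast` is part of `tokenizer_config` in the TransformerModel
# architecture; per the entry above, v1.2 also serializes this setting with
# the pipeline. "roberta-base" is just an example model name.
nlp = spacy.blank("en")
nlp.add_pipe(
    "transformer",
    config={
        "model": {
            "@architectures": "spacy-transformers.TransformerModel.v3",
            "name": "roberta-base",
            "tokenizer_config": {"use_fast": True},
        }
    },
)
nlp.initialize()  # loads (and may download) the transformer weights and tokenizer
```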
