spacy-llm

Latest version: v0.7.3

0.4.1

✨ New features and improvements
- Verify authentication details at init (instead of at run) time for Anthropic and Cohere models (206)

🔴 Bug fixes
- Update OpenLLaMA model names after updates on HuggingFace (209)
- Fix incorrectly spelled model names for OpenAI models in migration guide (210)

👥 Contributors
honnibal, ines, rmitsch, svlandeg

0.4.0

✨ New features and improvements

- NEW: Refactored to transition from a backend-centric to a model-centric architecture. Note: this is a breaking change; you'll need to adjust your configs (176)
- NEW: Support for Falcon via HuggingFace (179)
- NEW: Extract prompt examples from component initialization (163)
- NEW: Summary task `spacy.Summary.v1` (181)
- NEW: Sentiment analysis task `spacy.Sentiment.v1` (200)
- More thorough check for label inconsistencies in the span-related tasks (NER, REL, SpanCat, TextCat) (183)
- Update `langchain` pin (196)
- Make fewshot file reader more robust w.r.t. file formats (184)

⚠️ Backwards incompatibilities
- Built-in support for MiniChain was dropped (176)
- The switch from a backend- to a model-centric architecture (176) requires light adjustments to your config. Check out the [migration guide](https://github.com/explosion/spacy-llm/blob/fb393388e4b85ef1ac9b033dcda489b83307ecca/migration_guide.md#03x-to-04x) to see how to update your config.
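
As a rough sketch of what that adjustment looks like (the registered model name below is illustrative; pick the one matching your provider from the migration guide):

```ini
; 0.3.x, backend-centric -- this block goes away:
; [components.llm.backend]
; @llm_backends = "spacy.REST.v1"
; api = "OpenAI"

; 0.4.x, model-centric -- declare a model instead:
[components.llm.model]
@llm_models = "spacy.GPT-3-5.v1"
```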

👥 Contributors
bdura, honnibal, ines, kabirkhan, koaning, rmitsch, shadeMe, svlandeg, vin-ivar

0.3.2

🔴 Bug fixes
- Use `doc._context` to ensure that `nlp.pipe(..., as_tuples=True)` works (188)
- Fix issue with caching that prevented the last doc in a cache batch from being cached with its LLM IO data (i.e. raw prompt and LLM response) (191)
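
For context, `nlp.pipe(..., as_tuples=True)` pairs each input text with a caller-supplied context object and yields `(doc, context)` tuples. A minimal sketch of that contract using only core spaCy (a blank pipeline, no `llm` component):

```python
import spacy

# A blank pipeline suffices to demonstrate the as_tuples contract.
nlp = spacy.blank("en")

data = [
    ("The quick brown fox.", {"id": 1}),
    ("Jumps over the lazy dog.", {"id": 2}),
]

# With as_tuples=True, pipe yields (Doc, context) pairs and passes
# the context object through unchanged alongside each Doc.
for doc, context in nlp.pipe(data, as_tuples=True):
    print(context["id"], doc.text)
```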

👥 Contributors
honnibal, ines, kabirkhan, rmitsch

0.3.1

✨ New features and improvements

- Make type validation optional with the new `validate_types` flag (178)
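
In config terms, the flag is an argument to the `llm` component factory; a minimal sketch (task and model blocks elided):

```ini
[components.llm]
factory = "llm"
validate_types = false
```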

🔴 Bug fixes
- Fix `nlp.pipe_labels` not working for the `llm` component (175)

👥 Contributors
honnibal, ines, kabirkhan, rmitsch

0.3.0

✨ New features and improvements

- NEW: Optional storing of prompts and responses in `Doc` objects (127)
- NEW: Optional logging of prompts and responses (80)
- NEW: Streamlit demo (102)
- NEW: Support for Cohere in backend `spacy.REST.v1` (165)
- NEW: Support for Anthropic in backend `spacy.REST.v1` (157)
- NEW: Support for OpenLLaMA via HuggingFace with the backend `spacy.OpenLLaMA_HF.v1` (151)
- NEW: Support for StableLM via HuggingFace with the backend `spacy.StableLM_HF.v1` (141)
- NEW: Lemmatization task `spacy.Lemma.v1` (164)

🔴 Bug fixes
- Fix bug with sending empty prompts if all `Doc` objects are cached (166)
- Fix issue with `LangChain` model creation due to updated argument name (162)

👥 Contributors
adrianeboyd, bdura, honnibal, ines, kabirkhan, ljvmiranda921, rmitsch, svlandeg, victorialslocum, vin-ivar

0.2.1

✨ New features and improvements

- NEW: `llm` component supports scoring, like other spaCy components (135)
- Labels for `spacy.NER.v2`, `spacy.REL.v1`, `spacy.SpanCat.v2` and `spacy.TextCat.v2` can be specified as a list (137)
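
A sketch of the two equivalent ways to specify labels in a task config (the NER task is used for illustration; the exact schema may differ across versions):

```ini
[components.llm.task]
@llm_tasks = "spacy.NER.v2"
; Previously only a comma-separated string was accepted:
; labels = "PERSON,ORG,LOC"
; Now a list works too:
labels = ["PERSON", "ORG", "LOC"]
```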

🔴 Bug fixes
- Fix type comparison in type checks failing on some platforms (158)
- Fix failing example 3 in the README (137)

👥 Contributors
adrianeboyd, bdura, honnibal, ines, kabirkhan, KennethEnevoldsen, rmitsch, svlandeg, vin-ivar
