spacy-llm

Latest version: v0.7.3

0.2.0

✨ New features and improvements

- NEW: New relation extraction task `spacy.REL.v1` ([114](https://github.com/explosion/spacy-llm/pull/114))
- NEW: New spancat task `spacy.SpanCat.v1` for entity recognition with overlapping spans ([101](https://github.com/explosion/spacy-llm/pull/101)); see the config sketch after this list
- NEW: Set prompt templates as strings or read them from files (e.g. Jinja templates) using `spacy.FileReader.v1` ([95](https://github.com/explosion/spacy-llm/pull/95))
- Improved prompt for NER task `spacy.NER.v2` ([99](https://github.com/explosion/spacy-llm/pull/99))
- Ability to describe labels in the `spacy.NER.v2` and `spacy.SpanCat.v1` tasks ([84](https://github.com/explosion/spacy-llm/pull/84))
- Improved error handling and retry mechanics in REST backend `spacy.REST.v1` ([110](https://github.com/explosion/spacy-llm/pull/110))
- `spacy-llm` can now be installed with all dependencies ([119](https://github.com/explosion/spacy-llm/pull/119)):
  - for MiniChain and LangChain with `spacy-llm[minichain]` or `spacy-llm[langchain]` respectively
  - for locally running models on GPU with `spacy-llm[transformers]`
- Improved type checking to ensure tasks and backend play nice with each other ([83](https://github.com/explosion/spacy-llm/pull/83))
- Use a `Task` abstraction instead of the previous two functions (template and parse) to make the config easier to read ([91](https://github.com/explosion/spacy-llm/pull/91))
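
The new tasks slot into the same `task`/`backend` config layout as the existing ones. Below is a minimal, hypothetical sketch of wiring `spacy.SpanCat.v1` into a pipeline with an OpenAI REST backend and reading the prompt template from a Jinja file via `spacy.FileReader.v1`. The labels, model name, and template path are placeholders; the nested `template` override and the default `"sc"` spans key are assumptions, and exact config keys may vary between releases.

```python
import spacy

# Assumes an OpenAI API key is available in the environment.
nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",
    config={
        "task": {
            # New overlapping-spans task; labels are illustrative placeholders.
            "@llm_tasks": "spacy.SpanCat.v1",
            "labels": "PERSON,ORGANISATION,LOCATION",
            # Optionally read the prompt template from a Jinja file
            # (hypothetical path) instead of passing it inline.
            "template": {
                "@misc": "spacy.FileReader.v1",
                "path": "spancat_template.jinja2",
            },
        },
        "backend": {
            "@llm_backends": "spacy.REST.v1",
            "api": "OpenAI",
            "config": {"model": "gpt-3.5-turbo"},
        },
    },
)

doc = nlp("Jack and Jill went up the hill in Springfield.")
# Overlapping spans land in doc.spans (default key assumed to be "sc").
print([(span.text, span.label_) for span in doc.spans["sc"]])
```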

🔴 Bug fixes
- Fix improper doc identity check in caching ([104](https://github.com/explosion/spacy-llm/pull/104))
- Fix multiprocessing support for `.pipe()` ([117](https://github.com/explosion/spacy-llm/pull/117))

📖 Documentation and examples
- Updated the readme with information about the [caching](https://github.com/explosion/spacy-llm#cache) mechanism.
- New usage examples:
  - [Performing multiple tasks in a single pipeline](https://github.com/explosion/spacy-llm/tree/v0.2.0/usage_examples/multitask_openai)
  - [Relation extraction using LLMs](https://github.com/explosion/spacy-llm/tree/v0.2.0/usage_examples/rel_openai)
  - [Using OpenAI models with LangChain](https://github.com/explosion/spacy-llm/tree/v0.2.0/usage_examples/ner_langchain_openai)
  - [Using OpenAI models with MiniChain](https://github.com/explosion/spacy-llm/tree/v0.2.0/usage_examples/ner_minichain_openai)

👥 Contributors
adrianeboyd, bdura, honnibal, ines, kabirkhan, ljvmiranda921, rmitsch, svlandeg

0.1.2

- Fix processing of LLM response for binary textcat

0.1.1

- Update setup.cfg
- Remove python-dotenv dependency

0.1.0

This package integrates Large Language Models (LLMs) into spaCy. It features a modular system for fast prototyping and prompting, and turns unstructured LLM responses into robust outputs for various NLP tasks, with no training data required.

✨ New features

- Serializable `llm` component to integrate prompts into your pipeline
- Modular functions to define the `task` (prompting and parsing) and `backend` (model to use); see the sketch after this list
- Support for hosted APIs and self-hosted open-source models
- Integration with `MiniChain` and `LangChain`
- Access to the OpenAI API, including GPT-4 and various GPT-3 models
- Built-in support for open-source Dolly models hosted on Hugging Face
- Usage examples for Named Entity Recognition and Text Classification
- Easy implementation of your own functions via spaCy's registry for custom prompting, parsing and model integrations
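
As a rough illustration of how the `task` and `backend` functions combine in a pipeline, here is a minimal sketch in the spirit of the usage examples linked below (not a verbatim copy); the labels and model name are placeholders:

```python
import spacy

# Assumes an OpenAI API key is available in the environment.
nlp = spacy.blank("en")
nlp.add_pipe(
    "llm",
    config={
        # Task: builds the prompt and parses the LLM response into Doc annotations.
        "task": {
            "@llm_tasks": "spacy.NER.v1",
            "labels": "PERSON,ORGANISATION,LOCATION",
        },
        # Backend: which model to call; here OpenAI via the REST backend.
        "backend": {
            "@llm_backends": "spacy.REST.v1",
            "api": "OpenAI",
            "config": {"model": "gpt-3.5-turbo"},
        },
    },
)

doc = nlp("Jack and Jill went up the hill.")
print([(ent.text, ent.label_) for ent in doc.ents])
```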

📖 Documentation and examples


- Succinct example use cases: [https://github.com/explosion/spacy-llm/blob/v0.1.0/README.md#-usage](https://github.com/explosion/spacy-llm/blob/v0.1.0/README.md#-usage)
- Full examples: [https://github.com/explosion/spacy-llm/tree/v0.1.0/usage_examples](https://github.com/explosion/spacy-llm/tree/v0.1.0/usage_examples)
- API documentation: [https://github.com/explosion/spacy-llm/tree/v0.1.0#-api](https://github.com/explosion/spacy-llm/tree/v0.1.0#-api)

👥 Contributors

adrianeboyd, bdura, honnibal, ines, kabirkhan, kadarakos, koaning, ljvmiranda921, rmitsch, shadeMe, svlandeg

0.1.0a4

Final version before release

0.1.0a3
