Transformers-js-py

Latest version: v0.19.4


2.1.0

What's new?
Improved [feature extraction pipeline](https://huggingface.co/docs/transformers.js/api/pipelines#pipelinesfeatureextractionpipeline-codepipelinecode) for embeddings
You can now perform feature extraction with models other than sentence-transformers ones! All you need to do is target a repo (and/or revision) that was exported with `--task default`. Also be sure to use the correct quantization for your use case!

**Example**: Run feature extraction with bert-base-uncased (without pooling/normalization).
```js
import { pipeline } from '@xenova/transformers';

let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.');
console.log(result);
// Tensor {
//   type: 'float32',
//   data: Float32Array [0.05939924716949463, 0.021655935794115067, ...],
//   dims: [1, 8, 768]
// }
```


**Example**: Run feature extraction with bert-base-uncased (with pooling/normalization).
```js
let extractor = await pipeline('feature-extraction', 'Xenova/bert-base-uncased', { revision: 'default' });
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
//   type: 'float32',
//   data: Float32Array [0.03373778983950615, -0.010106077417731285, ...],
//   dims: [1, 768]
// }
```


**Example**: Calculating embeddings with sentence-transformers models.
```js
let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');
let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result);
// Tensor {
//   type: 'float32',
//   data: Float32Array [0.09094982594251633, -0.014774246141314507, ...],
//   dims: [1, 384]
// }
```
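As noted above, picking the right quantization matters. Here is a minimal sketch of loading the full-precision weights instead of the default quantized export, assuming the `quantized` loading option from this release line:

```js
import { pipeline } from '@xenova/transformers';

// `quantized: false` loads the unquantized ONNX weights; the default
// (`true`) uses the smaller quantized export. (Assumed option; check
// the version you have installed.)
let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2', {
  quantized: false,
});

let result = await extractor('This is a simple test.', { pooling: 'mean', normalize: true });
console.log(result.dims); // [1, 384]
```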


This also means you can do things like semantic search directly in JavaScript/TypeScript! Check out [the Pinecone docs](https://docs.pinecone.io/docs/semantic-search-text-typescript) for an example app which uses Transformers.js!
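For instance, here is a minimal sketch of comparing two texts by embedding similarity (the `cosineSimilarity` helper is hypothetical, written for illustration, not a library export):

```js
import { pipeline } from '@xenova/transformers';

let extractor = await pipeline('feature-extraction', 'Xenova/all-MiniLM-L6-v2');

// Embed a query and a candidate document as mean-pooled, L2-normalized vectors.
let [query, doc] = await Promise.all([
  extractor('How do I run models in the browser?', { pooling: 'mean', normalize: true }),
  extractor('Transformers.js lets you run ML models directly in the browser.', { pooling: 'mean', normalize: true }),
]);

// Hypothetical helper: since both vectors are already normalized,
// cosine similarity reduces to a plain dot product.
function cosineSimilarity(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; ++i) sum += a[i] * b[i];
  return sum;
}

console.log(cosineSimilarity(query.data, doc.data)); // closer to 1 = more similar
```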

Over 100 Transformers.js models on the Hub!
We now have 109 models to choose from! Check them out at https://huggingface.co/models?other=transformers.js. If you'd like to contribute models (exported with Optimum), you can tag them with `library_name: "transformers.js"`. Let's make ML more web-friendly!

Misc
- Fixed various quantization/exporting issues


**Full Changelog**: https://github.com/xenova/transformers.js/compare/2.0.2...2.1.0

2.0.2

Fixes issues stemming from ONNX Runtime's recent release of a buggy version 1.15.0 🙄 (https://www.npmjs.com/package/onnxruntime-web).

Also freezes examples and updates links to use the latest stable wasm files.

2.0.1

Minor bug fixes
- BERT tokenization for strings containing numbers (see the sketch below)
- Demo site for token-classification (#116)
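To illustrate the tokenization fix, a minimal sketch of encoding a string that mixes words and digits (the resulting token ids are not shown in the release notes; inspect the output yourself):

```js
import { AutoTokenizer } from '@xenova/transformers';

let tokenizer = await AutoTokenizer.from_pretrained('Xenova/bert-base-uncased');

// Strings containing numbers previously tokenized incorrectly;
// they now split into WordPiece sub-tokens as expected.
let { input_ids } = await tokenizer('The price is 1234 dollars.');
console.log(input_ids);
```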

NPM package
- Update keywords

2.0.0

Major changes

- Complete ES6 rewrite
- Documentation live at https://huggingface.co/docs/transformers.js
- Overhauled testing framework (now using [Jest](https://jestjs.io/))
- Improved hub integration (you can now simply pass model ids to `pipeline`, `AutoModel`, `AutoTokenizer`, and `AutoProcessor`; see the sketch after this list)
- Node.js model caching (https://github.com/xenova/transformers.js/issues/62)
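A minimal sketch of the new hub integration, loading a tokenizer and model directly by Hub model id (`Xenova/bert-base-uncased` is used here as an example; any compatible repo works):

```js
import { AutoTokenizer, AutoModel } from '@xenova/transformers';

// Pass a Hugging Face Hub model id directly; the required files are
// fetched on first use and, per the caching change above, cached
// locally when running under Node.js.
let tokenizer = await AutoTokenizer.from_pretrained('Xenova/bert-base-uncased');
let model = await AutoModel.from_pretrained('Xenova/bert-base-uncased');

let inputs = await tokenizer('Hello world!');
let output = await model(inputs);
console.log(output);
```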

Minor changes

- Added tutorials and example projects (https://github.com/xenova/transformers.js/tree/main/examples)
- CI/CD with GitHub actions
- Various bug fixes and improvements

2.0.0-alpha.4

2.0.0-alpha.3
