What's Changed
* Sparse pooling by Flegyas. It is now the default subword pooling strategy; it is deterministic, but it does not support ONNX Runtime export.
* The `return_words` parameter is now `subword_pooling_strategy`; possible values are `sparse`, `scatter`, and `none` (see the usage sketch after this list).
* `Tokenizer` now accepts a `return_sparse_offsets` parameter during initialization. If you use the `scatter` subword pooling strategy, you can set it to `False` to reduce memory usage.
* Update transformers requirement from <4.18,>=4.3 to >=4.3,<4.19 by dependabot in https://github.com/Riccorl/transformers-embedder/pull/44
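
A minimal sketch of the renamed parameters, assuming the `transformers_embedder` package still exposes the `TransformersEmbedder` and `Tokenizer` classes as in previous releases:

```python
import transformers_embedder as tre

# `subword_pooling_strategy` replaces the old `return_words` flag.
# Possible values: "sparse" (default), "scatter", "none".
model = tre.TransformersEmbedder(
    "bert-base-cased", subword_pooling_strategy="sparse"
)
tokenizer = tre.Tokenizer("bert-base-cased")

inputs = tokenizer(["A sentence to embed."], return_tensors=True)
outputs = model(**inputs)

# With the "scatter" strategy, sparse offsets are not needed and can be
# skipped at tokenizer initialization to reduce memory usage:
# tokenizer = tre.Tokenizer("bert-base-cased", return_sparse_offsets=False)
```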
New Contributors
* Flegyas made their first contribution in https://github.com/Riccorl/transformers-embedder/pull/43
**Full Changelog**: https://github.com/Riccorl/transformers-embedder/compare/2.0.2...3.0.0