* Support SentenceTransformers models inference by aleksandr-mokrov in https://github.com/huggingface/optimum-intel/pull/865
```python
from optimum.intel import OVSentenceTransformer

model_id = "sentence-transformers/all-mpnet-base-v2"
model = OVSentenceTransformer.from_pretrained(model_id, export=True)

sentences = ["This is an example sentence", "Each sentence is converted"]
embeddings = model.encode(sentences)
```
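The returned embeddings can be used like those from the original `sentence_transformers` API. A minimal sketch (assuming `encode` returns a NumPy array, as it does for `SentenceTransformer`) computing the cosine similarity between the two example sentences:

```python
import numpy as np

# Cosine similarity between the two example embeddings from the snippet above
a, b = embeddings[0], embeddings[1]
similarity = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
print(f"Cosine similarity: {similarity:.4f}")
```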
* Infer if the model needs to be exported or not by echarlaix in https://github.com/huggingface/optimum-intel/pull/825
```diff
  from optimum.intel import OVModelForCausalLM

- model = OVModelForCausalLM.from_pretrained("gpt2", export=True)
+ model = OVModelForCausalLM.from_pretrained("gpt2")
```
Compatible with `transformers>=4.36,<=4.44`
**Full Changelog**: https://github.com/huggingface/optimum-intel/compare/v1.18.0...v1.19.0