LinkTransformer

Latest version: v0.1.14


0.1.13

- Major update: added support for online contrastive loss when training on paired data with labels
- Fixed the behaviour of the custom-suffix feature in the merge functions
- Fixed incorrect preprocessing code for paired data with labels; we recommend retraining such models for a substantial increase in performance
- Added a loss_type argument to train_model: "supcon" or "onlinecontrastive"
- Changed the linkage configs to pass loss params through the training args: loss_params is a dictionary containing "temperature" for supcon loss and "margin" for onlinecontrastive loss
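The two losses weight pairs differently: supcon scales similarities by a temperature, while online contrastive penalizes only "hard" pairs relative to a margin. A minimal NumPy sketch of the standard online contrastive formulation (an illustration of the idea behind the `margin` entry in `loss_params`, not LinkTransformer's internal code):

```python
import numpy as np

def online_contrastive_loss(dists, labels, margin=0.5):
    """Contrastive loss computed only on 'hard' pairs.

    dists: pairwise embedding distances; labels: 1 = match, 0 = non-match.
    """
    pos = dists[labels == 1]
    neg = dists[labels == 0]
    # hard positives: farther apart than the closest negative pair
    hard_pos = pos[pos > neg.min()]
    # hard negatives: closer together than the farthest positive pair
    hard_neg = neg[neg < pos.max()]
    # pull hard positives together; push hard negatives beyond the margin
    return (hard_pos ** 2).sum() + (np.clip(margin - hard_neg, 0, None) ** 2).sum()

loss = online_contrastive_loss(
    np.array([0.1, 0.3, 0.6, 0.2]),  # distances for four candidate pairs
    np.array([1, 0, 1, 0]),          # match labels for those pairs
)
```

Here only the distant positive (0.6) and the close negatives (0.3, 0.2) contribute; the easy positive at 0.1 is ignored.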

0.1.12

- Fixed merge and classification inference to match OpenAI API changes
- Fixed bugs in column serialisation when using OpenAI embeddings
- Updated the toml file to restrict package versions; forward compatibility is not supported
- Made merge with blocking faster; the model is no longer loaded for each block
- All linkage inference functions can now take a LinkTransformer model (a wrapper around SentenceTransformer) as input, which is useful for workflows that loop over calls and would otherwise reload the model repeatedly
- More robust typing where it was lacking
- Fixed the cluster functions and moved them to the root directory
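The model-reuse changes above boil down to a load-once, pass-the-object pattern: construct the model a single time and hand the object to every call in the loop, rather than passing a name that triggers a fresh load each time. A library-agnostic sketch (the loader and merge function are hypothetical stand-ins, not LinkTransformer's API):

```python
from functools import lru_cache

LOAD_LOG = []  # records every real (expensive) model load

@lru_cache(maxsize=None)
def load_model(name):
    # stand-in for loading a SentenceTransformer-style model from disk
    LOAD_LOG.append(name)
    return {"name": name}

def merge_block(block, model):
    # each block reuses the already-loaded model object
    return [(row, model["name"]) for row in block]

model = load_model("my-model")                      # loaded exactly once
blocks = [["a", "b"], ["c"], ["d"]]
results = [merge_block(b, model) for b in blocks]   # no reloads inside the loop
```

The same shape applies to the linkage functions: build the wrapper object up front, then pass it as the model argument on every iteration.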

0.1.11

- Added a progress bar for training and inference
- Fixed a bug in tokenizer saving

0.1.10

- Added a progress bar for classification inference

0.1.9

- Minor bug fixes

0.1.8

- More fixes and performance improvements
- Added classification features: inference with Hugging Face transformers or OpenAI chat models, and training custom transformer models, in one line
- More demo datasets
