DeLFT

Latest version: v0.3.4


0.3.4

* support multiple GPU training/inference (`--multi-gpu` parameter)
* support the safetensors model weight format
* support private HuggingFace models
* application scripts: add `max-epoch` and learning rate parameters
* add a GROBID model for funding and acknowledgement information
* print more parameter information during training
* some dependency updates
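Multi-GPU training/inference is switched on with the new flag. A hypothetical invocation is sketched below; the application script, model name, and action are assumptions for illustration, not verified against the repository:

```shell
# Train a sequence-labeling model across all visible GPUs (flag added in 0.3.4).
# Script path and positional arguments are assumptions; check the DeLFT docs.
python3 delft/applications/grobidTagger.py affiliation-address train --multi-gpu
```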

0.3.3

* support for incremental training
* fix SciBERT tokenizer initialization from a HuggingFace model
* updated the HuggingFace transformers library to 4.25.1 and TensorFlow to 2.9.3
* reviewed support for BPE tokenizers with pre-tokenized input under the updated transformers library, for most transformer models that use them (tested with RoBERTa/GPT-2, CamemBERT, bart-base, albert-base-v2, and XLM)
* addition of some model variants for sequence labeling (BERT_FEATURES, BERT_ChainCRF_FEATURES)
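The BPE review above touches a classic sequence-labeling pitfall: subword tokenizers split pre-tokenized words into pieces, so word-level labels must be realigned onto subword units. A self-contained toy sketch of that alignment (not DeLFT's code; the 3-character splitter merely stands in for a real BPE tokenizer):

```python
def toy_bpe_tokenize(word):
    """Split a word into 3-character pieces (a stand-in for real BPE)."""
    return [word[i:i + 3] for i in range(0, len(word), 3)]

def align_labels(words, labels):
    """Expand word-level labels onto subword tokens.

    The first subword keeps the original label; continuation subwords get
    "X" so the loss can ignore them -- one common strategy for sequence
    labeling with subword tokenizers.
    """
    tokens, aligned = [], []
    for word, label in zip(words, labels):
        pieces = toy_bpe_tokenize(word)
        tokens.extend(pieces)
        aligned.extend([label] + ["X"] * (len(pieces) - 1))
    return tokens, aligned

tokens, aligned = align_labels(["University", "of", "Lorraine"],
                               ["B-inst", "I-inst", "I-inst"])
```

With a real HuggingFace tokenizer the same mapping is usually driven by `word_ids()` after encoding with `is_split_into_words=True`.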

0.3.2

* Print model parameters at creation and load time
* Dataset recognition
* Model updates
* Set feature channel embeddings trainable

**Full Changelog**: https://github.com/kermitt2/delft/compare/v0.3.1...v0.3.2

0.3.1

* fix a problem with the tensorflow-addons CRF layer when batch size is 1

0.3.0

* Migration of DeLFT to TensorFlow 2.7
* Support for the HuggingFace transformers library (Auto* classes)
* New architectures and updated models
* General usage of optimizers with learning rate decay
* Updated documentation, now hosted on Read the Docs
* Improved ELMo embeddings
* Transformers wrapper limits Hugging Face Hub usage to when it is necessary; models with a transformer layer are fully portable without hub access
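The learning-rate decay mentioned above can be illustrated with a minimal schedule. Linear warmup followed by linear decay is one common recipe for transformer fine-tuning; this is a generic sketch, not necessarily the exact schedule DeLFT uses:

```python
def decayed_lr(initial_lr, step, total_steps, warmup_steps=0):
    """Linear warmup to `initial_lr`, then linear decay to zero."""
    if step < warmup_steps:
        # ramp up from 0 to initial_lr over the warmup phase
        return initial_lr * step / max(1, warmup_steps)
    # decay linearly from initial_lr (at end of warmup) to 0 (at total_steps)
    remaining = max(0, total_steps - step)
    return initial_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, with `initial_lr=2e-5`, 100 total steps, and 10 warmup steps, the rate peaks at step 10 and reaches zero at step 100.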

0.2.6

* add automatic download of embeddings if not locally available
* enable embedding preload script for docker image
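The automatic-download behaviour above amounts to a cache check before use. A minimal sketch, in which the function name, `.vec` extension, and the caller-supplied `download_fn` callback are all illustrative rather than DeLFT's actual API:

```python
from pathlib import Path

def ensure_embeddings(name, cache_dir, download_fn):
    """Return the local path for embeddings `name`, downloading only on a cache miss.

    `download_fn(name, destination)` stands in for the real HTTP download;
    the cache layout here is an assumption for illustration.
    """
    destination = Path(cache_dir) / f"{name}.vec"
    if not destination.exists():
        destination.parent.mkdir(parents=True, exist_ok=True)
        download_fn(name, destination)
    return destination
```

A second call with the same name finds the cached file and skips the download entirely.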
