Fast-bert

Latest version: v2.0.26

1.8.0

New learning rate finder integrated with the learner object.
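
For illustration, a minimal sketch of invoking the new learning rate finder. The lr_find call follows the pattern shown in the fast-bert README, but the exact signature, paths, file names, and column names here are assumptions for the example:

```python
import logging
import torch
from fast_bert.data_cls import BertDataBunch
from fast_bert.learner_cls import BertLearner
from fast_bert.metrics import accuracy

logger = logging.getLogger(__name__)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Placeholder data directory, label directory, and CSV files for illustration.
databunch = BertDataBunch(
    "./data/", "./labels/",
    tokenizer="bert-base-uncased",
    train_file="train.csv", val_file="val.csv", label_file="labels.csv",
    text_col="text", label_col="label",
    batch_size_per_gpu=16, max_seq_length=256,
    model_type="bert",
)

learner = BertLearner.from_pretrained_model(
    databunch,
    pretrained_path="bert-base-uncased",
    metrics=[{"name": "accuracy", "function": accuracy}],
    device=device, logger=logger, output_dir="./output/",
)

# New in 1.8.0: run the learning rate finder directly on the learner object.
learner.lr_find(start_lr=1e-5, optimizer_type="lamb")
```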

1.7.0

We have switched to AutoModel for multi-class classification. This lets you train any pretrained model architecture supported by Hugging Face transformers for text classification.
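
Under the hood this relies on the transformers Auto classes, which resolve the correct architecture from the pretrained model's config. A small sketch of that mechanism (the model names and label count are just examples):

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# The Auto classes pick the right architecture (BERT, RoBERTa, XLNet, ...)
# from each checkpoint's config, so the same code path works for any of them.
for name in ["bert-base-uncased", "roberta-base", "xlnet-base-cased"]:
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForSequenceClassification.from_pretrained(name, num_labels=4)
    print(name, "->", type(model).__name__)
```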

1.6.0

Now supports an initial version of abstractive summarisation inference, fast-bert style.

In an upcoming release, you will be able to use your own language model, fine-tuned on a custom corpus, as the encoder model.
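
For a rough idea of what summarisation inference could look like, a purely hypothetical sketch: the module path, class name, and method below (fast_bert.summarisation, BertSumAbsPredictor, predict) are placeholders invented for illustration, not the documented fast-bert API.

```python
# Hypothetical sketch only: the import path, class name, and signature below
# are assumptions for illustration, not the documented fast-bert API.
from fast_bert.summarisation import BertSumAbsPredictor  # assumed module/class

summariser = BertSumAbsPredictor(model_path="./summarisation_model/", device="cpu")
summary = summariser.predict(["Long article text to be condensed..."])
print(summary)
```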

1.5.1

Fixed several bugs related to fastai dependencies.

1.5.0

Three new models have been added in v1.5.0 (see the sketch after this list):

- ALBERT (PyTorch) (from Google Research and the Toyota Technological Institute at Chicago), released with the paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations by Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, and Radu Soricut.
- CamemBERT (PyTorch) (from Facebook AI Research, INRIA, and Sorbonne Université), the first large-scale Transformer language model pretrained for French. Released alongside the paper CamemBERT: a Tasty French Language Model by Louis Martin, Benjamin Muller, Pedro Javier Ortiz Suárez, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah, and Benoît Sagot. It was added by louismartin with the help of julien-c.
- DistilRoBERTa (PyTorch) from VictorSanh, the third distilled model after DistilBERT and DistilGPT-2.
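
For illustration, a brief sketch of selecting one of the new architectures when building a fast-bert data bunch; the model_type="albert" value, the checkpoint name, and all paths and file names here are assumptions for the example:

```python
from fast_bert.data_cls import BertDataBunch

# Placeholder paths and files; model_type="albert" is assumed to select
# the new ALBERT architecture added in v1.5.0.
databunch = BertDataBunch(
    "./data/", "./labels/",
    tokenizer="albert-base-v2",  # an ALBERT checkpoint on the Hugging Face hub
    train_file="train.csv", val_file="val.csv", label_file="labels.csv",
    text_col="text", label_col="label",
    batch_size_per_gpu=16, max_seq_length=128,
    model_type="albert",
)
```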
