<h2>Three new models have been added in v1.5.0</h2>
<ul>
<li><b>ALBERT</b> (PyTorch) (from Google Research and the Toyota Technological Institute at Chicago) released with the paper ALBERT: A Lite BERT for Self-supervised Learning of Language Representations, by Zhenzhong Lan, Mingda Chen, Sebastian Goodman, Kevin Gimpel, Piyush Sharma, Radu Soricut.
</li>
<li><b>CamemBERT</b> (PyTorch) (from Facebook AI Research, INRIA, and Sorbonne Université), the first large-scale Transformer language model for French. Released alongside the paper CamemBERT: a Tasty French Language Model by Louis Martin, Benjamin Muller, Pedro Javier Ortiz Suárez, Yoann Dupont, Laurent Romary, Éric Villemonte de la Clergerie, Djamé Seddah, and Benoît Sagot. It was added by louismartin with the help of julien-c.
</li>
<li><b>DistilRoBERTa</b> (PyTorch) from VictorSanh, the third distilled model after DistilBERT and DistilGPT-2.
</li>
</ul>