We are excited to announce the release of D2L 1.0.0-alpha0! We thank all the [265 contributors](https://d2l.ai/chapter_preface/index.html#acknowledgments) who have made this open-source textbook better for everyone.
## New Topics and Revision
We have added the following new topics, covering more recent methods such as ResNeXt, RegNet, ConvNeXt, Vision Transformers, Swin Transformers, T5, GPT-1/2/3, zero-/one-/few-shot learning, Gato, Imagen, Minerva, and Parti.
* [Object-Oriented Design for Implementation](https://d2l.ai/chapter_linear-regression/oo-design.html)
* [Synthetic Regression Data](https://d2l.ai/chapter_linear-regression/synthetic-regression-data.html)
* [Generalization](https://d2l.ai/chapter_linear-regression/generalization.html)
* [The Base Classification Model](https://d2l.ai/chapter_linear-classification/classification.html)
* [Generalization in Classification](https://d2l.ai/chapter_linear-classification/generalization-classification.html)
* [Generalization in Deep Learning](https://d2l.ai/chapter_multilayer-perceptrons/generalization-deep.html)
* [ResNeXt](https://d2l.ai/chapter_convolutional-modern/resnet.html)
* [Designing Convolution Network Architectures](https://d2l.ai/chapter_convolutional-modern/cnn-design.html)
* [Transformers for Vision](https://d2l.ai/chapter_attention-mechanisms-and-transformers/vision-transformer.html)
* [Large-Scale Pretraining with Transformers](https://d2l.ai/chapter_attention-mechanisms-and-transformers/large-pretraining-transformers.html)
Besides adding new topics, we have significantly revised all the material up to transformers. For example, the previous *Linear Neural Networks* and *Multilayer Perceptrons* chapters have been reorganized into the new chapters *Linear Neural Networks for Regression*, *Linear Neural Networks for Classification*, and *Multilayer Perceptrons*.
## New API
Throughout the book we repeatedly walk through the same components: the data, the model, the loss function, and the optimization algorithm. Treating these deep learning components as objects, we can define classes for them and for their interactions. This object-oriented design for implementation greatly streamlines the presentation. Therefore, inspired by open-source libraries such as [PyTorch Lightning](https://www.pytorchlightning.ai/), we have redesigned the API around three core classes:
* `Module` contains models, losses, and optimization methods;
* `DataModule` provides data loaders for training and validation;
* Both classes are combined using the `Trainer` class, which allows us to train models on a variety of hardware platforms.
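To make this division of labor concrete, here is a minimal sketch of how a custom dataset could be wrapped for this API. It is only illustrative: the class name `RandomRegressionData`, the synthetic data, and the `get_dataloader` hook are assumptions for the sake of the example, not necessarily the exact interface of this release.

```python
import torch
from d2l import torch as d2l


class RandomRegressionData(d2l.DataModule):  # hypothetical example data module
    """Wrap randomly generated (X, y) pairs as training/validation loaders."""
    def __init__(self, num_train=1000, num_val=1000, batch_size=32):
        super().__init__()
        self.save_hyperparameters()  # stores num_train, num_val, batch_size as attributes
        n = num_train + num_val
        self.X = torch.randn(n, 2)
        self.y = self.X @ torch.tensor([2.0, -3.4]) + 4.2

    def get_dataloader(self, train):
        # Assumed hook: the training and validation loaders are obtained by
        # calling this method with train=True / train=False.
        idx = slice(0, self.num_train) if train else slice(self.num_train, None)
        dataset = torch.utils.data.TensorDataset(self.X[idx], self.y[idx])
        return torch.utils.data.DataLoader(dataset, self.batch_size, shuffle=train)
```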
To see the difference this makes, compare training with the classic API in previous releases:

```python
model = ...  # Multilayer perceptron definition
train_iter, test_iter = d2l.load_data_fashion_mnist(batch_size=256)
loss = nn.CrossEntropyLoss(reduction='none')
trainer = torch.optim.SGD(model.parameters(), lr=0.1)
num_epochs = 10
d2l.train_ch3(model, train_iter, test_iter, loss, num_epochs, trainer)
```
With the new API:

```python
model = ...  # Multilayer perceptron definition (see the sketch below)
data = d2l.FashionMNIST(batch_size=256)
trainer = d2l.Trainer(max_epochs=10)
trainer.fit(model, data)
```
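The model definition is elided above. As a rough sketch of what it might look like under the new design: the class name `MLP`, the hyperparameter values, and the method names `loss` and `configure_optimizers` are assumptions for illustration, chosen to match the responsibilities described for `Module` rather than the exact API of this release.

```python
import torch
from torch import nn
from torch.nn import functional as F
from d2l import torch as d2l


class MLP(d2l.Module):  # hypothetical model built on the new d2l.Module
    def __init__(self, num_hiddens=256, num_outputs=10, lr=0.1):
        super().__init__()
        self.save_hyperparameters()
        self.net = nn.Sequential(nn.Flatten(), nn.LazyLinear(num_hiddens),
                                 nn.ReLU(), nn.LazyLinear(num_outputs))

    def loss(self, y_hat, y):
        # The Module owns the loss function ...
        return F.cross_entropy(y_hat, y)

    def configure_optimizers(self):
        # ... and the optimization method, as described above.
        return torch.optim.SGD(self.parameters(), lr=self.lr)
```

With a model defined this way, the `Trainer` stays agnostic to the specifics of the model and the data: it only needs `fit(model, data)`.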
## Lazy Layers in PyTorch
Since v1.8.0, PyTorch offers "lazy" layers for which the input shape no longer needs to be specified. For simplicity we will use "lazy" layers whenever we can, for example:
```python
class LinearRegression(d2l.Module):
    def __init__(self, lr):
        super().__init__()
        self.save_hyperparameters()
        self.net = nn.LazyLinear(1)  # Lazy layer: only the output dimension is specified
        self.net.weight.data.normal_(0, 0.01)
        self.net.bias.data.fill_(0)
```
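A quick way to see what "lazy" means in practice: the layer's weight shape is only materialized on the first forward pass, once the input dimensionality is known. The snippet below illustrates `nn.LazyLinear` on its own, independent of the d2l classes; the tensor sizes are arbitrary.

```python
import torch
from torch import nn

net = nn.LazyLinear(1)     # only the output dimension is given
X = torch.randn(4, 20)     # a batch of 4 examples with 20 features
Y = net(X)                 # first forward pass infers in_features=20
print(Y.shape)             # torch.Size([4, 1])
print(net.weight.shape)    # torch.Size([1, 20]), materialized after the call
```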
## Ongoing Translations
Join [us](https://github.com/d2l-ai) to improve ongoing translations in:
* [Chinese](https://zh.d2l.ai/) ([Git repo](https://github.com/d2l-ai/d2l-zh))
* [Portuguese](https://pt.d2l.ai/) ([Git repo](https://github.com/d2l-ai/d2l-pt))
* [Turkish](https://tr.d2l.ai/) ([Git repo](https://github.com/d2l-ai/d2l-tr))
* [Vietnamese](https://d2l.aivivn.com/) ([Git repo](https://github.com/d2l-ai/d2l-vi))
* [Korean](https://ko.d2l.ai/) ([Git repo](https://github.com/d2l-ai/d2l-ko))
* [Japanese](https://ja.d2l.ai/) ([Git repo](https://github.com/d2l-ai/d2l-ja))