What Is PyTorch Lightning With Examples?
Introduction to PyTorch Lightning
What is PyTorch Lightning?

PyTorch Lightning is an AI research tool preferred for its high performance: it abstracts away the deep learning boilerplate so that we keep control over the research code we write in Python. Lightning helps scale models, and the code can be extended to match our requirements without the boilerplate scaling along with it. The LightningModule gives research code a consistent structure through its standard hooks and other components.
A LightningModule organizes code into five sections: computations, the train loop, the validation loop, the test loop, and the optimizers. First, __init__ defines the computations, and forward describes how data flows through them from one end of the model to the other. training_step implements the training loop, validation_step the validation loop, and test_step the test loop. Finally, configure_optimizers declares the module's optimizers and learning rate schedulers.
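As a minimal skeleton (method bodies elided), those five sections map onto a LightningModule like this:

import pytorch_lightning as pl

class LitModel(pl.LightningModule):
    def __init__(self):                               # computations (layers) defined here
        super().__init__()

    def forward(self, x):                             # how data flows through the model
        ...

    def training_step(self, batch, batch_idx):        # train loop
        ...

    def validation_step(self, batch, batch_idx):      # validation loop
        ...

    def test_step(self, batch, batch_idx):            # test loop
        ...

    def configure_optimizers(self):                   # optimizers and schedulers
        ...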
Typical Project

Lightning Transformers is an interface for training SOTA transformer models on top of PyTorch Lightning. We can use Lightning callbacks, accelerators, and loggers for finer control and better training performance. Speed and memory optimizations such as DeepSpeed ZeRO and FairScale Sharded Training can be enabled to reduce memory use and improve throughput. With Hydra config composition, we can swap models and add configurations for optimizers and schedulers. There is also the option of building a task from scratch with the transformer task abstraction, which helps in the research and experimentation of the code.
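For illustration, here is a minimal sketch of how callbacks, a logger, and a DeepSpeed strategy plug into the PyTorch Lightning Trainer; the specific callback, logger, and strategy choices here are assumptions for the sketch (a DeepSpeed install is assumed), not part of the original article:

import pytorch_lightning as pl
from pytorch_lightning.callbacks import EarlyStopping
from pytorch_lightning.loggers import TensorBoardLogger

trainer = pl.Trainer(
    accelerator="auto",                              # pick GPU/CPU automatically
    strategy="deepspeed_stage_2",                    # DeepSpeed ZeRO memory optimization
    callbacks=[EarlyStopping(monitor="val_loss")],   # stop when val_loss stops improving
    logger=TensorBoardLogger("logs/"),               # log metrics for TensorBoard
)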
Here we take a project based on Lightning Transformers as the working example. The first step is to install the module:
pip install lightning-transformers

Or, to install from a source checkout:

cd lightning-transformers
pip install .
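A quick sanity check that the package is importable (a trivial sketch, not from the original article):

python -c "import lightning_transformers"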
PyTorch Lightning – Model

We can design multi-layered neural networks using PyTorch Lightning.
import torch
from torch import nn
from torch.nn import functional as Fun
from pytorch_lightning import LightningModule

class LitMNIST(LightningModule):
    def __init__(self):
        super().__init__()
        # MNIST images are 28 x 28 = 784 pixels once flattened
        self.layer_1 = nn.Linear(28 * 28, 144)
        self.layer_2 = nn.Linear(144, 288)
        self.layer_3 = nn.Linear(288, 10)

    def forward(self, x):
        batch_size, channels, height, width = x.size()
        x = x.view(batch_size, -1)        # flatten the image
        x = self.layer_1(x)
        x = Fun.relu(x)
        x = self.layer_2(x)
        x = Fun.relu(x)
        x = self.layer_3(x)
        x = Fun.log_softmax(x, dim=1)     # log-probabilities over the 10 digits
        return x

A LightningModule can be used like a plain PyTorch module, with the necessary changes made on top. Here we are using the MNIST dataset.
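As a quick sanity check (a sketch, not part of the original article), the untrained model can be run on a random batch shaped like MNIST:

model = LitMNIST()
dummy = torch.randn(4, 1, 28, 28)   # batch of 4 fake MNIST images
out = model(dummy)                  # calling the module invokes forward
print(out.shape)                    # torch.Size([4, 10])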
The training loop is defined by overriding training_step:

class LitMNIST(LightningModule):
    def training_step(self, batch, batch_idx):
        x, y = batch
        logits = self(x)
        loss = Fun.nll_loss(logits, y)
        return loss
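The validation and test loops follow the same pattern. A minimal sketch of a matching validation_step (an addition for illustration, not part of the original code):

class LitMNIST(LightningModule):
    def validation_step(self, batch, batch_idx):
        x, y = batch
        logits = self(x)
        val_loss = Fun.nll_loss(logits, y)
        self.log("val_loss", val_loss)   # exposes the metric to loggers and callbacks
        return val_loss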
PyTorch Lightning – Data

Lightning modules need a DataLoader to operate on. The following code loads the MNIST dataset:

from torch.utils.data import DataLoader, random_split
from torchvision.datasets import MNIST
import os
from torchvision import transforms

transform = transforms.Compose([transforms.ToTensor(),
                                transforms.Normalize((0.1307,), (0.3081,))])
mnist_train = MNIST(os.getcwd(), train=True, download=True, transform=transform)
mnist_train = DataLoader(mnist_train, batch_size=64)

The random_split helper imported here can carve a validation set out of the training data, as sketched below.
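A brief sketch of that split (the 55,000/5,000 division is an example, not from the article):

mnist_full = MNIST(os.getcwd(), train=True, download=True, transform=transform)
mnist_train, mnist_val = random_split(mnist_full, [55000, 5000])
train_loader = DataLoader(mnist_train, batch_size=64)
val_loader = DataLoader(mnist_val, batch_size=64)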
DataLoaders can be used in different ways in the Lightning module. For example, a DataLoader can be passed straight to the Trainer's fit function:

from pytorch_lightning import Trainer

model = LitMNIST()
trainer = Trainer()
trainer.fit(model, mnist_train)

Alternatively, the dataloaders can be returned from hooks on the module itself:

import pytorch_lightning as pl

class LitMNIST(pl.LightningModule):
    def train_dataloader(self):
        transform = transforms.Compose([transforms.ToTensor(),
                                        transforms.Normalize((0.1307,), (0.3081,))])
        mnist_train = MNIST(os.getcwd(), train=True, download=True, transform=transform)
        return DataLoader(mnist_train, batch_size=64)

    def val_dataloader(self):
        transforms = ...
        mnist_val = ...
        return DataLoader(mnist_val, batch_size=64)

    def test_dataloader(self):
        transforms = ...
        mnist_test = ...
        return DataLoader(mnist_test, batch_size=64)

With the Datasets living inside the DataLoaders returned by these hooks, the Trainer can run without being handed any additional data.
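When the module defines its own dataloader hooks, fit no longer needs the loaders passed in explicitly (a short sketch; one epoch chosen arbitrarily):

model = LitMNIST()
trainer = Trainer(max_epochs=1)
trainer.fit(model)   # train_dataloader is discovered automatically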
For data that needs downloading and preprocessing, the loading logic can be grouped into a LightningDataModule:

from typing import Optional
from pytorch_lightning import LightningDataModule

class MyDataModule(LightningDataModule):
    def __init__(self):
        super().__init__()
        self.train_dims = None
        self.vocab_size = 0

    def prepare_data(self):
        # called once to download and preprocess the data
        download_dataset()
        tokenize()
        build_vocab()

    def setup(self, stage: Optional[str] = None):
        # called on every process to assign state
        vocab = load_vocab()
        self.vocab_size = len(vocab)
        self.train, self.val, self.test = load_datasets()
        self.train_dims = self.train.next_batch.size()

    def train_dataloader(self):
        transforms = ...
        return DataLoader(self.train, batch_size=64)

    def val_dataloader(self):
        transforms = ...
        return DataLoader(self.val, batch_size=64)

    def test_dataloader(self):
        transforms = ...
        return DataLoader(self.test, batch_size=64)

Dataset definitions, such as the number of classes, can then be fetched directly from the data module.
mnist_dm = MNISTDatamodule()
model = LitModel(num_classes=mnist_dm.num_classes)
trainer.fit(model, mnist_dm)

imagenet_dm = ImagenetDatamodule()
model = LitModel(num_classes=imagenet_dm.num_classes)
trainer.fit(model, imagenet_dm)
PyTorch Lightning examples

Initially, we must install PyTorch and define the model so that PyTorch is aware of the dataset in the code. We then add the training details, the scheduler, and the optimizer to the model. Finally, we can load the data using the following code:

import pytorch_lightning as pl
import torch
from torch import nn
from torch.optim import SGD
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

class Data(pl.LightningDataModule):
    def prepare_data(self):
        transform = transforms.Compose([transforms.ToTensor()])
        self.train_data = datasets.MNIST('', train=True, download=True, transform=transform)
        self.test_data = datasets.MNIST('', train=False, download=True, transform=transform)

    def train_dataloader(self):
        return DataLoader(self.train_data, batch_size=32, shuffle=True)

    def val_dataloader(self):
        # validation data should not be shuffled
        return DataLoader(self.test_data, batch_size=32, shuffle=False)

class model(pl.LightningModule):
    def __init__(self):
        super(model, self).__init__()
        self.fc1 = nn.Linear(28 * 28, 256)
        self.fc2 = nn.Linear(256, 128)
        self.fc3 = nn.Linear(128, 10)
        self.lr = 0.01
        self.loss = nn.CrossEntropyLoss()

    def forward(self, x):
        # flatten the image and pass it through the three fully connected layers
        x = x.view(x.size(0), -1)
        x = torch.relu(self.fc1(x))
        x = torch.relu(self.fc2(x))
        return self.fc3(x)

    def configure_optimizers(self):
        return SGD(self.parameters(), lr=self.lr)

    def training_step(self, train_batch, batch_idx):
        x, y = train_batch
        logits = self.forward(x)
        loss = self.loss(logits, y)
        return loss
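To run this example end to end, the two classes are wired into a Trainer (a short sketch; one epoch chosen arbitrarily):

data = Data()
net = model()
trainer = pl.Trainer(max_epochs=1)
trainer.fit(net, datamodule=data)   # Lightning calls prepare_data and the loaders for us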
Conclusion

The Lightning module is easy to use because readability is high: the engineering code is stripped away, leaving only the familiar Python modules. It is also easy to track code changes, so results are straightforward to reproduce in PyTorch Lightning. Finally, Lightning runs code on GPUs, CPUs, and clusters without any additional management.