How to perform backpropagation with PyTorch?

Chris Staff asked 6 months ago
1 Answer
Best Answer
Chris Staff answered 6 months ago

In PyTorch, you perform backpropagation by calling loss.backward() in the training loop that you define when training the model. It is run for each minibatch in each epoch and is part of the standard flow: forward pass -> loss computation -> backward pass -> optimizer step.
Example of this flow in a PyTorch model:

  # Run the training loop
  for epoch in range(0, 5): # run for 5 epochs
    
    # Print epoch
    print(f'Starting epoch {epoch+1}')
    
    # Set current loss value
    current_loss = 0.0
    
    # Iterate over the DataLoader for training data
    for i, data in enumerate(trainloader, 0):
      
      # Get inputs
      inputs, targets = data
      
      # Zero the gradients
      optimizer.zero_grad()
      
      # Perform forward pass
      outputs = mlp(inputs)
      
      # Compute loss
      loss = loss_function(outputs, targets)
      
      # Perform backward pass
      loss.backward()
      
      # Perform optimization
      optimizer.step()
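The snippet above assumes that mlp, loss_function, optimizer, and trainloader are already defined elsewhere. For completeness, here is a minimal self-contained sketch of the same flow; the model, dataset, and hyperparameters (a small MLP, a toy y = 2x regression task, SGD with lr=0.1) are illustrative choices, not part of the original answer:

```python
import torch
from torch import nn

# Hypothetical stand-ins for the objects referenced in the snippet above
torch.manual_seed(0)
mlp = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
loss_function = nn.MSELoss()
optimizer = torch.optim.SGD(mlp.parameters(), lr=0.1)

# Toy dataset: learn y = 2x
X = torch.linspace(-1, 1, 64).unsqueeze(1)
y = 2 * X
trainloader = torch.utils.data.DataLoader(
    torch.utils.data.TensorDataset(X, y), batch_size=16, shuffle=True
)

losses = []
for epoch in range(5):
    current_loss = 0.0
    for inputs, targets in trainloader:
        optimizer.zero_grad()                   # zero the gradients
        outputs = mlp(inputs)                   # forward pass
        loss = loss_function(outputs, targets)  # compute loss
        loss.backward()                         # backward pass (backpropagation)
        optimizer.step()                        # update the weights
        current_loss += loss.item()
    losses.append(current_loss)
    print(f'Epoch {epoch+1}, loss: {current_loss:.4f}')
```

Note that optimizer.zero_grad() is needed because PyTorch accumulates gradients by default; without it, each loss.backward() call would add to the gradients from previous minibatches.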
