
Adding L1/L2 regularization in PyTorch?

  • Is there a way to add simple L1/L2 regularization in PyTorch? We can compute the regularized loss by simply adding the reg_loss to the data_loss, but is there an explicit way, i.e. support from the PyTorch library, to do it more easily without writing it manually?
      December 28, 2020 12:25 PM IST
    0
  • The following should help for L2 regularization; the optimizer's weight_decay argument adds an L2 penalty on the parameters:

    optimizer = torch.optim.Adam(model.parameters(), lr=1e-4, weight_decay=1e-5)
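
    Note that weight_decay applies to every parameter passed to the optimizer. If you want to exempt biases (or other 1-d parameters) from the penalty, a minimal sketch using parameter groups (the 1-d split rule is just one common convention, not something the question requires):

    decay, no_decay = [], []
    for param in model.parameters():
        # biases and norm-layer weights are 1-d; exempt them from weight decay
        (no_decay if param.ndim == 1 else decay).append(param)
    optimizer = torch.optim.Adam(
        [{"params": decay, "weight_decay": 1e-5},
         {"params": no_decay, "weight_decay": 0.0}],
        lr=1e-4,
    )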
    
      August 18, 2021 2:04 PM IST
    0
  • For L2 regularization, you can compute the penalty manually and add it to the loss:

    l2_lambda = 0.01
    l2_reg = torch.tensor(0., device=loss.device)  # start on the same device as the loss
    for param in model.parameters():
        # sum of squared parameters; torch.norm(param) alone would give the
        # norm, not its square, which is not the standard L2 penalty
        l2_reg = l2_reg + param.pow(2).sum()
    loss = loss + l2_lambda * l2_reg  # out-of-place add so autograd tracks the penalty
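
    The question also asks about L1. PyTorch's optimizers only expose weight_decay (an L2-style penalty), so an L1 term has to be summed manually; a minimal sketch along the same lines (the l1_lambda value is illustrative):

    l1_lambda = 0.001
    l1_reg = sum(param.abs().sum() for param in model.parameters())
    loss = loss + l1_lambda * l1_reg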

      October 25, 2021 2:00 PM IST
    0
  • This is covered in the PyTorch documentation. Have a look at http://pytorch.org/docs/optim.html#torch.optim.Adagrad. You can add an L2 penalty through the weight_decay parameter of the optimizer.
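
    A minimal sketch of the same idea with Adagrad (the hyperparameter values are illustrative):

    optimizer = torch.optim.Adagrad(model.parameters(), lr=0.01, weight_decay=1e-4)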
      December 29, 2020 12:09 PM IST
    0