@bsantraigi
Created November 3, 2019 12:37
Snippets for Gradients in PyTorch | Clip Gradient Norm
# Check Gradient: compute the total L2 norm of all parameter gradients
total_norm = 0.0
for p in model.parameters():
    if p.grad is None:  # skip parameters that received no gradient
        continue
    param_norm = p.grad.data.norm(2)
    total_norm += param_norm.item() ** 2
total_norm = total_norm ** (1. / 2)
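A compact alternative for the same measurement, assuming the `model` from the snippet above: stacking the per-parameter gradient norms and taking their L2 norm gives the same number, since the global norm is the square root of the sum of squared per-parameter norms.

import torch

# Compact equivalent: L2 norm of the per-parameter gradient norms
grad_norms = [p.grad.detach().norm(2) for p in model.parameters() if p.grad is not None]
total_norm = torch.norm(torch.stack(grad_norms), 2).item()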
# Clip Gradient Norm: one training step with gradient clipping
optimizer.zero_grad()
loss, hidden = model(data, hidden, targets)  # this model returns the loss directly
loss.backward()
# Rescale all gradients in-place so their total L2 norm is at most args.clip
torch.nn.utils.clip_grad_norm_(model.parameters(), args.clip)
optimizer.step()
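For context, here is a minimal self-contained sketch of the same pattern. The tiny linear model, the random data, and the clip value of 0.5 are assumptions for illustration, not part of the original gist; note that clip_grad_norm_ also returns the total norm measured before clipping.

import torch
import torch.nn as nn

# Assumed toy setup: a tiny linear model on random data
model = nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
criterion = nn.MSELoss()

data = torch.randn(32, 10)    # random inputs (assumption)
targets = torch.randn(32, 1)  # random targets (assumption)

optimizer.zero_grad()
loss = criterion(model(data), targets)
loss.backward()
# Returns the pre-clipping total norm; clips in-place if it exceeds 0.5
total_norm = torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=0.5)
optimizer.step()
print(f"pre-clip grad norm: {float(total_norm):.4f}, loss: {loss.item():.4f}")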