learningrate¶
Query and control the learning rate of a PyTorch optimizer.
torchutils.learningrate.get_lr(optimizer)¶
Get learning rate.
- Parameters
optimizer (optim.Optimizer) – PyTorch optimizer.
- Returns
Learning rate of the optimizer.
- Return type
float
Example:
import torchvision
import torchutils as tu
import torch.optim as optim

model = torchvision.models.alexnet()
optimizer = optim.Adam(model.parameters())
current_lr = tu.get_lr(optimizer)
print('Current learning rate:', current_lr)
Out:
Current learning rate: 0.001
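In plain PyTorch, an optimizer stores its learning rate in the `lr` entry of each of its parameter groups, so a helper like `get_lr` can be sketched as a one-line lookup. The `get_lr` below is a hypothetical re-implementation for illustration, not torchutils' actual code:

```python
import torch
import torch.optim as optim

def get_lr(optimizer):
    # Hedged sketch: read the 'lr' field of the first parameter group.
    # Assumes all groups share one learning rate.
    return optimizer.param_groups[0]['lr']

model = torch.nn.Linear(4, 2)
optimizer = optim.Adam(model.parameters())  # Adam's default lr is 0.001
print('Current learning rate:', get_lr(optimizer))
```

If the optimizer was built with multiple parameter groups at different rates, a single float cannot describe it; this sketch simply reports the first group's value.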
torchutils.learningrate.set_lr(optimizer, lr)¶
Set learning rate.
- Parameters
optimizer (optim.Optimizer) – PyTorch optimizer.
lr (float) – New learning rate value.
- Returns
PyTorch optimizer.
- Return type
optim.Optimizer
Example:
import torchvision
import torchutils as tu
import torch.optim as optim

model = torchvision.models.alexnet()
optimizer = optim.Adam(model.parameters())
current_lr = tu.get_lr(optimizer)
print('Current learning rate:', current_lr)
optimizer = tu.set_lr(optimizer, current_lr * 0.1)
revised_lr = tu.get_lr(optimizer)
print('Revised learning rate:', revised_lr)
Out:
Current learning rate: 0.001
Revised learning rate: 0.0001
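Setting the rate works the same way in reverse: write the new value into every parameter group's `lr` entry. The `set_lr` below is a minimal sketch of that idea (an assumed re-implementation, not torchutils' source); returning the optimizer allows the `optimizer = set_lr(...)` call style shown above:

```python
import torch
import torch.optim as optim

def set_lr(optimizer, lr):
    # Hedged sketch: write the new rate into every parameter group,
    # then return the optimizer so calls can be chained.
    for group in optimizer.param_groups:
        group['lr'] = lr
    return optimizer

model = torch.nn.Linear(4, 2)
optimizer = optim.Adam(model.parameters())  # default lr 0.001
optimizer = set_lr(optimizer, 0.0001)
print('Revised learning rate:', optimizer.param_groups[0]['lr'])
```

For schedule-driven decay (step, exponential, cosine), PyTorch's built-in `torch.optim.lr_scheduler` classes mutate `param_groups` in the same way; a manual setter like this is mainly useful for one-off adjustments.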