- class mentpy.optimizers.SGDOpt(mentpy.optimizers.base.BaseOpt)
Class for the stochastic gradient descent (SGD) optimizer.
- Parameters:
  - step_size (float, default 0.1) – learning rate that scales each update.
  - momentum (float, default 0.0) – momentum coefficient; 0.0 disables momentum.
  - nesterov (bool, default False) – if True, use Nesterov (lookahead) momentum.
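For reference, the conventional SGD-with-momentum update is

$$
v_{t+1} = \mu\, v_t + \nabla f(x_t), \qquad x_{t+1} = x_t - \eta\, v_{t+1},
$$

where $\eta$ is step_size and $\mu$ is momentum. When nesterov=True, the gradient is conventionally evaluated at the lookahead point $x_t - \eta \mu\, v_t$ instead of at $x_t$. Whether SGDOpt implements exactly these variants is an assumption to verify against the mentpy source.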
Examples

Create an SGD optimizer (the example assumes mentpy has been imported as mp):

In [1]: opt = mp.optimizers.SGDOpt()

In [2]: print(opt)
<mentpy.optimizers.sgd.SGDOpt object at 0x7f4950207490>
See also
mp.optimizers.AdamOpt
Constructors
- SGDOpt(step_size=0.1, momentum=0.0, nesterov=False)
Initialize the SGD optimizer.
Methods
- optimize(f, x0, num_iters=100, callback=None, verbose=False, **kwargs)
Optimize a function f using the SGD optimizer (see the usage sketch after this list).
- optimize_and_gradient_norm(f, x0, num_iters=100, callback=None, ...)
Optimize a function f using the SGD optimizer, also recording the norm of the gradient at each iteration.
- reset(*args, **kwargs)
Reset the optimizer.
- step(f, x, i, **kwargs)
Take a step of the SGD optimizer.
- update_step_size(x, i, factor=0.99)
Update (typically decay) the step size of the optimizer using factor.
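A minimal end-to-end sketch of optimize on a toy cost function. This assumes that f is a scalar cost function of a NumPy array and that optimize returns the optimized parameters; the exact return value (e.g. whether a cost history is included) should be checked against the mentpy source.

```python
import numpy as np
import mentpy as mp

def cost(x):
    # Simple quadratic bowl with its minimum at the origin.
    return np.sum(x ** 2)

# step_size, momentum, and nesterov are the constructor arguments
# documented above; the values here are purely illustrative.
opt = mp.optimizers.SGDOpt(step_size=0.05, momentum=0.9, nesterov=True)

x0 = np.array([1.0, -2.0])
# Assumption: optimize drives x0 toward a minimizer of cost and
# returns the optimized parameters.
result = opt.optimize(cost, x0, num_iters=200)
print(result)
```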
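For finer control, step, reset, and update_step_size suggest a manual loop along the following lines. This is a hypothetical sketch continuing from the previous one (reusing cost and opt): it assumes step returns the updated parameters and that update_step_size decays the internal step size, both of which should be verified against the mentpy source.

```python
x = np.array([1.0, -2.0])
opt.reset()  # clear internal state (e.g. accumulated momentum)
for i in range(100):
    # Assumption: step performs one SGD update and returns the new x.
    x = opt.step(cost, x, i)
    # Assumption: scales the internal step size by factor each call.
    opt.update_step_size(x, i, factor=0.99)
print(x)
```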