class mentpy.optimizers.SGDOpt(mentpy.optimizers.base.BaseOpt)

Stochastic gradient descent (SGD) optimizer, with optional momentum and Nesterov acceleration.

Parameters

step_size : float, optional
    The step size of the optimizer, by default 0.1.

momentum : float, optional
    The momentum of the optimizer, by default 0.0.

nesterov : bool, optional
    Whether to use Nesterov momentum, by default False.

Examples

Create an SGD optimizer

In [1]: import mentpy as mp

In [2]: opt = mp.optimizers.SGDOpt()

In [3]: print(opt)
<mentpy.optimizers.sgd.SGDOpt object at 0x7f4950207490>
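
An optimizer with momentum and Nesterov acceleration is constructed by passing the corresponding parameters (the values here are illustrative):

In [4]: opt = mp.optimizers.SGDOpt(step_size=0.05, momentum=0.9, nesterov=True)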

See also

mp.optimizers.AdamOpt

Constructors

SGDOpt(step_size=0.1, momentum=0.0, nesterov=False)

Initialize the SGD optimizer.

Methods

optimize(f, x0, num_iters=100, callback=None, verbose=False, **kwargs)

Optimize a function f using the SGD optimizer.
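
As an illustration, a minimal sketch of minimizing a simple quadratic cost with optimize. This page does not document how optimize obtains gradients of f or what it returns; the sketch assumes gradients are handled internally and that the optimized parameters are returned:

import numpy as np
import mentpy as mp

# Quadratic cost with its minimum at x = (1, -2).
def cost(x):
    return np.sum((x - np.array([1.0, -2.0])) ** 2)

opt = mp.optimizers.SGDOpt(step_size=0.1)
x0 = np.zeros(2)

# Assumption: optimize returns the optimized parameters.
x_opt = opt.optimize(cost, x0, num_iters=200)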

optimize_and_gradient_norm(f, x0, num_iters=100, callback=None, ...)

Optimize a function f using the SGD optimizer, also recording the gradient norm at each iteration.

reset(*args, **kwargs)

Reset the optimizer.

step(f, x, i, **kwargs)

Take a step of the SGD optimizer.
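
For finer control, the update loop can be written by hand with step. Continuing the quadratic-cost sketch above, and assuming (not confirmed on this page) that step performs one update and returns the new parameters:

x = np.zeros(2)
for i in range(100):
    # Assumption: step performs one SGD update and returns the new parameters.
    x = opt.step(cost, x, i)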

update_step_size(x, i, factor=0.99)

Update the step size of the optimizer.
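
The factor argument suggests a multiplicative decay of the step size. A hedged sketch pairing step with update_step_size, continuing the example above (the exact semantics of update_step_size are not documented on this page):

x = np.zeros(2)
for i in range(100):
    x = opt.step(cost, x, i)
    # Assumption: rescales the internal step size by factor (about 1% decay per call).
    opt.update_step_size(x, i, factor=0.99)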