Module optimizers

Traits and implementations for optimizers.

Structs§

AdaDelta
The AdaDelta optimizer with a learning rate.
AdaDeltaBuilder
Builder for AdaDelta.
AdaGrad
The Adagrad optimizer.
AdaGradBuilder
Builder for AdaGrad.
Adafactor
The Adafactor optimizer.
AdafactorBuilder
Builder for Adafactor.
AdafactorState
State of the Adafactor optimizer.
Adam
The Adam optimizer.
AdamBuilder
Builder for Adam.
AdamW
The AdamW optimizer [1].
AdamWBuilder
Builder for AdamW.
Adamax
The Adamax optimizer, a variant of Adam based on the infinity norm [1].
AdamaxBuilder
Builder for Adamax.
Lion
The Lion optimizer [1].
LionBuilder
Builder for Lion.
RmsProp
The RMSprop optimizer [1].
RmsPropBuilder
Builder for RmsProp.
Sgd
Stochastic gradient descent optimizer.
SgdBuilder
Builder for Sgd.
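
Every optimizer above is paired with a companion *Builder struct. As a rough, standalone illustration of that pairing (hypothetical names and defaults, not this crate's actual API), a builder collects optional hyperparameters and fills in defaults when it is consumed:

```rust
// Hypothetical sketch of the struct/builder pairing used throughout this
// module; the field names and default values are illustrative only.
struct SgdSketch {
    lr: f32,
    momentum: f32,
    weight_decay: f32,
}

#[derive(Default)]
struct SgdSketchBuilder {
    momentum: Option<f32>,
    weight_decay: Option<f32>,
}

impl SgdSketchBuilder {
    fn momentum(mut self, momentum: f32) -> Self {
        self.momentum = Some(momentum);
        self
    }

    fn weight_decay(mut self, weight_decay: f32) -> Self {
        self.weight_decay = Some(weight_decay);
        self
    }

    /// Consume the builder, filling unspecified fields with defaults.
    fn build(self, lr: f32) -> SgdSketch {
        SgdSketch {
            lr,
            momentum: self.momentum.unwrap_or(0.0),
            weight_decay: self.weight_decay.unwrap_or(0.0),
        }
    }
}

fn main() {
    let opt = SgdSketchBuilder::default().momentum(0.9).build(1e-2);
    println!("lr={} momentum={} wd={}", opt.lr, opt.momentum, opt.weight_decay);
}
```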

Traits§

Optimizer
Trait for optimizers.
OptimizerState
Trait for optimizer states.
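
As a loose sketch of the split these two traits express (hypothetical trait and method names, not the crate's real definitions, which operate on MLX arrays rather than Vec<f32>), an optimizer applies an update rule while an associated state type carries whatever the rule accumulates between steps:

```rust
// Hypothetical optimizer/state split; illustrative only, the real
// `Optimizer` and `OptimizerState` traits have different signatures.
trait OptimizerStateSketch: Default {
    /// Reset any accumulated statistics (moments, averages, ...).
    fn reset(&mut self);
}

trait OptimizerSketch {
    type State: OptimizerStateSketch;

    /// Apply one update step to a parameter buffer, given its gradient
    /// and its accumulated state.
    fn update(&self, param: &mut [f32], grad: &[f32], state: &mut Self::State);
}

// A trivial implementation: plain gradient descent needs no state.
#[derive(Default)]
struct NoState;

impl OptimizerStateSketch for NoState {
    fn reset(&mut self) {}
}

struct PlainSgd {
    lr: f32,
}

impl OptimizerSketch for PlainSgd {
    type State = NoState;

    fn update(&self, param: &mut [f32], grad: &[f32], _state: &mut Self::State) {
        for (p, g) in param.iter_mut().zip(grad) {
            *p -= self.lr * g;
        }
    }
}

fn main() {
    let opt = PlainSgd { lr: 0.1 };
    let mut p = vec![1.0_f32, 2.0];
    let mut s = NoState;
    opt.update(&mut p, &[0.5, -0.5], &mut s);
    println!("{p:?}");
}
```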

Functions§

clip_grad_norm
Clips the global norm of the gradients.
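
Conceptually, global-norm clipping computes the combined L2 norm of all gradients and rescales them by a common factor when that norm exceeds a maximum. A minimal standalone sketch of that computation on plain Vec<f32> buffers (not the MLX arrays that clip_grad_norm actually takes):

```rust
/// Scale every gradient buffer by `max_norm / total_norm` when the combined
/// L2 norm of all gradients exceeds `max_norm`. Returns the unclipped norm.
/// Standalone sketch; `clip_grad_norm` in this module works on MLX arrays.
fn clip_global_norm(grads: &mut [Vec<f32>], max_norm: f32) -> f32 {
    let total_norm = grads
        .iter()
        .flat_map(|g| g.iter())
        .map(|x| x * x)
        .sum::<f32>()
        .sqrt();

    if total_norm > max_norm {
        let scale = max_norm / total_norm;
        for g in grads.iter_mut() {
            for x in g.iter_mut() {
                *x *= scale;
            }
        }
    }
    total_norm
}

fn main() {
    let mut grads = vec![vec![3.0_f32, 4.0], vec![0.0_f32]];
    let norm = clip_global_norm(&mut grads, 1.0);
    println!("norm before clipping: {norm}, grads after: {grads:?}");
}
```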

Type Aliases§

AdafactorBeta1
Type alias for the beta1 used in Adafactor.
AdafactorBuilderBeta1
Option<f32>. Type alias for the beta1 used in the Adafactor builder, due to a limitation in the generate_builder macro.
AdafactorBuilderLr
Option<Array>. Type alias for the learning rate used in the Adafactor builder, due to a limitation in the generate_builder macro.
AdafactorEps
Type alias for the epsilon values used in the Adafactor builder.
AdafactorLr
Type alias for the learning rate used in Adafactor.
Betas
(f32, f32). Type alias for the betas used in the Adam/AdamW/Adamax optimizer builders, due to a limitation in the generate_builder macro; see the sketch after this list.
MaybeClippedGrads
Type alias for the possibly clipped gradients returned by clip_grad_norm.
State
Type alias for common optimizer state.
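
Betas is the (beta1, beta2) pair consumed by the Adam-family builders. As a standalone reminder of how such a pair drives the first- and second-moment estimates, here is a classic Adam-style step without bias correction; it is illustrative only and not this module's implementation:

```rust
// Standalone sketch of how a (beta1, beta2) pair drives Adam-style moment
// estimates; bias correction is omitted for brevity, and this is not the
// `Adam` implementation in this module.
fn adam_step(
    params: &mut [f32],
    grads: &[f32],
    m: &mut [f32], // first-moment (mean) estimate
    v: &mut [f32], // second-moment (uncentered variance) estimate
    lr: f32,
    betas: (f32, f32),
    eps: f32,
) {
    let (beta1, beta2) = betas;
    for i in 0..params.len() {
        m[i] = beta1 * m[i] + (1.0 - beta1) * grads[i];
        v[i] = beta2 * v[i] + (1.0 - beta2) * grads[i] * grads[i];
        params[i] -= lr * m[i] / (v[i].sqrt() + eps);
    }
}

fn main() {
    let mut params = vec![1.0_f32; 3];
    let grads = vec![0.1_f32, -0.2, 0.3];
    let (mut m, mut v) = (vec![0.0_f32; 3], vec![0.0_f32; 3]);
    adam_step(&mut params, &grads, &mut m, &mut v, 1e-3, (0.9, 0.999), 1e-8);
    println!("{params:?}");
}
```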