Traits and implementations for optimizers.
Structs§
- AdaDelta - The AdaDelta optimizer with a learning rate
- AdaDeltaBuilder - Builder for AdaDelta.
- AdaGrad - The Adagrad optimizer.
- AdaGradBuilder - Builder for AdaGrad.
- Adafactor - The Adafactor optimizer.
- AdafactorBuilder - Builder for Adafactor.
- AdafactorState - State of the Adafactor optimizer.
- Adam - The Adam optimizer.
- AdamBuilder - Builder for Adam.
- AdamW - The AdamW optimizer [1].
- AdamWBuilder - Builder for AdamW.
- Adamax - The Adamax optimizer, a variant of Adam based on the infinity norm [1].
- AdamaxBuilder - Builder for Adamax.
- Lion - The Lion optimizer [1].
- LionBuilder - Builder for Lion.
- RmsProp - The RMSprop optimizer [1].
- RmsPropBuilder - Builder for RmsProp.
- Sgd - Stochastic gradient descent optimizer (see the conceptual update-rule sketch after this list).
- SgdBuilder - Builder for Sgd.
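All of the optimizers above share the same basic shape: each struct holds per-parameter state plus hyperparameters, and each step applies a gradient-based update rule to the model parameters. As a minimal, self-contained illustration of the simplest case, the sketch below implements plain SGD with momentum directly over `Vec<f32>` buffers; it is a conceptual example only and does not use this crate's `Sgd` API.

```rust
// Conceptual sketch of SGD with momentum, not this crate's API: the velocity
// buffer is the optimizer "state", and each step moves the parameters against
// the (momentum-smoothed) gradient direction.
fn sgd_momentum_step(
    params: &mut [f32],
    grads: &[f32],
    velocity: &mut [f32],
    lr: f32,
    momentum: f32,
) {
    for ((p, g), v) in params.iter_mut().zip(grads).zip(velocity.iter_mut()) {
        *v = momentum * *v + *g; // accumulate momentum
        *p -= lr * *v;           // parameter update
    }
}

fn main() {
    let mut params = vec![1.0_f32, -2.0];
    let grads = vec![0.5_f32, -0.25];
    let mut velocity = vec![0.0_f32; params.len()];
    sgd_momentum_step(&mut params, &grads, &mut velocity, 0.1, 0.9);
    println!("{params:?}"); // [0.95, -1.975]
}
```

The other optimizers in the list differ mainly in what state they keep (first and second moment estimates for the Adam family, factored statistics for Adafactor, and so on) and in how they turn the raw gradient into a step.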
Traits§
- Optimizer - Trait for optimizers.
- OptimizerState - Trait for optimizer states.
Functions§
- clip_grad_norm - Clips the global norm of the gradients (see the conceptual sketch below).
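Independent of the exact signature of `clip_grad_norm` in this crate, global-norm clipping itself is straightforward: if the combined L2 norm of all gradients exceeds a maximum, every gradient is scaled down proportionally. The sketch below is a self-contained conceptual version over plain `Vec<f32>` buffers, not this crate's API.

```rust
// Conceptual sketch of global-norm gradient clipping, not this crate's API:
// if the combined L2 norm of all gradients exceeds `max_norm`, every gradient
// is scaled by `max_norm / total_norm`; otherwise the gradients are unchanged.
// Returns the norm measured before clipping.
fn clip_global_norm(grads: &mut [Vec<f32>], max_norm: f32) -> f32 {
    let total_norm = grads
        .iter()
        .flat_map(|g| g.iter())
        .map(|v| v * v)
        .sum::<f32>()
        .sqrt();
    if total_norm > max_norm {
        let scale = max_norm / total_norm;
        for g in grads.iter_mut() {
            for v in g.iter_mut() {
                *v *= scale;
            }
        }
    }
    total_norm
}

fn main() {
    let mut grads = vec![vec![3.0_f32, 4.0], vec![0.0, 12.0]];
    let before = clip_global_norm(&mut grads, 1.0);
    println!("norm before clipping: {before}"); // 13
    println!("clipped gradients: {grads:?}");
}
```

The `MaybeClippedGrads` alias in the Type Aliases section presumably reflects the same idea: the gradients are only rescaled when the threshold is actually exceeded.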
Type Aliases§
- AdafactorBeta1 - Type alias for the beta1 used in Adafactor
- AdafactorBuilderBeta1 = Option<f32> - Type alias for the beta1 used in the Adafactor builder due to a limitation in the generate_builder macro
- AdafactorBuilderLr = Option<Array> - Type alias for the learning rate used in the Adafactor builder due to a limitation in the generate_builder macro
- AdafactorEps - Type alias for the epsilon values used in the Adafactor builder
- AdafactorLr - Type alias for the learning rate used in Adafactor
- Betas = (f32, f32) - Type alias for the betas used in the Adam/AdamW/Adamax optimizer builders due to a limitation in the generate_builder macro (see the conceptual Adam step after this list)
- MaybeClippedGrads - Type alias for the clipped gradients returned by clip_grad_norm.
- State - Type alias for common optimizer state.
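The two values in `Betas` are the exponential decay rates for the first- and second-moment estimates used by the Adam family. As a conceptual illustration of where they enter the update rule (not this crate's API), a single-parameter Adam step looks roughly like this:

```rust
// Conceptual single-parameter Adam step, not this crate's API.
// `betas.0` controls the decay of the first-moment (mean) estimate and
// `betas.1` the decay of the second-moment (uncentered variance) estimate.
fn adam_step(
    param: &mut f32,
    grad: f32,
    m: &mut f32, // first-moment state
    v: &mut f32, // second-moment state
    t: i32,      // step count, starting at 1
    lr: f32,
    betas: (f32, f32),
    eps: f32,
) {
    let (beta1, beta2) = betas;
    *m = beta1 * *m + (1.0 - beta1) * grad;
    *v = beta2 * *v + (1.0 - beta2) * grad * grad;
    let m_hat = *m / (1.0 - beta1.powi(t)); // bias correction
    let v_hat = *v / (1.0 - beta2.powi(t));
    *param -= lr * m_hat / (v_hat.sqrt() + eps);
}

fn main() {
    let (mut param, mut m, mut v) = (1.0_f32, 0.0_f32, 0.0_f32);
    adam_step(&mut param, 0.5, &mut m, &mut v, 1, 1e-3, (0.9, 0.999), 1e-8);
    println!("{param}"); // ~0.999 after one step
}
```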