Module mlx_rs::nn

Neural network support for MLX

All modules provide a new() function that takes the mandatory parameters, plus additional methods for setting the optional ones.
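
For illustration, a minimal sketch of this pattern, assuming the crate provides a Linear module whose mandatory parameters are its input and output dimensions; the exact signatures used here (Linear::new, Module::forward, Array::ones) are assumptions to check against the crate's documentation:

    use mlx_rs::error::Exception;
    use mlx_rs::module::Module;
    use mlx_rs::nn::Linear;
    use mlx_rs::Array;

    fn main() -> Result<(), Exception> {
        // Mandatory parameters (input and output dimensions) go to new();
        // optional parameters would be set through the module's additional
        // setter methods, per the note above.
        let mut layer = Linear::new(8, 4)?;

        let x = Array::ones::<f32>(&[2, 8])?;
        let y = layer.forward(&x)?; // expected shape: [2, 4]
        println!("{:?}", y.shape());
        Ok(())
    }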

Structs

Enums

Traits

Functions

  • Builds a new QuantizedLinear.
  • Applies the Continuously Differentiable Exponential Linear Unit.
  • Applies the Exponential Linear Unit.
  • Applies the Gaussian Error Linear Units function.
  • An approximation to Gaussian Error Linear Unit.
  • A fast approximation to Gaussian Error Linear Unit.
  • Applies the gated linear unit function.
  • Applies the hardswish function, element-wise.
  • Applies the Leaky Rectified Linear Unit.
  • Applies the Log Sigmoid function.
  • Applies the Log Softmax function.
  • Applies the Mish function, element-wise.
  • Applies the element-wise parametric ReLU.
  • Quantizes a module.
  • Applies the Rectified Linear Unit.
  • Applies the Rectified Linear Unit 6.
  • Applies the Scaled Exponential Linear Unit.
  • Applies the element-wise logistic sigmoid (see the usage sketch after this list).
  • Applies the Sigmoid Linear Unit. Also known as Swish.
  • Applies the Softplus function.
  • Applies the Softsign function.
  • Applies the Step Activation Function.
  • Transforms the passed function f(model, args) into a function that computes both the value of f and its gradients with respect to the model’s trainable parameters (see the sketch after this list).
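
As a usage sketch for the element-wise activations listed above, assuming each is a free function that takes an array reference and returns Result<Array, Exception> (check the individual signatures):

    use mlx_rs::error::Exception;
    use mlx_rs::{array, nn};

    fn main() -> Result<(), Exception> {
        let x = array!([-2.0f32, -0.5, 0.0, 1.5]);

        // Each activation returns a new Array; x itself is unchanged.
        let r = nn::relu(&x)?;    // max(x, 0) -> [0.0, 0.0, 0.0, 1.5]
        let s = nn::sigmoid(&x)?; // logistic sigmoid, 1 / (1 + e^-x)

        println!("{:?}\n{:?}", r, s);
        Ok(())
    }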
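
And a sketch of value_and_grad, assuming a loss function of the form f(&mut model, args) -> Result<Array, Exception>; the Array methods subtract, square, and mean, and the exact type of the returned gradients, are likewise assumptions:

    use mlx_rs::error::Exception;
    use mlx_rs::module::Module;
    use mlx_rs::nn::{self, Linear};
    use mlx_rs::Array;

    // Squared-error loss over a single Linear layer (method names assumed).
    fn loss_fn(model: &mut Linear, (x, y): (&Array, &Array)) -> Result<Array, Exception> {
        let pred = model.forward(x)?;
        pred.subtract(y)?.square()?.mean(None)
    }

    fn main() -> Result<(), Exception> {
        let mut model = Linear::new(8, 2)?;
        let x = Array::ones::<f32>(&[4, 8])?;
        let y = Array::zeros::<f32>(&[4, 2])?;

        // One call now yields both the loss value and the gradients with
        // respect to the model's trainable parameters.
        let mut loss_and_grad = nn::value_and_grad(loss_fn);
        let (loss, _grads) = loss_and_grad(&mut model, (&x, &y))?;

        // _grads holds the per-parameter gradients (exact type per the docs).
        println!("loss: {:?}", loss);
        Ok(())
    }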

Type Aliases