Neural network support for MLX.

All modules provide a new() function that takes the mandatory parameters, along with other methods (typically a companion builder type) to set optional parameters.
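As a minimal sketch of this convention, constructing a layer might look like the following. Linear and LinearBuilder are taken from the struct list below, but the exact method names (bias, build) and error handling are assumptions for illustration, not verified signatures.

```rust
use mlx_rs::nn::{Linear, LinearBuilder};

fn build_layers() -> Result<(), Box<dyn std::error::Error>> {
    // Mandatory parameters (input and output dimensions) go through new().
    let _layer = Linear::new(128, 64)?;

    // Optional parameters are set on the companion builder type.
    // `bias(false)` and `build()` are assumed names for illustration.
    let _layer_no_bias = LinearBuilder::new(128, 64).bias(false).build()?;

    Ok(())
}
```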
Structs

- Alibi - Attention with Linear Biases.
- AlibiInput - Input for the Alibi module.
- AlibiInputBuilder - Builder for AlibiInput.
- AvgPool1d - Applies 1-dimensional average pooling.
- AvgPool2d - Applies 2-dimensional average pooling.
- BatchNorm - Applies batch normalization [1] on the inputs.
- BatchNormBuilder - Builder for BatchNorm.
- Bilinear - Applies a bilinear transformation to the inputs.
- BilinearBuilder - Builder for the Bilinear module.
- Celu - Applies the Continuously Differentiable Exponential Linear Unit.
- CeluBuilder - Builder for Celu.
- Conv1d - Applies a 1-dimensional convolution over the multi-channel input sequence.
- Conv1dBuilder - Builder for the Conv1d module.
- Conv2d - Applies a 2-dimensional convolution over the multi-channel input image.
- Conv2dBuilder - Builder for the Conv2d module.
- Conv3d - Applies a 3-dimensional convolution over the multi-channel input image.
- Conv3dBuilder - Builder for the Conv3d module.
- ConvTranspose1d - Applies a 1-dimensional transposed convolution over the multi-channel input sequence.
- ConvTranspose1dBuilder - Builder for the ConvTranspose1d module.
- ConvTranspose2d - Applies a 2-dimensional transposed convolution over the multi-channel input image.
- ConvTranspose2dBuilder - Builder for the ConvTranspose2d module.
- ConvTranspose3d - Applies a 3-dimensional transposed convolution over the multi-channel input image.
- ConvTranspose3dBuilder - Builder for the ConvTranspose3d module.
- Dropout - Randomly zeroes a portion of the elements during training.
- Dropout2d - Applies 2D channel-wise dropout during training.
- Dropout2dBuilder - Builder for Dropout2d.
- Dropout3d - Applies 3D channel-wise dropout during training.
- Dropout3dBuilder - Builder for Dropout3d.
- DropoutBuilder - Builder for Dropout.
- Embedding - Implements a simple lookup table that maps each input integer to a high-dimensional vector.
- Gelu - Applies the Gaussian Error Linear Units function.
- GeluBuilder - Builder for Gelu.
- Glu - Applies the gated linear unit function.
- GluBuilder - Builder for Glu.
- GroupNorm - Applies Group Normalization [1] on the inputs.
- GroupNormBuilder - Builder for GroupNorm.
- Gru - A gated recurrent unit (GRU) RNN layer.
- GruBuilder - Builder for the Gru module.
- HardSwish - Applies the hardswish function, element-wise.
- InstanceNorm - Applies instance normalization [1] on the inputs.
- InstanceNormBuilder - Builder for InstanceNorm.
- LayerNorm - Applies layer normalization [1] on the inputs.
- LayerNormBuilder - Builder for LayerNorm.
- LeakyRelu - Applies the Leaky Rectified Linear Unit.
- LeakyReluBuilder - Builder for LeakyRelu.
- Linear - Applies an affine transformation to the input.
- LinearBuilder - Builder for the Linear module.
- LogSigmoid - Applies the Log Sigmoid function.
- LogSoftmax - Applies the Log Softmax function.
- LogSoftmaxBuilder - Builder for LogSoftmax.
- Lstm - A long short-term memory (LSTM) RNN layer.
- LstmBuilder - Builder for the Lstm module.
- LstmInput - Input for the LSTM module.
- LstmInputBuilder - Builder for LstmInput.
- MaxPool1d - Applies 1-dimensional max pooling.
- MaxPool2d - Applies 2-dimensional max pooling.
- Mish - Applies the Mish function, element-wise.
- MultiHeadAttention - Implements the scaled dot product attention with multiple heads.
- MultiHeadAttentionBuilder - Builder for the MultiHeadAttention module.
- MultiHeadAttentionInput - Input to the MultiHeadAttention module.
- MultiHeadAttentionInputBuilder - Builder for MultiHeadAttentionInput.
- Pool - Abstract pooling layer.
- Prelu - Applies the element-wise parametric ReLU.
- PreluBuilder - Builder for the Prelu module.
- QuantizedEmbedding - The same as Embedding but with a quantized weight matrix.
- QuantizedEmbeddingBuilder - Builder for QuantizedEmbedding.
- QuantizedLinear - Applies an affine transformation to the input using a quantized weight matrix.
- QuantizedLinearBuilder - Builder for QuantizedLinear.
- Relu - Applies the Rectified Linear Unit.
- Relu6 - Applies the Rectified Linear Unit 6.
- RmsNorm - Applies Root Mean Square normalization [1] to the inputs.
- RmsNormBuilder - Builder for RmsNorm.
- Rnn - An Elman recurrent layer.
- RnnBuilder - Builder for the Rnn module.
- RnnInput - Input for the RNN module.
- RnnInputBuilder - Builder for RnnInput.
- RopeInput - Input for the RotaryPositionalEncoding module.
- RopeInputBuilder - Builder for RopeInput.
- RotaryPositionalEncoding - Implements the rotary positional encoding.
- RotaryPositionalEncodingBuilder - Builder for RotaryPositionalEncoding.
- Selu - Applies the Scaled Exponential Linear Unit.
- Sequential - A sequential layer (see the composition sketch after this list).
- Sigmoid - Applies the element-wise logistic sigmoid.
- Silu - Applies the Sigmoid Linear Unit. Also known as Swish.
- SinusoidalPositionalEncoding - Implements sinusoidal positional encoding.
- SinusoidalPositionalEncodingBuilder - Builder for SinusoidalPositionalEncoding.
- Softmax - Applies the Softmax function.
- SoftmaxBuilder - Builder for Softmax.
- Softplus - Applies the Softplus function.
- Softsign - Applies the Softsign function.
- Step - Applies the Step Activation Function.
- StepBuilder - Builder for Step.
- Tanh - Applies the hyperbolic tangent function.
- Transformer - Implements a standard Transformer model.
- TransformerBuilder - Builder for the Transformer module.
- TransformerInput - Input to the Transformer module.
- Upsample - Upsamples the input signal spatially.
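As a rough sketch of how these modules compose, the following chains Linear layers and a Relu activation in a Sequential container. The append and forward calls follow the conventions suggested by the items above (SequentialModuleItem, the new() convention); treat the exact paths and signatures as assumptions rather than verified API.

```rust
use mlx_rs::array;
use mlx_rs::module::Module; // assumed path of the Module trait
use mlx_rs::nn::{Linear, Relu, Sequential};

fn run() -> Result<(), Box<dyn std::error::Error>> {
    // `append` is assumed to take any item usable in a Sequential module.
    let mut model = Sequential::new()
        .append(Linear::new(4, 8)?)
        .append(Relu::new())
        .append(Linear::new(8, 2)?);

    // `forward` is assumed to apply each layer in order to the input.
    let x = array!([1.0, 2.0, 3.0, 4.0]);
    let y = model.forward(&x)?;
    println!("{:?}", y);
    Ok(())
}
```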
Enums

- GeluApprox - Variants of the Gaussian Error Linear Units function.
- UpsampleMode - Upsample mode.
Traits

- Activation - A marker trait for activation functions used in transformers.
- IntoModuleValueAndGrad - Helper trait for value_and_grad.
- Pooling - Marker trait for pooling operations.
- SequentialModuleItem - Marker trait for items that can be used in a Sequential module.
Functions

- build_quantized_linear - Builds a new QuantizedLinear.
- celu - Applies the Continuously Differentiable Exponential Linear Unit.
- elu - Applies the Exponential Linear Unit.
- gelu - Applies the Gaussian Error Linear Units function.
- gelu_approximate - An approximation to the Gaussian Error Linear Unit.
- gelu_fast_approximate - A fast approximation to the Gaussian Error Linear Unit.
- glu - Applies the gated linear unit function.
- hard_swish - Applies the hardswish function, element-wise.
- leaky_relu - Applies the Leaky Rectified Linear Unit.
- log_sigmoid - Applies the Log Sigmoid function.
- log_softmax - Applies the Log Softmax function.
- mish - Applies the Mish function, element-wise.
- prelu - Applies the element-wise parametric ReLU.
- quantize - Quantizes a module.
- relu - Applies the Rectified Linear Unit (see the usage sketch after this list).
- relu6 - Applies the Rectified Linear Unit 6.
- selu - Applies the Scaled Exponential Linear Unit.
- sigmoid - Applies the element-wise logistic sigmoid.
- silu - Applies the Sigmoid Linear Unit. Also known as Swish.
- softplus - Applies the Softplus function.
- softsign - Applies the Softsign function.
- step - Applies the Step Activation Function.
- value_and_grad - Transforms the passed function f(model, args) into a function that computes the gradients of f with respect to the model's trainable parameters, along with its value.
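The free-function activations above can be sketched as plain calls on arrays. Each function is assumed here to take a reference to an Array and return a Result wrapping a new Array; that calling shape is an assumption for illustration.

```rust
use mlx_rs::{array, nn};

fn activations() -> Result<(), Box<dyn std::error::Error>> {
    let x = array!([-2.0, -0.5, 0.0, 1.5]);

    // Each call is assumed to apply its function element-wise.
    let _relu = nn::relu(&x)?;     // max(x, 0)
    let _sig = nn::sigmoid(&x)?;   // 1 / (1 + exp(-x))
    let _soft = nn::softplus(&x)?; // log(1 + exp(x))

    Ok(())
}
```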
Type Aliases

- GruInput - Type alias for the input of the GRU module.
- GruInputBuilder - Type alias for the builder of the input of the GRU module.
- NonLinearity - Type alias for the non-linearity function.
- Rope - Type alias for RotaryPositionalEncoding.
- RopeBuilder - Type alias for RotaryPositionalEncodingBuilder.
- Sinpe - Type alias for SinusoidalPositionalEncoding.
- SinpeBuilder - Type alias for SinusoidalPositionalEncodingBuilder.