LayerNorm

class penzai.nn.standardization.LayerNorm

Bases: Sequential

Layer normalization layer.

Layer normalization layers consist of:

  • standardization over a feature axis or axes,

  • a learned parallel rescaling of each feature along those axes,

  • and a learned bias for those axes.

For flexibility, LayerNorm is a subclass of Sequential.
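Ignoring parameter initialization and penzai's named-axis machinery, the three steps above can be sketched in plain NumPy (the function and variable names here are illustrative, not part of penzai's API):

```python
import numpy as np

def layer_norm_sketch(x, scale, shift, epsilon=1e-6):
    """Illustrative layer normalization over the last axis.

    x:     input array; the last axis plays the role of the feature axis.
    scale: learned per-feature rescaling, same length as the feature axis.
    shift: learned per-feature bias, same length as the feature axis.
    """
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    standardized = (x - mean) / np.sqrt(var + epsilon)  # step 1: standardize
    return standardized * scale + shift                 # steps 2 and 3

x = np.array([[1.0, 2.0, 3.0, 4.0]])
out = layer_norm_sketch(x, scale=np.ones(4), shift=np.zeros(4))
# With unit scale and zero shift, the output has (approximately)
# zero mean and unit variance along the feature axis.
```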

Methods

__init__(sublayers)

from_config(across_axes[, epsilon, dtype])

Constructs a layer normalization layer.

Attributes

sublayers

Inherited Methods

attributes_dict()

Constructs a dictionary with all of the fields in the class.

from_attributes(**field_values)

Directly instantiates a struct given all of its fields.

input_structure()

Returns the input structure of this layer.

key_for_field(field_name)

Generates a JAX PyTree key for a given field name.

output_structure()

Returns the output structure of this layer.

select()

Wraps this struct in a selection, enabling functional-style mutations.

tree_flatten()

Flattens this tree node.

tree_flatten_with_keys()

Flattens this tree node with keys.

tree_unflatten(aux_data, children)

Unflattens this tree node.

treescope_color()

Computes a color for rendering this object in Treescope.

__call__(value)

Runs each of the sublayers in sequence.

classmethod from_config(across_axes: dict[str, int], epsilon: float | jax.Array = 1e-06, dtype: jax.typing.DTypeLike = jnp.float32) → LayerNorm

Constructs a layer normalization layer.

Parameters:
  • across_axes – Names and lengths of the axes to normalize over.

  • epsilon – Epsilon parameter for the standardization step.

  • dtype – Dtype of the scale and shift parameters.

Returns:

A newly-constructed LayerNorm layer.
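The epsilon argument guards the standardization step against division by zero when the values along the normalized axes are constant (zero variance). A small NumPy illustration of that role (a sketch of the math, not penzai code):

```python
import numpy as np

def standardize(x, epsilon):
    # Standardize over the last axis, as in the first step of LayerNorm.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + epsilon)

constant = np.full(4, 7.0)  # zero variance along the feature axis
out = standardize(constant, epsilon=1e-6)
# Without epsilon this would be 0 / 0; with it, the result is all zeros.
```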