LayerNorm
- class penzai.nn.standardization.LayerNorm
Bases: Sequential
Layer normalization layer.
Layer normalization layers consist of:
- standardization over a feature axis or axes,
- a learned parallel rescaling of each feature along those axes,
- and a learned bias for those axes.
For flexibility, LayerNorm is a subclass of Sequential.
Methods
__init__(sublayers)
from_config(name, init_base_rng, across_axes) – Constructs a layer normalization layer.
Attributes
sublayers
Inherited Methods
attributes_dict() – Constructs a dictionary with all of the fields in the class.
bind_variables(variables[, allow_unused]) – Convenience function to bind variables to a layer.
from_attributes(**field_values) – Directly instantiates a struct given all of its fields.
key_for_field(field_name) – Generates a JAX PyTree key for a given field name.
select() – Wraps this struct in a selection, enabling functional-style mutations.
stateless_call(variable_values, argument, /, ...) – Calls a layer with temporary variables, without modifying its state.
tree_flatten() – Flattens this tree node.
tree_flatten_with_keys() – Flattens this tree node with keys.
tree_unflatten(aux_data, children) – Unflattens this tree node.
treescope_color()
__call__(value, **side_inputs) – Runs each of the sublayers in sequence.
- classmethod from_config(name: str, init_base_rng: jax.Array | None, across_axes: dict[str, int], epsilon: float | jax.Array = 1e-06, dtype: jax.typing.DTypeLike = jax.numpy.float32) -> LayerNorm
Constructs a layer normalization layer.
- Parameters:
name – The name of the layer.
init_base_rng – The base RNG to use for initializing model parameters.
across_axes – Names and lengths of the axes to normalize over.
epsilon – Epsilon parameter for the standardization step.
dtype – Dtype of the scale and shift parameters.
- Returns:
A newly-constructed LayerNorm layer.
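The three steps the layer composes (standardization, learned rescaling, learned bias) can be sketched in plain numpy. This is a minimal illustration of the math, not penzai's implementation: `gamma` and `beta` stand in for the learned scale and shift parameters that `from_config` would initialize from `init_base_rng`, and a plain integer axis stands in for penzai's named `across_axes`.

```python
import numpy as np

def layer_norm_sketch(value, gamma, beta, axis=-1, epsilon=1e-6):
    """Sketch of the layer-norm computation: standardize, then scale, then shift.

    `gamma`/`beta` are stand-ins for the learned parameters; `epsilon`
    matches the documented default of 1e-06.
    """
    mean = value.mean(axis=axis, keepdims=True)
    var = value.var(axis=axis, keepdims=True)
    # Standardization over the feature axis, with epsilon for stability.
    standardized = (value - mean) / np.sqrt(var + epsilon)
    # Learned per-feature rescaling and bias along that axis.
    return standardized * gamma + beta

x = np.array([[1.0, 2.0, 3.0],
              [4.0, 6.0, 8.0]])
out = layer_norm_sketch(x, gamma=np.ones(3), beta=np.zeros(3))
```

After this transformation each row has (approximately) zero mean and unit standard deviation along the normalized axis. In penzai itself you would instead build the layer with `LayerNorm.from_config`, passing `across_axes` as a dictionary of named axis lengths.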