neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm

Graph rewriter that fuses small ops into LayerNorm.

Module Contents

Classes

FuseLayerNormOptimizer

Remap smaller ops into fused LayerNorm.

Functions

node_name_from_input(node_name)

Strips off ports and other decorations to get the underlying node name.

node_from_map(node_map, name)

Pulls a node def from a dictionary for a given name.

values_from_const(node_def)

Extracts the values from a const NodeDef as a numpy ndarray.

class neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm.FuseLayerNormOptimizer(input_graph_def)[source]

Remap smaller ops into fused LayerNorm.

The current fusion handles only the case where LayerNormalization uses FusedBatchNormV3, and it is further restricted to 2D or 3D tensor inputs to the Keras LayerNormalization API.
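
A minimal usage sketch, assuming the rewriter follows the common GraphRewriterBase convention of exposing a do_transformation() method that returns the rewritten GraphDef; the model path is hypothetical.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm import (
        FuseLayerNormOptimizer,
    )

    # Load a frozen GraphDef that contains the small-op LayerNorm pattern
    # (the FusedBatchNormV3-based decomposition described above).
    graph_def = tf.compat.v1.GraphDef()
    with tf.io.gfile.GFile("frozen_model.pb", "rb") as f:  # hypothetical path
        graph_def.ParseFromString(f.read())

    # Remap the matched sub-graph into a single fused LayerNorm node.
    # do_transformation() is assumed from the shared graph-rewriter interface.
    fused_graph_def = FuseLayerNormOptimizer(graph_def).do_transformation()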

neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm.node_name_from_input(node_name)[source]

Strips off ports and other decorations to get the underlying node name.
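
An illustrative sketch of what stripping decorations means, assuming the helper removes the control-dependency prefix ("^") and the output-port suffix (":N") used in GraphDef input strings; the node names are made up.

    from neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm import (
        node_name_from_input,
    )

    # Port suffix and control-dependency marker are dropped, leaving the node name.
    assert node_name_from_input("layer_norm/moments/mean:0") == "layer_norm/moments/mean"
    assert node_name_from_input("^layer_norm/beta") == "layer_norm/beta"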

neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm.node_from_map(node_map, name)[source]

Pulls a node def from a dictionary for a given name.

Parameters:
  • node_map – Dictionary containing an entry indexed by name for every node.

  • name – Identifies the node we want to find.

Returns:

NodeDef of the node with the given name.

Raises:

ValueError – If the node isn’t present in the dictionary.
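
A short sketch of the intended call pattern, assuming node_map is a plain dict keyed by node name; the constant name "gamma" and the port-decorated lookup string are illustrative.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm import (
        node_from_map,
        node_name_from_input,
    )

    # Build a tiny GraphDef with one constant so the lookup succeeds.
    with tf.Graph().as_default() as g:
        tf.constant(1.0, name="gamma")
    graph_def = g.as_graph_def()

    # Index every node by name, then resolve a (possibly decorated) input string.
    node_map = {node.name: node for node in graph_def.node}
    gamma_node = node_from_map(node_map, node_name_from_input("gamma:0"))
    print(gamma_node.op)  # "Const"; an unknown name raises ValueError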

neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm.values_from_const(node_def)[source]

Extracts the values from a const NodeDef as a numpy ndarray.

Parameters:

node_def – Const NodeDef that has the values we want to access.

Returns:

Numpy ndarray containing the values.

Raises:

ValueError – If the node isn’t a Const.
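
A minimal sketch of reading the numpy payload of a Const node; the constant name "beta" is illustrative, and passing a non-Const NodeDef raises ValueError.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.graph_rewriter.generic.fuse_layer_norm import (
        values_from_const,
    )

    with tf.Graph().as_default() as g:
        tf.constant([1.0, 2.0, 3.0], name="beta")
    graph_def = g.as_graph_def()

    # Pick out the Const NodeDef and extract its values as a numpy ndarray.
    const_node = next(n for n in graph_def.node if n.name == "beta")
    values = values_from_const(const_node)  # array([1., 2., 3.], dtype=float32)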