neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils

Utilities for converting a TensorFlow model to an ONNX model.

Module Contents

Classes

| SeqType | Wrap around TensorProto.* to signify a tensor sequence of a given type. |

Functions

| set_name | Set op name for inserted ops. |
| find_opset | Find opset. |
| assert_error | Raise error message. |
| map_numpy_to_onnx_dtype | Map numpy dtype to ONNX dtype. |
| map_onnx_to_numpy_type | Map ONNX dtype to numpy dtype. |
| add_port_to_name | Map node output number to name. |
| get_tensorflow_node_attr | Parse tensorflow node attribute. |
| get_tensorflow_tensor_shape | Get shape from tensorflow tensor. |
| get_tensorflow_node_shape_attr | Get shape from tensorflow attr "shape". |
| map_tensorflow_dtype | Convert tensorflow dtype to ONNX. |
| get_tensorflow_tensor_data | Get data from tensorflow tensor. |
| convert_tensorflow_tensor_to_onnx | Convert tensorflow tensor to onnx tensor. |
| read_tensorflow_node_attrs | Read tensorflow node attribute names. |
| infer_onnx_shape_dtype | Infer shapes and dtypes for outputs of the node. |
| make_onnx_shape | Shape with -1 is not valid in onnx; make it a name. |
| make_onnx_inputs_outputs | Wrapper for creating onnx graph inputs or outputs. |
| save_protobuf | Save ONNX protobuf file. |
| is_onnx_domain | Check if it's onnx domain. |
| is_list_or_tuple | Check the object is list or tuple. |
| are_shapes_equal | Check whether 2 shapes are equal. |
| get_subgraphs_from_onnx | Returns an iterator over the graphs/subgraphs of a model (using dfs). |
| initialize_name_counter | Avoid name conflicts by initializing the counter used by make_name based on the provided model. |
| … | Returns the index of the dimension that the strided slice is reading from the shape node or None. |
| … | Find nodes with constant inputs and compute their values using TF. |
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.set_name(name)[source]
Set op name for inserted ops.
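One plausible shape for this helper is a module-level counter that suffixes each requested base name, so every op inserted during conversion gets a unique, readable name. The `__<n>` suffix format here is an assumption for illustration, not necessarily the module's real naming scheme:

```python
import itertools

# Assumed suffix format: "<base>__<n>". The real set_name may differ;
# this only illustrates the unique-name idea.
_name_counter = itertools.count(1)

def set_name(name):
    """Return a unique op name derived from the given base name."""
    return "{}__{}".format(name, next(_name_counter))
```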
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.find_opset(opset)[source]
Find opset.
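Opset resolution typically means falling back to a preferred default when the caller passes nothing, and capping at the newest opset the converter supports. A minimal sketch, with both constants invented for illustration:

```python
PREFERRED_OPSET = 14      # assumed default, not the module's real value
MAX_SUPPORTED_OPSET = 18  # assumed ceiling, not the module's real value

def find_opset(opset):
    """Resolve the target opset: default when unset, clamped to the max supported."""
    if opset is None or opset == 0:
        opset = PREFERRED_OPSET
    return min(opset, MAX_SUPPORTED_OPSET)
```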
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.assert_error(bool_val, error_msg, *args)[source]
Raise error message.
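The signature `(bool_val, error_msg, *args)` suggests a guard that raises with a formatted message when the condition fails. A sketch of that pattern (the real helper may raise a different exception type):

```python
def assert_error(bool_val, error_msg, *args):
    """Raise when the check fails, formatting error_msg with args printf-style."""
    if not bool_val:
        raise ValueError(error_msg % args)
```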
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.map_numpy_to_onnx_dtype(np_dtype)[source]
Map numpy dtype to ONNX dtype.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.map_onnx_to_numpy_type(onnx_type)[source]
Map ONNX dtype to numpy dtype.
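These two lookups are inverses of each other. The integer codes below are the `TensorProto.DataType` enum values fixed by the ONNX spec (FLOAT=1, INT64=7, ...); only a few common entries are shown, as a standalone stand-in so the sketch needs no `onnx` dependency:

```python
import numpy as np

# A small subset of the TensorProto.DataType enum from the ONNX spec.
NUMPY_TO_ONNX = {
    np.dtype("float32"): 1,   # TensorProto.FLOAT
    np.dtype("float64"): 11,  # TensorProto.DOUBLE
    np.dtype("int32"): 6,     # TensorProto.INT32
    np.dtype("int64"): 7,     # TensorProto.INT64
    np.dtype("bool"): 9,      # TensorProto.BOOL
}
ONNX_TO_NUMPY = {v: k for k, v in NUMPY_TO_ONNX.items()}

def map_numpy_to_onnx_dtype(np_dtype):
    """Look up the ONNX dtype code for a numpy dtype."""
    return NUMPY_TO_ONNX[np.dtype(np_dtype)]

def map_onnx_to_numpy_type(onnx_type):
    """Look up the numpy dtype for an ONNX dtype code."""
    return ONNX_TO_NUMPY[onnx_type]
```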
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.add_port_to_name(name, nr=0)[source]
Map node output number to name.
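"Map node output number to name" follows TensorFlow's `"op_name:output_index"` convention for addressing one specific output of a node:

```python
def add_port_to_name(name, nr=0):
    """Append the output port index, TensorFlow-style: "node" -> "node:0"."""
    return name + ":" + str(nr)
```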
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.get_tensorflow_node_attr(node, name)[source]
Parse tensorflow node attribute.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.get_tensorflow_tensor_shape(tensor)[source]
Get shape from tensorflow tensor.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.get_tensorflow_node_shape_attr(node)[source]
Get shape from tensorflow attr “shape”.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.map_tensorflow_dtype(dtype)[source]
Convert tensorflow dtype to ONNX.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.get_tensorflow_tensor_data(tensor)[source]
Get data from tensorflow tensor.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.convert_tensorflow_tensor_to_onnx(tensor, name='')[source]
Convert tensorflow tensor to onnx tensor.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.read_tensorflow_node_attrs(node)[source]
Read tensorflow node attribute names.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.infer_onnx_shape_dtype(node, opset_version, input_shapes, input_dtypes, initializers=None)[source]
Infer shapes and dtypes for outputs of the node.
Sometimes, shape inference needs the values of node’s inputs, so initializers are used.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.make_onnx_shape(shape)[source]
Make an ONNX shape. A dimension of -1 is not valid in ONNX, so each such dimension is replaced with a generated name, i.e. a symbolic (dynamic) dimension.
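A sketch of that replacement: keep concrete dimensions, and swap every unknown one for a generated symbolic name that downstream shape handling treats as dynamic. The `unk__<n>` prefix is an assumption for illustration:

```python
import itertools

# Assumed naming scheme for unknown dimensions: "unk__<n>".
_unk_dim_counter = itertools.count(1)

def make_onnx_shape(shape):
    """Replace each -1 (or non-integer) dimension with a symbolic name."""
    if shape is None:
        return None
    return [d if isinstance(d, int) and d >= 0 else "unk__%d" % next(_unk_dim_counter)
            for d in shape]
```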
- class neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.SeqType(tensor_dtype)[source]
Wrap around TensorProto.* to signify a tensor sequence of a given type.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.make_onnx_inputs_outputs(name, elem_type, shape, **kwargs)[source]
Wrapper for creating onnx graph inputs or outputs.
- Parameters:
name – Text
elem_type – TensorProto.DataType
shape – Optional[Sequence[int]]
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.save_protobuf(path, message, as_text=False)[source]
Save ONNX protobuf file.
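Saving a protobuf is binary serialization by default, with an optional text form. A sketch under the assumption that text mode simply writes the message's string representation (real code usually goes through `text_format`); anything exposing `SerializeToString()` works for the binary path:

```python
def save_protobuf(path, message, as_text=False):
    """Serialize a protobuf message to path, binary by default."""
    if as_text:
        with open(path, "w") as f:
            f.write(str(message))
    else:
        with open(path, "wb") as f:
            f.write(message.SerializeToString())
```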
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.is_onnx_domain(domain)[source]
Check if it’s onnx domain.
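ONNX treats an empty (or unset) domain string as the default `ai.onnx` operator domain, so the check amounts to:

```python
def is_onnx_domain(domain):
    """True for the default ONNX operator domain (empty/None means ai.onnx)."""
    return domain is None or domain == "" or domain == "ai.onnx"
```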
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.is_list_or_tuple(obj)[source]
Check the object is list or tuple.
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.are_shapes_equal(src, dest)[source]
Check whether 2 shapes are equal.
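A sketch of the comparison: rank first, then each dimension elementwise. Treating two `None` (fully unknown) shapes as equal is a choice made here for illustration; the real helper may decide that case differently:

```python
def are_shapes_equal(src, dest):
    """Compare two shapes by rank and then dimension by dimension."""
    if src is None or dest is None:
        return src is None and dest is None
    if len(src) != len(dest):
        return False
    return all(s == d for s, d in zip(src, dest))
```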
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.get_subgraphs_from_onnx(model_proto)[source]
Returns an iterator over the graphs/subgraphs of a model (using dfs).
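The depth-first walk matters because ONNX control-flow ops (`If`, `Loop`, `Scan`) carry whole graphs inside node attributes. A sketch of the traversal using plain dicts as stand-ins for `GraphProto`/`NodeProto`, since the shape of the recursion is the point:

```python
# Stand-in structure: a graph is {"name": ..., "nodes": [...]}, and a node
# may carry nested graphs under "subgraphs" (as If/Loop/Scan do via
# graph-valued attributes in real ONNX protos).
def get_subgraphs(graph):
    """Yield the graph and every nested subgraph, depth-first."""
    stack = [graph]
    while stack:
        g = stack.pop()
        yield g
        for node in g.get("nodes", []):
            stack.extend(node.get("subgraphs", []))

main = {"name": "main",
        "nodes": [{"op": "If",
                   "subgraphs": [{"name": "then", "nodes": []},
                                 {"name": "else", "nodes": []}]}]}
names = [g["name"] for g in get_subgraphs(main)]
```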
- neural_compressor.adaptor.tf_utils.graph_rewriter.onnx.tf2onnx_utils.initialize_name_counter(model_proto)[source]
Avoid name conflicts by initializing the counter used by make_name based on the provided model.
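The idea is to scan every name already present in the model for the numeric suffix that the name generator emits, and start the counter one past the largest, so freshly generated names can never collide. A sketch over a flat list of names, assuming a `__<n>` suffix pattern:

```python
import re

# Assumed suffix pattern emitted by the name generator: "__<n>".
_SUFFIX = re.compile(r"__(\d+)$")

def initialize_name_counter(existing_names):
    """Return the first counter value that cannot collide with existing names."""
    max_seen = 0
    for name in existing_names:
        m = _SUFFIX.search(name)
        if m:
            max_seen = max(max_seen, int(m.group(1)))
    return max_seen + 1
```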