neural_compressor.adaptor.tf_utils.util

TensorFlow utils helper functions.

Module Contents

Functions

version1_lt_version2(version1, version2)

Check if version1 is less than version2.

version1_gt_version2(version1, version2)

Check if version1 is greater than version2.

version1_eq_version2(version1, version2)

Check if version1 is equal to version2.

version1_gte_version2(version1, version2)

Check if version1 is greater than or equal to version2.

version1_lte_version2(version1, version2)

Check if version1 is less than or equal to version2.

disable_random([seed])

A decorator to disable TF randomness by fixing the random seed.

read_graph(in_graph[, in_graph_is_binary])

Read the input graph file as a GraphDef.

write_graph(out_graph_def, out_graph_file)

Write the output GraphDef to a file.

is_ckpt_format(model_path)

Check whether model_path is in ckpt format.

is_saved_model_format(model_path)

Check whether model_path is in saved_model format.

get_estimator_graph(estimator, input_fn)

Get the graph of the estimator.

get_tensor_by_name(graph, name[, try_cnt])

Get the tensor by name.

iterator_sess_run(sess, iter_op, feed_dict, output_tensor)

Run a graph that has an iterator integrated into it.

collate_tf_preds(results)

Collate the prediction results.

get_input_output_node_names(graph_def)

Get the input node name and output node name of the graph_def.

fix_ref_type_of_graph_def(graph_def)

Fix ref type of the graph_def.

strip_unused_nodes(graph_def, input_node_names, ...)

Strip unused nodes of the graph_def.

strip_equivalent_nodes(graph_def, output_node_names)

Strip nodes with the same input and attr.

get_graph_def(model[, outputs, auto_input_output])

Get the model's graph_def.

get_model_input_shape(model)

Get the input shape of the input model.

get_tensor_val_from_graph_node(...)

Get the tensor value for given node name.

int8_node_name_reverse(node)

Reverse int8 node name.

tf_diagnosis_helper(fp32_model, quan_model, tune_cfg, ...)

TensorFlow diagnosis helper function.

generate_feed_dict(input_tensor, inputs)

Generate feed dict helper function.

neural_compressor.adaptor.tf_utils.util.version1_lt_version2(version1, version2)

Check if version1 is less than version2.

neural_compressor.adaptor.tf_utils.util.version1_gt_version2(version1, version2)

Check if version1 is greater than version2.

neural_compressor.adaptor.tf_utils.util.version1_eq_version2(version1, version2)

Check if version1 is equal to version2.

neural_compressor.adaptor.tf_utils.util.version1_gte_version2(version1, version2)

Check if version1 is greater than or equal to version2.

neural_compressor.adaptor.tf_utils.util.version1_lte_version2(version1, version2)

Check if version1 is less than or equal to version2.
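
Below is a minimal usage sketch of the version comparison helpers; the threshold version string is illustrative, not mandated by the library.

    from neural_compressor.adaptor.tf_utils.util import (
        version1_lt_version2,
        version1_gte_version2,
    )
    import tensorflow as tf

    # Branch on the installed TensorFlow version (the 2.3.0 threshold is illustrative).
    if version1_lt_version2(tf.version.VERSION, '2.3.0'):
        print('running on an older TF release')
    else:
        assert version1_gte_version2(tf.version.VERSION, '2.3.0')
        print('running on TF 2.3.0 or newer')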

neural_compressor.adaptor.tf_utils.util.disable_random(seed=1)

A decorator to disable TF randomness by fixing the random seed.
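
A minimal sketch of applying the decorator, assuming it is used to wrap a graph-building function with its seed argument; the function body is illustrative.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.util import disable_random

    @disable_random(seed=1)
    def build_toy_graph():
        # The variable's random initializer is driven by the fixed seed
        # set up by the decorator.
        return tf.compat.v1.get_variable(
            'w', shape=[2, 2],
            initializer=tf.compat.v1.random_normal_initializer())

    w = build_toy_graph()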

neural_compressor.adaptor.tf_utils.util.read_graph(in_graph, in_graph_is_binary=True)

Read the input graph file as a GraphDef.

Parameters:
  • in_graph – input graph file.

  • in_graph_is_binary – whether input graph is binary, default True.

Returns:

the input GraphDef.

neural_compressor.adaptor.tf_utils.util.write_graph(out_graph_def, out_graph_file)

Write the output GraphDef to a file.

Parameters:
  • out_graph_def – the output GraphDef.

  • out_graph_file – path to output graph file.

Returns:

None.
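
A short round-trip sketch using read_graph and write_graph; the file paths are placeholders.

    from neural_compressor.adaptor.tf_utils.util import read_graph, write_graph

    # Read a frozen (binary) GraphDef from disk; the path is a placeholder.
    graph_def = read_graph('/path/to/frozen_model.pb', in_graph_is_binary=True)

    # ... transform graph_def as needed ...

    # Write the (possibly modified) GraphDef back out.
    write_graph(graph_def, '/path/to/output_model.pb')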

neural_compressor.adaptor.tf_utils.util.is_ckpt_format(model_path)

Check whether model_path is in ckpt format.

Parameters:

model_path (string) – the model folder path

Returns:

the ckpt prefix if model_path contains ckpt format data, otherwise None.

Return type:

string

neural_compressor.adaptor.tf_utils.util.is_saved_model_format(model_path)

Check whether model_path is in saved_model format.

Parameters:

model_path (string) – the model folder path

Returns:

True if model_path contains a saved_model format model, otherwise False.

Return type:

bool
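
A sketch of dispatching on the on-disk model format; the directory path is a placeholder.

    from neural_compressor.adaptor.tf_utils.util import (
        is_ckpt_format,
        is_saved_model_format,
    )

    model_dir = '/path/to/model_dir'  # placeholder
    ckpt_prefix = is_ckpt_format(model_dir)
    if ckpt_prefix:
        print('checkpoint model, prefix:', ckpt_prefix)
    elif is_saved_model_format(model_dir):
        print('saved_model format model')
    else:
        print('neither ckpt nor saved_model format')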

neural_compressor.adaptor.tf_utils.util.get_estimator_graph(estimator, input_fn)

Get the graph of the estimator.

Parameters:
  • estimator – tf estimator model

  • input_fn – input function

Returns:

graph

neural_compressor.adaptor.tf_utils.util.get_tensor_by_name(graph, name, try_cnt=3)

Get the tensor by name.

This considers the ‘import’ scope, since the model may be imported more than once, and handles both the name:0 and name naming formats.

Parameters:
  • graph (tf.compat.v1.GraphDef) – the graph from which to get the tensor

  • name (string) – the tensor name, either tensor_name:0 or tensor_name without the suffix

  • try_cnt – the maximum number of times to prepend ‘import/’ to the name when searching for the tensor

Returns:

the tensor retrieved by name.

Return type:

tensor
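
A sketch of looking up a tensor, assuming the first argument is a graph object (e.g. a tf.Graph) that already contains the model; the tensor name is illustrative.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.util import get_tensor_by_name

    # Build a tiny graph so the lookup has something to find.
    graph = tf.Graph()
    with graph.as_default():
        tf.compat.v1.placeholder(tf.float32, shape=[None, 4], name='input')

    # Both the 'input' and 'input:0' forms are handled.
    input_tensor = get_tensor_by_name(graph, 'input')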

neural_compressor.adaptor.tf_utils.util.iterator_sess_run(sess, iter_op, feed_dict, output_tensor, iteration=-1, measurer=None)

Run a graph that has an iterator integrated into it.

Parameters:
  • sess (tf.compat.v1.Session) – the session used to run the graph

  • iter_op (Operator) – the MakeIterator op

  • feed_dict (dict) – the feeds to initialize a new iterator

  • output_tensor (list) – the output tensors

  • iteration (int) – the number of iterations to run; when set to -1, run until the iterator is exhausted

Returns:

the results of the predictions

Return type:

preds
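
A self-contained sketch, assuming the iterator's initializer serves as the MakeIterator op and no extra feeds are needed; the toy dataset is illustrative.

    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.util import iterator_sess_run

    tf.compat.v1.disable_eager_execution()

    # A toy dataset wired into the graph through an initializable iterator.
    dataset = tf.compat.v1.data.Dataset.from_tensor_slices([[1.0], [2.0], [3.0]])
    iterator = tf.compat.v1.data.make_initializable_iterator(dataset)
    output = iterator.get_next() * 2.0

    with tf.compat.v1.Session() as sess:
        # iterator.initializer is the MakeIterator op for this iterator;
        # iteration=-1 runs until the iterator is exhausted.
        preds = iterator_sess_run(sess, iterator.initializer,
                                  feed_dict={}, output_tensor=[output],
                                  iteration=-1)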

neural_compressor.adaptor.tf_utils.util.collate_tf_preds(results)

Collate the prediction results.

neural_compressor.adaptor.tf_utils.util.get_input_output_node_names(graph_def)

Get the input node name and output node name of the graph_def.

neural_compressor.adaptor.tf_utils.util.fix_ref_type_of_graph_def(graph_def)

Fix ref type of the graph_def.

neural_compressor.adaptor.tf_utils.util.strip_unused_nodes(graph_def, input_node_names, output_node_names)

Strip unused nodes of the graph_def.

The strip_unused_nodes pass is taken from tensorflow/python/tools/strip_unused_lib.py in the official TensorFlow r1.15 branch.
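
A sketch that combines get_input_output_node_names and strip_unused_nodes, assuming the former returns the lists of input and output node names; the model path is a placeholder.

    from neural_compressor.adaptor.tf_utils.util import (
        read_graph,
        get_input_output_node_names,
        strip_unused_nodes,
    )

    graph_def = read_graph('/path/to/frozen_model.pb')  # placeholder path

    # Auto-detect the graph boundaries, then drop nodes that do not
    # contribute to the outputs.
    input_names, output_names = get_input_output_node_names(graph_def)
    stripped_def = strip_unused_nodes(graph_def, input_names, output_names)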

neural_compressor.adaptor.tf_utils.util.strip_equivalent_nodes(graph_def, output_node_names)

Strip nodes with the same input and attr.

neural_compressor.adaptor.tf_utils.util.get_graph_def(model, outputs=[], auto_input_output=False)

Get the model’s graph_def.
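
A sketch of extracting a GraphDef from a model reference, assuming the utility accepts a frozen-graph path (it may also accept other model objects supported by neural_compressor); the path and output node name are placeholders.

    from neural_compressor.adaptor.tf_utils.util import get_graph_def

    # Placeholder path and output node name.
    graph_def = get_graph_def('/path/to/frozen_model.pb', outputs=['logits'])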

neural_compressor.adaptor.tf_utils.util.get_model_input_shape(model)

Get the input shape of the input model.

neural_compressor.adaptor.tf_utils.util.get_tensor_val_from_graph_node(graph_node_name_mapping, node_name)

Get the tensor value for given node name.

Parameters:
  • graph_node_name_mapping – key: node name, val: node

  • node_name – query node

Returns:

numpy array

Return type:

tensor_val
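
A sketch that builds the node-name mapping and reads a constant's value; the model path and node name are placeholders.

    from neural_compressor.adaptor.tf_utils.util import (
        read_graph,
        get_tensor_val_from_graph_node,
    )

    graph_def = read_graph('/path/to/frozen_model.pb')  # placeholder path

    # Build the name -> node mapping the helper expects.
    node_mapping = {node.name: node for node in graph_def.node}

    # 'conv1/weights' is a placeholder Const node name from the model.
    weight_val = get_tensor_val_from_graph_node(node_mapping, 'conv1/weights')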

neural_compressor.adaptor.tf_utils.util.int8_node_name_reverse(node)

Reverse int8 node name.

neural_compressor.adaptor.tf_utils.util.tf_diagnosis_helper(fp32_model, quan_model, tune_cfg, save_path)

TensorFlow diagnosis helper function.

neural_compressor.adaptor.tf_utils.util.generate_feed_dict(input_tensor, inputs)

Generate feed dict helper function.
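
A sketch of building a feed dict for a session run, assuming inputs may be passed as a dict keyed by input name; the exact accepted layouts are determined by the helper, and the shapes shown are illustrative.

    import numpy as np
    import tensorflow as tf
    from neural_compressor.adaptor.tf_utils.util import generate_feed_dict

    tf.compat.v1.disable_eager_execution()

    # One graph input and a matching batch of data (illustrative shapes).
    x = tf.compat.v1.placeholder(tf.float32, shape=[None, 4], name='input')
    inputs = {'input': np.ones((2, 4), dtype=np.float32)}

    feed_dict = generate_feed_dict([x], inputs)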