tlt.models.tf_model.TFModel

class tlt.models.tf_model.TFModel(model_name: str, framework: FrameworkType, use_case: UseCaseType)[source]

Base class to represent a TensorFlow pretrained model.

__init__(model_name: str, framework: FrameworkType, use_case: UseCaseType)[source]

Class constructor
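
In practice TFModel is a base class, so scripts rarely construct it directly; concrete TensorFlow models subclass it or are returned by a factory helper. The minimal sketch below only shows how the constructor arguments line up with the FrameworkType and UseCaseType enums; the tlt.utils.types import path and the enum member names are assumptions, not taken from this page.

    # Minimal constructor sketch. The enum import path and member names below
    # are assumptions; concrete TFModel subclasses normally call this for you.
    from tlt.models.tf_model import TFModel
    from tlt.utils.types import FrameworkType, UseCaseType  # assumed location of the enums

    model = TFModel(
        model_name="resnet_v1_50",                  # any supported model name string
        framework=FrameworkType.TENSORFLOW,         # assumed enum member name
        use_case=UseCaseType.IMAGE_CLASSIFICATION,  # assumed enum member name
    )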

Methods

__init__(model_name, framework, use_case)

Class constructor

benchmark(dataset[, saved_model_dir, ...])

Use Intel Neural Compressor to benchmark the model with the dataset argument.

cleanup_saved_objects_for_distributed()

Cleans up the objects that were saved for the distributed script by export_for_distributed().

evaluate(dataset)

Evaluate the model using the specified dataset.

export(output_dir)

Exports a trained model as a saved_model.pb file.

export_for_distributed([export_dir, ...])

Exports the model, optimizer, loss, training data, and validation data to the export_dir for the distributed script to access.

load_from_directory(model_dir)

Loads a saved model from the specified directory.

optimize_graph(output_dir[, overwrite_model])

Performs FP32 graph optimization on the model using Intel Neural Compressor and writes the inference-optimized model to the output_dir.

quantize(output_dir, dataset[, config, ...])

Performs post-training quantization on the model using Intel Neural Compressor and the given dataset.

set_auto_mixed_precision(...)

Enable auto mixed precision for training.

train(dataset, output_dir[, epochs, ...])

Train the model using the specified dataset (see the usage sketch after this table).
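
The sketch below shows how these methods are typically chained together for a transfer learning workflow. Only the TFModel method calls follow the signatures documented in this table; the model_factory and dataset_factory helpers, their argument names, and the dataset preprocessing calls are assumptions about the surrounding tlt API rather than facts from this page.

    # Hedged end-to-end sketch: train, evaluate, export, quantize, and benchmark
    # a TensorFlow model. The factory helpers and dataset preprocessing calls are
    # assumptions; the TFModel method calls follow the signatures listed above.
    import os

    from tlt.datasets import dataset_factory  # assumed helper module
    from tlt.models import model_factory      # assumed helper module

    output_dir = "/tmp/tlt_output"

    # Assumed factory call that returns a concrete TFModel subclass
    model = model_factory.get_model(model_name="resnet_v1_50", framework="tensorflow")

    # Assumed dataset factory call and preprocessing steps
    dataset = dataset_factory.get_dataset(
        dataset_dir="/tmp/data",
        use_case="image_classification",
        framework="tensorflow",
        dataset_name="tf_flowers",
        dataset_catalog="tf_datasets",
    )
    dataset.preprocess(image_size=224, batch_size=32)     # assumed preprocessing API
    dataset.shuffle_split(train_pct=0.75, val_pct=0.25)   # assumed split API

    # Documented TFModel methods: train, evaluate, and export a saved_model.pb
    model.train(dataset, output_dir=output_dir, epochs=1)
    metrics = model.evaluate(dataset)
    saved_model_dir = model.export(output_dir)

    # Post-training quantization, benchmarking, and FP32 graph optimization
    # with Intel Neural Compressor, per the method summaries above
    quant_dir = os.path.join(output_dir, "quantized")
    model.quantize(quant_dir, dataset)
    model.benchmark(dataset, saved_model_dir=quant_dir)
    model.optimize_graph(os.path.join(output_dir, "optimized"))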

Attributes

framework

Framework with which the model is compatible

learning_rate

Learning rate for the model

model_name

Name of the model

preprocessor

Preprocessor for the model

use_case

Use case (or category) to which the model belongs
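
A short sketch of reading these properties; the model variable is assumed to be a TFModel instance obtained as in the earlier workflow sketch, and the comments only restate the attribute summaries above.

    # Inspecting the TFModel attributes listed above; `model` is assumed to be
    # a TFModel instance created earlier (for example via a factory helper).
    print(model.model_name)     # name of the model
    print(model.framework)      # framework the model is compatible with
    print(model.use_case)       # use case (category) the model belongs to
    print(model.learning_rate)  # learning rate for the model
    print(model.preprocessor)   # preprocessor for the model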