neural_compressor.compression.distillation.optimizers
Intel Neural Compressor built-in Optimizers on multiple framework backends.
Classes
TensorflowOptimizers | Class to get all registered TensorFlow Optimizers once only.
PyTorchOptimizers | Class to get all registered PyTorch Optimizers once only.
Optimizers | Main entry to get the specific type of optimizer.
TensorFlowSGD | TensorFlow Keras SGD optimizer.
TensorFlowAdamW | tensorflow_addons AdamW optimizer.
TensorFlowAdam | TensorFlow Adam optimizer.
PyTorchSGD | PyTorch SGD optimizer.
Functions
optimizer_registry | Class decorator used to register all Optimizer subclasses.
Module Contents
- class neural_compressor.compression.distillation.optimizers.TensorflowOptimizers[source]
Class to get all registered TensorFlow Optimizers once only.
- class neural_compressor.compression.distillation.optimizers.PyTorchOptimizers[source]
Class to get all registered PyTorch Optimizers once only.
- class neural_compressor.compression.distillation.optimizers.Optimizers(framework)[source]
Main entry to get the specific type of optimizer.
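A minimal usage sketch (the dict-style lookup and the param_dict key shown below are assumptions based on how the built-in wrappers are described, not a verbatim API reference):

    from neural_compressor.compression.distillation.optimizers import Optimizers

    # Select the framework-specific registry ('tensorflow' or 'pytorch'),
    # then look up a registered optimizer class by its type name.
    optimizers = Optimizers('pytorch')
    sgd_cls = optimizers['SGD']
    # Built-in wrappers are constructed from a user-supplied param_dict;
    # the 'learning_rate' key here is illustrative.
    sgd = sgd_cls({'learning_rate': 0.01})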
- neural_compressor.compression.distillation.optimizers.optimizer_registry(optimizer_type, framework)[source]
Class decorator used to register all Optimizer subclasses.
Cross-framework optimizers are supported by passing the framework parameter as framework='tensorflow, pytorch'.
- Parameters:
optimizer_type (str) – The string name of the supported optimizer type.
framework (str) – The string of supported framework.
- Returns:
The class being registered.
- Return type:
cls
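For illustration, a hedged sketch of how a custom optimizer wrapper might be registered (the decorator name and arguments come from this module; the wrapper class itself is hypothetical and modeled on the built-in wrappers, which accept a user param_dict at construction time):

    import torch
    from neural_compressor.compression.distillation.optimizers import optimizer_registry

    # Register a hypothetical wrapper under the type name 'MyAdam'
    # for the PyTorch backend.
    @optimizer_registry('MyAdam', 'pytorch')
    class MyAdamOptimizer(object):
        def __init__(self, param_dict):
            # param_dict is supplied by the user; the key name is an assumption.
            self._lr = param_dict.get('learning_rate', 1e-3)

        def __call__(self, model_parameters):
            # Hypothetical design: build the underlying torch optimizer on demand.
            return torch.optim.Adam(model_parameters, lr=self._lr)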
- class neural_compressor.compression.distillation.optimizers.TensorFlowSGD(param_dict)[source]
TensorFlow Keras SGD optimizer.
- Parameters:
param_dict (dict) – The dict of parameters set by the user for the SGD optimizer.
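An illustrative param_dict (the keys are assumptions mirroring tf.keras.optimizers.SGD arguments and may not be exhaustive):

    # Hypothetical user configuration for the SGD wrapper.
    sgd_params = {
        'learning_rate': 0.01,
        'momentum': 0.9,
        'nesterov': True,
    }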
- class neural_compressor.compression.distillation.optimizers.TensorFlowAdamW(param_dict)[source]
tensorflow_addons AdamW optimizer.
- Parameters:
param_dict (dict) – The dict of parameters set by the user for the AdamW optimizer.
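An illustrative param_dict (the keys are assumptions mirroring tensorflow_addons' tfa.optimizers.AdamW arguments, for which weight_decay is a required argument):

    # Hypothetical user configuration for the AdamW wrapper.
    adamw_params = {
        'weight_decay': 1e-4,
        'learning_rate': 1e-3,
        'beta_1': 0.9,
        'beta_2': 0.999,
    }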