neural_compressor.compression.callbacks
This is a module for the Component class.
The Component class is inherited by the ‘QuantizationAwareTrainingCallbacks’, ‘PruningCallbacks’ and ‘DistillationCallbacks’ classes.
Classes
BaseCallbacks | This is the base class of Neural Compressor Callbacks.
QuantizationAwareTrainingCallbacks | This is the class for callbacks of quantization-aware training.
PruningCallbacks | This is the class for callbacks of the pruning object.
DistillationCallbacks | Distillation class derived from the Component class.
Module Contents
- class neural_compressor.compression.callbacks.BaseCallbacks(conf=None, model=None)[source]
This is the base class of Neural Compressor Callbacks.
This class is inherited by the ‘QuantizationAwareTrainingCallbacks’, ‘PruningCallbacks’ and ‘DistillationCallbacks’ classes. This design is mainly for pruning/distillation/quantization-aware training. This class applies all hooks for ‘Quantization’, ‘Pruning’ and ‘Distillation’.
- class neural_compressor.compression.callbacks.QuantizationAwareTrainingCallbacks(conf=None, model=None, adaptor=None)[source]
This is the class for callbacks of quantization-aware training.
This design is mainly for Quantization-Aware Training. This class applies all hooks for Quantization-Aware Training.
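As an illustration of how these hooks are typically driven, here is a minimal sketch that assumes the prepare_compression entry point and QuantizationAwareTrainingConfig from the wider Neural Compressor API (neither is defined on this page); the fine-tuning loop itself is elided.

```python
import torch
# Assumed entry points from the wider Neural Compressor API, not defined on this page.
from neural_compressor.config import QuantizationAwareTrainingConfig
from neural_compressor.training import prepare_compression

model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2))

conf = QuantizationAwareTrainingConfig()
compression_manager = prepare_compression(model, conf)

compression_manager.callbacks.on_train_begin()   # applies the QAT hooks (fake quantization)
model = compression_manager.model
# ... run the usual fine-tuning loop on `model` here ...
compression_manager.callbacks.on_train_end()     # finalizes the quantized model
```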
- class neural_compressor.compression.callbacks.PruningCallbacks(conf=None, model=None)[source]
This is the class for callbacks of the pruning object.
This class applies all hooks for pruning.
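For orientation, a hedged sketch of where the pruning hooks would fire inside a training loop; compression_manager, dataloader, model, criterion and optimizer are hypothetical placeholders, and the per-step hook names are assumed from the usual Neural Compressor callback convention rather than anything listed on this page.

```python
# Hedged sketch: all names here are placeholders; the hook names are assumed
# from the usual Neural Compressor callback convention.
compression_manager.callbacks.on_train_begin()
for epoch in range(num_epochs):
    for step, (inputs, labels) in enumerate(dataloader):
        compression_manager.callbacks.on_step_begin(step)
        loss = criterion(model(inputs), labels)
        loss.backward()
        compression_manager.callbacks.on_before_optimizer_step()  # pruning masks applied here (assumed)
        optimizer.step()
        compression_manager.callbacks.on_after_optimizer_step()
        optimizer.zero_grad()
        compression_manager.callbacks.on_step_end()
compression_manager.callbacks.on_train_end()
```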
- class neural_compressor.compression.callbacks.DistillationCallbacks(conf=None, model=None)[source]
Distillation class derived from the Component class.
The Distillation class abstracts the pipeline of knowledge distillation, transferring the knowledge of the teacher model to the student model.
- Parameters:
conf – Distillation_Conf containing the teacher model, the distillation criterion, etc.
model – Student model. It should be a Neural Compressor model.
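A hedged construction sketch using only the signature shown above; distillation_conf, student_model, dataloader and task_criterion are hypothetical placeholders, and the on_after_compute_loss hook (which blends the teacher's distillation loss into the student loss) is assumed from the usual Neural Compressor callback convention.

```python
from neural_compressor.compression.callbacks import DistillationCallbacks

# distillation_conf: a prepared Distillation_Conf (teacher model, criterion, ...);
# student_model: a Neural Compressor model wrapper; both are hypothetical placeholders.
callbacks = DistillationCallbacks(conf=distillation_conf, model=student_model)

callbacks.on_train_begin()                        # register teacher/student hooks (assumed)
for inputs, labels in dataloader:                 # hypothetical training data
    outputs = student_model.model(inputs)         # .model access is an assumption about the wrapper
    loss = task_criterion(outputs, labels)
    # Blend the knowledge-distillation loss into the task loss (assumed hook).
    loss = callbacks.on_after_compute_loss(inputs, outputs, loss)
    loss.backward()
callbacks.on_train_end()
```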