neural_compressor.compression.distillation.criterions
Initialize criterion classes.
- Classes include:
TensorFlowCrossEntropyLoss, PyTorchCrossEntropyLoss, TensorFlowSparseCategoricalCrossentropy, TensorflowKnowledgeDistillationLoss, PyTorchKnowledgeDistillationLoss, PyTorchIntermediateLayersKnowledgeDistillationLoss.
Classes
TensorflowCriterions | Record criterions in the TensorflowCriterions class.
PyTorchCriterions | Record criterions in the PyTorchCriterions class.
Criterions | Integrate criterions of different frameworks.
TensorFlowCrossEntropyLoss | TensorFlow CrossEntropyLoss criterion.
TensorFlowSparseCategoricalCrossentropy | TensorFlow SparseCategoricalCrossentropyLoss criterion.
PyTorchCrossEntropyLoss | PyTorch CrossEntropyLoss criterion.
KnowledgeDistillationFramework | Knowledge Distillation Framework.
KnowledgeDistillationLoss | Base class for knowledge distillation losses.
PyTorchKnowledgeDistillationLoss | The PyTorchKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
PyTorchKnowledgeDistillationLossWrapper | PyTorchKnowledgeDistillationLossWrapper wraps PyTorchKnowledgeDistillationLoss.
TensorflowKnowledgeDistillationLoss | The TensorflowKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
TensorflowKnowledgeDistillationLossWrapper | TensorflowKnowledgeDistillationLossWrapper wraps TensorflowKnowledgeDistillationLoss.
TensorflowKnowledgeDistillationLossExternal | TensorflowKnowledgeDistillationLossExternal inherits from KnowledgeDistillationLoss.
IntermediateLayersKnowledgeDistillationLoss | The IntermediateLayersKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
PyTorchIntermediateLayersKnowledgeDistillationLoss | PyTorch Intermediate Layers Knowledge Distillation Loss.
PyTorchIntermediateLayersKnowledgeDistillationLossWrapper | PyTorch Intermediate Layers Knowledge Distillation Loss Wrapper.
SelfKnowledgeDistillationLoss | SelfKnowledge Distillation Loss.
PyTorchSelfKnowledgeDistillationLoss | PyTorch SelfKnowledge Distillation Loss.
PyTorchSelfKnowledgeDistillationLossWrapper | PyTorch SelfKnowledge Distillation Loss Wrapper.
Functions
criterion_registry | Used to register criterion classes in registry_criterions.
Module Contents
- class neural_compressor.compression.distillation.criterions.TensorflowCriterions[source]
Record criterions in the TensorflowCriterions class.
- class neural_compressor.compression.distillation.criterions.PyTorchCriterions[source]
Record criterions in the PyTorchCriterions class.
- class neural_compressor.compression.distillation.criterions.Criterions(framework)[source]
Integrate criterions of different frameworks.
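A minimal usage sketch, assuming Criterions supports dictionary-style lookup of a registered criterion class by its type string; the 'CrossEntropyLoss' key and the param_dict contents below are illustrative:

```python
from neural_compressor.compression.distillation.criterions import Criterions

# Collect the criterions registered for the PyTorch framework.
criterions = Criterions('pytorch')

# Assumption: dictionary-style lookup by the registered criterion type string
# returns the criterion class, which is then constructed from a parameter dict.
criterion_cls = criterions['CrossEntropyLoss']
criterion = criterion_cls({'reduction': 'mean'})
```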
- neural_compressor.compression.distillation.criterions.criterion_registry(criterion_type, framework)[source]
Used to register criterion classes in registry_criterions.
- Parameters:
criterion_type (str) – The string of supported criterion.
framework (str) – The string of supported framework.
- Returns:
The registered criterion class.
- Return type:
cls
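Since criterion_registry returns the registered class, it can be used as a class decorator. A hedged sketch; the class and the 'MyCustomLoss' type string below are hypothetical:

```python
from neural_compressor.compression.distillation.criterions import criterion_registry

# Hypothetical custom criterion registered for the PyTorch framework under
# the type string 'MyCustomLoss'.
@criterion_registry('MyCustomLoss', 'pytorch')
class MyCustomLoss:
    def __init__(self, param_dict):
        self._param_dict = param_dict
```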
- class neural_compressor.compression.distillation.criterions.TensorFlowCrossEntropyLoss(param_dict)[source]
TensorFlow CrossEntropyLoss criterion.
- class neural_compressor.compression.distillation.criterions.TensorFlowSparseCategoricalCrossentropy(param_dict)[source]
TensorFlow SparseCategoricalCrossentropyLoss criterion.
- class neural_compressor.compression.distillation.criterions.PyTorchCrossEntropyLoss(param_dict)[source]
PyTorch CrossEntropyLoss criterion.
- class neural_compressor.compression.distillation.criterions.KnowledgeDistillationFramework(student_model=None, teacher_model=None)[source]
Knowledge Distillation Framework.
- class neural_compressor.compression.distillation.criterions.KnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]
Base class for knowledge distillation losses.
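For orientation, a self-contained PyTorch sketch of the standard temperature-scaled knowledge distillation objective that the temperature, loss_types and loss_weights arguments parameterize; this is the generic Hinton-style formulation, not a copy of the library's implementation:

```python
import torch.nn.functional as F

def kd_loss_sketch(student_logits, teacher_logits, labels,
                   temperature=1.0, loss_weights=(0.5, 0.5)):
    """Generic KD objective: weighted sum of the hard-label cross entropy
    and the temperature-scaled soft loss against the teacher."""
    hard_loss = F.cross_entropy(student_logits, labels)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_loss = F.kl_div(soft_student, soft_teacher,
                         reduction='batchmean') * temperature ** 2
    return loss_weights[0] * hard_loss + loss_weights[1] * soft_loss
```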
- class neural_compressor.compression.distillation.criterions.PyTorchKnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]
The PyTorchKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
- class neural_compressor.compression.distillation.criterions.PyTorchKnowledgeDistillationLossWrapper(param_dict)[source]
PyTorchKnowledgeDistillationLossWrapper wraps PyTorchKnowledgeDistillationLoss.
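The wrapper is built from a single parameter dict. A sketch assuming the keys mirror the PyTorchKnowledgeDistillationLoss signature above; the key names are an assumption, and the student and teacher models are supplied separately by the distillation flow:

```python
from neural_compressor.compression.distillation.criterions import (
    PyTorchKnowledgeDistillationLossWrapper,
)

# Assumed keys, mirroring the loss signature: temperature, loss_types, loss_weights.
param_dict = {
    'temperature': 2.0,
    'loss_types': ['CE', 'CE'],
    'loss_weights': [0.5, 0.5],
}
kd_criterion_builder = PyTorchKnowledgeDistillationLossWrapper(param_dict)
```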
- class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]
The TensorflowKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
- class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLossWrapper(param_dict)[source]
TensorflowKnowledgeDistillationLossWrapper wraps TensorflowKnowledgeDistillationLoss.
- class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLossExternal(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]
TensorflowKnowledgeDistillationLossExternal inherits from KnowledgeDistillationLoss.
- class neural_compressor.compression.distillation.criterions.IntermediateLayersKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, add_origin_loss=False, student_model=None, teacher_model=None)[source]
The IntermediateLayersKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
- class neural_compressor.compression.distillation.criterions.PyTorchIntermediateLayersKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, add_origin_loss=False, student_model=None, teacher_model=None)[source]
PyTorch Intermediate Layers Knowledge Distillation Loss.
- class neural_compressor.compression.distillation.criterions.PyTorchIntermediateLayersKnowledgeDistillationLossWrapper(param_dict)[source]
PyTorch Intermediate Layers Knowledge Distillation Loss Wrapper.
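A sketch of constructing this wrapper from a parameter dict, assuming the keys mirror the IntermediateLayersKnowledgeDistillationLoss signature above; the layer names and the pairing format (student layer, teacher layer) are illustrative assumptions:

```python
from neural_compressor.compression.distillation.criterions import (
    PyTorchIntermediateLayersKnowledgeDistillationLossWrapper,
)

# Hypothetical layer names: each mapping pairs a student layer with the
# teacher layer whose intermediate output it should match.
param_dict = {
    'layer_mappings': [['layer1.0', 'layer1.0'],
                       ['layer2.0', 'layer2.0']],
    'loss_types': ['MSE', 'MSE'],
    'loss_weights': [0.5, 0.5],
    'add_origin_loss': True,
}
criterion_builder = PyTorchIntermediateLayersKnowledgeDistillationLossWrapper(param_dict)
```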
- class neural_compressor.compression.distillation.criterions.SelfKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, temperature=1.0, add_origin_loss=False, student_model=None, teacher_model=None)[source]
SelfKnowledge Distillation Loss.