neural_compressor.compression.distillation.criterions

Initialize criterion classes.

Classes include:

TensorFlowCrossEntropyLoss, PyTorchCrossEntropyLoss, TensorFlowSparseCategoricalCrossentropy, TensorflowKnowledgeDistillationLoss, PyTorchKnowledgeDistillationLoss, PyTorchIntermediateLayersKnowledgeDistillationLoss.

Classes

TensorflowCriterions

Record criterions in the TensorflowCriterions class.

PyTorchCriterions

Record criterions in the PyTorchCriterions class.

Criterions

Integrate criterions of different frameworks.

TensorFlowCrossEntropyLoss

TensorFlow CrossEntropyLoss criterion.

TensorFlowSparseCategoricalCrossentropy

TensorFlow SparseCategoricalCrossentropyLoss criterion.

PyTorchCrossEntropyLoss

PyTorch CrossEntropyLoss criterion.

KnowledgeDistillationFramework

Knowledge Distillation Framework.

KnowledgeDistillationLoss

Initialize the KnowledgeDistillationLoss class.

PyTorchKnowledgeDistillationLoss

The PyTorchKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.

PyTorchKnowledgeDistillationLossWrapper

PyTorchKnowledgeDistillationLossWrapper wraps PyTorchKnowledgeDistillationLoss.

TensorflowKnowledgeDistillationLoss

The TensorflowKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.

TensorflowKnowledgeDistillationLossWrapper

TensorflowKnowledgeDistillationLossWrapper wraps TensorflowKnowledgeDistillationLoss.

TensorflowKnowledgeDistillationLossExternal

TensorflowKnowledgeDistillationLossExternal inherits from KnowledgeDistillationLoss.

IntermediateLayersKnowledgeDistillationLoss

The IntermediateLayersKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.

PyTorchIntermediateLayersKnowledgeDistillationLoss

PyTorch Intermediate Layers Knowledge Distillation Loss.

PyTorchIntermediateLayersKnowledgeDistillationLossWrapper

PyTorch Intermediate Layers Knowledge Distillation Loss Wrapper.

SelfKnowledgeDistillationLoss

Self-Knowledge Distillation Loss.

PyTorchSelfKnowledgeDistillationLoss

PyTorch Self-Knowledge Distillation Loss.

PyTorchSelfKnowledgeDistillationLossWrapper

PyTorch Self-Knowledge Distillation Loss Wrapper.

Functions

criterion_registry(criterion_type, framework)

Used to register criterion classes in registry_criterions.

Module Contents

class neural_compressor.compression.distillation.criterions.TensorflowCriterions[source]

Record criterions in the TensorflowCriterions class.

class neural_compressor.compression.distillation.criterions.PyTorchCriterions[source]

Record criterions in the PyTorchCriterions class.

class neural_compressor.compression.distillation.criterions.Criterions(framework)[source]

Integrate criterions of different frameworks.

neural_compressor.compression.distillation.criterions.criterion_registry(criterion_type, framework)[source]

Used to register criterion classes in registry_criterions.

Parameters:
  • criterion_type (str) – The string of supported criterion.

  • framework (str) – The string of supported framework.

Returns:

The registered criterion class.

Return type:

cls
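The registry mechanism pairs a criterion name with a framework so that criterions can later be looked up by framework string. A minimal, framework-free sketch of such a registry decorator (illustrative only, not the library's implementation; the dict and class names below are hypothetical):

```python
# Illustrative registry sketch: a {framework: {criterion_type: cls}}
# mapping populated by a decorator factory. Names are hypothetical.
registry_criterions = {"tensorflow": {}, "pytorch": {}}

def criterion_registry(criterion_type, framework):
    """Return a decorator that records cls under the given keys."""
    def decorator(cls):
        for fw in framework.split(","):
            fw = fw.strip()
            if fw not in registry_criterions:
                raise ValueError(f"Unsupported framework: {fw}")
            registry_criterions[fw][criterion_type.upper()] = cls
        return cls  # the decorated class is returned unchanged
    return decorator

@criterion_registry("CrossEntropyLoss", "pytorch")
class MyCrossEntropyLoss:
    """A hypothetical criterion registered for the pytorch backend."""
```

Registering by framework string is what lets the Criterions class dispatch to the right backend at runtime.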

class neural_compressor.compression.distillation.criterions.TensorFlowCrossEntropyLoss(param_dict)[source]

TensorFlow CrossEntropyLoss criterion.

class neural_compressor.compression.distillation.criterions.TensorFlowSparseCategoricalCrossentropy(param_dict)[source]

TensorFlow SparseCategoricalCrossentropyLoss criterion.

class neural_compressor.compression.distillation.criterions.PyTorchCrossEntropyLoss(param_dict)[source]

PyTorch CrossEntropyLoss criterion.

class neural_compressor.compression.distillation.criterions.KnowledgeDistillationFramework(student_model=None, teacher_model=None)[source]

Knowledge Distillation Framework.

class neural_compressor.compression.distillation.criterions.KnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]

Initialize the KnowledgeDistillationLoss class.

class neural_compressor.compression.distillation.criterions.PyTorchKnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]

The PyTorchKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.
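The knowledge-distillation classes above combine a hard-label loss on the student logits with a soft-label loss against the teacher's temperature-softened distribution, weighted by loss_weights. A minimal pure-Python sketch of that standard objective (the textbook formulation, not the library's code):

```python
import math

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(student_logits, teacher_logits, label,
            temperature=1.0, loss_weights=(0.5, 0.5)):
    """Weighted sum of hard CE (vs. the true label) and soft CE
    (vs. the teacher's temperature-softened distribution)."""
    # Hard term: ordinary cross-entropy against the ground-truth label.
    hard_ce = -math.log(softmax(student_logits)[label])
    # Soft term: cross-entropy between softened teacher and student.
    soft_teacher = softmax(teacher_logits, temperature)
    soft_student = softmax(student_logits, temperature)
    soft_ce = -sum(t * math.log(s)
                   for t, s in zip(soft_teacher, soft_student))
    return loss_weights[0] * hard_ce + loss_weights[1] * soft_ce
```

Raising the temperature flattens both distributions, which exposes the teacher's relative preferences among wrong classes; that is the extra signal distillation transfers to the student.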

class neural_compressor.compression.distillation.criterions.PyTorchKnowledgeDistillationLossWrapper(param_dict)[source]

PyTorchKnowledgeDistillationLossWrapper wraps PyTorchKnowledgeDistillationLoss.

class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLoss(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]

The TensorflowKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.

class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLossWrapper(param_dict)[source]

TensorflowKnowledgeDistillationLossWrapper wraps TensorflowKnowledgeDistillationLoss.

class neural_compressor.compression.distillation.criterions.TensorflowKnowledgeDistillationLossExternal(temperature=1.0, loss_types=['CE', 'CE'], loss_weights=[0.5, 0.5], student_model=None, teacher_model=None)[source]

TensorflowKnowledgeDistillationLossExternal inherits from KnowledgeDistillationLoss.

class neural_compressor.compression.distillation.criterions.IntermediateLayersKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, add_origin_loss=False, student_model=None, teacher_model=None)[source]

The IntermediateLayersKnowledgeDistillationLoss class inherits from KnowledgeDistillationLoss.

class neural_compressor.compression.distillation.criterions.PyTorchIntermediateLayersKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, add_origin_loss=False, student_model=None, teacher_model=None)[source]

PyTorch Intermediate Layers Knowledge Distillation Loss.
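Intermediate-layer distillation matches internal activations of student and teacher rather than final logits; layer_mappings pairs each student layer with the teacher layer it should mimic. A framework-free sketch of the per-pair loss accumulation, using MSE as the per-layer loss (the layer names and flat feature dicts below are hypothetical):

```python
def intermediate_kd_loss(student_feats, teacher_feats, layer_mappings,
                         loss_weights=None):
    """Weighted sum of per-pair mean-squared errors between features.

    student_feats/teacher_feats: dicts mapping layer name -> flat list
    of activations; layer_mappings: list of (student_layer,
    teacher_layer) name pairs.
    """
    if loss_weights is None:
        # Default to uniform weighting across mapped layer pairs.
        loss_weights = [1.0 / len(layer_mappings)] * len(layer_mappings)
    total = 0.0
    for (s_name, t_name), w in zip(layer_mappings, loss_weights):
        s, t = student_feats[s_name], teacher_feats[t_name]
        mse = sum((a - b) ** 2 for a, b in zip(s, t)) / len(s)
        total += w * mse
    return total
```

In practice the framework-specific class registers hooks on the named layers to capture these activations during the forward pass; the sketch only shows the accumulation step.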

class neural_compressor.compression.distillation.criterions.PyTorchIntermediateLayersKnowledgeDistillationLossWrapper(param_dict)[source]

PyTorch Intermediate Layers Knowledge Distillation Loss Wrapper.

class neural_compressor.compression.distillation.criterions.SelfKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, temperature=1.0, add_origin_loss=False, student_model=None, teacher_model=None)[source]

Self-Knowledge Distillation Loss.
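In self-distillation the model is its own teacher: layer_mappings pairs two heads of the same student, with a deeper head's temperature-softened output supervising a shallower one. A minimal illustrative sketch under that assumption (pure Python; head names are hypothetical):

```python
import math

def soft_ce(teacher_logits, student_logits, temperature=1.0):
    """Cross-entropy between temperature-softened distributions of
    two heads of the same network (the deeper head acts as teacher)."""
    def softmax(zs):
        exps = [math.exp(z / temperature) for z in zs]
        s = sum(exps)
        return [e / s for e in exps]
    t, p = softmax(teacher_logits), softmax(student_logits)
    return -sum(ti * math.log(pi) for ti, pi in zip(t, p))

def self_kd_loss(head_outputs, layer_mappings, temperature=1.0):
    """Average soft CE over (shallow_head, deep_head) name pairs."""
    losses = [soft_ce(head_outputs[deep], head_outputs[shallow],
                      temperature)
              for shallow, deep in layer_mappings]
    return sum(losses) / len(losses)
```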

class neural_compressor.compression.distillation.criterions.PyTorchSelfKnowledgeDistillationLoss(layer_mappings=[], loss_types=None, loss_weights=None, temperature=1.0, add_origin_loss=False, student_model=None, teacher_model=None)[source]

PyTorch Self-Knowledge Distillation Loss.

class neural_compressor.compression.distillation.criterions.PyTorchSelfKnowledgeDistillationLossWrapper(param_dict)[source]

PyTorch Self-Knowledge Distillation Loss Wrapper.