neural_compressor.experimental.pytorch_pruner.pruner¶
pruner module.
Module Contents¶
Classes¶
Pruner: Pruning Pruner.
MagnitudePruner: Pruning Pruner.
SnipPruner: Pruning Pruner.
SnipMomentumPruner: Pruning Pruner.
PatternLockPruner: Pruning Pruner.
Functions¶
register_pruners: Class decorator to register a Pruner subclass to the registry.
get_pruner: Get registered pruner class.
- neural_compressor.experimental.pytorch_pruner.pruner.register_pruners(name)¶
Class decorator to register a Pruner subclass to the registry.
Decorator function used before a Pruner subclass. Ensures that the Pruner subclass decorated by this function is registered in PRUNERS.
- Parameters:
cls (class) – The class to be registered.
name – A string. Defines the pruner type.
- Returns:
The registered class.
- Return type:
cls
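For illustration, a class-decorator registry of this kind is typically a module-level dict keyed by name; the sketch below shows that pattern under that assumption (only the names register_pruners and PRUNERS come from this documentation, the body is not the library's actual source):

    PRUNERS = {}

    def register_pruners(name):
        """Store the decorated Pruner subclass in PRUNERS under `name`."""
        def register(cls):
            PRUNERS[name] = cls
            return cls
        return register

    @register_pruners("my_pruner")       # hypothetical pruner type name
    class MyPruner:
        def __init__(self, modules, config):
            self.modules, self.config = modules, config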
- neural_compressor.experimental.pytorch_pruner.pruner.get_pruner(modules, config)¶
Get registered pruner class.
Get a Pruner object from PRUNERS.
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- Returns:
A Pruner object.
- Raises:
AssertionError – Currently only pruners that have been registered in PRUNERS are supported.
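A hedged usage sketch follows; the call shape matches the signature above, but the config keys shown (prune_type, target_sparsity, start_step, end_step) are assumptions for illustration, and the real API may expect the library's own config object rather than a plain dict:

    import torch
    from neural_compressor.experimental.pytorch_pruner.pruner import get_pruner

    layer = torch.nn.Linear(16, 8)
    modules = {"fc": layer}               # {"module_name": module whose weight is pruned}
    config = {                            # keys below are assumptions, not confirmed defaults
        "prune_type": "magnitude",
        "target_sparsity": 0.9,
        "start_step": 0,
        "end_step": 1000,
    }
    pruner = get_pruner(modules, config)  # raises AssertionError for unregistered pruner types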
- class neural_compressor.experimental.pytorch_pruner.pruner.Pruner(modules, config)¶
Pruning Pruner.
The class which executes the pruning process. 1. Defines the pruning functions called at step begin/end and epoch begin/end. 2. Defines the pruning criteria.
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- modules¶
A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
- config¶
A config dict object. Contains the pruner information.
- masks¶
A dict {“module_name”: Tensor}. Store the masks for modules’ weights.
- scores¶
A dict {“module_name”: Tensor}. Stores the scores for modules’ weights, which are used to decide which parts to prune according to a criterion.
- pattern¶
A Pattern object. Defined in ./patterns.py
- scheduler¶
A scheduler object. Defined in ./scheduler.py
- current_sparsity_ratio¶
A float. The model’s current sparsity ratio, initialized to zero.
- global_step¶
An integer. The total number of steps the model has run.
- start_step¶
An integer. The step at which the pruning process starts.
- end_step¶
An integer. The step at which the pruning process ends.
- update_frequency_on_step¶
An integer. The pruning frequency, which is valid when iterative pruning is enabled.
- target_sparsity_ratio¶
A float. The final sparsity after pruning.
- max_sparsity_ratio_per_layer¶
A float. The maximum sparsity ratio allowed for each module.
- on_epoch_begin(epoch)¶
Functions called at the beginning of each epoch.
- mask_weights()¶
Functions called when masks are applied on corresponding modules’ weights.
Weights are multiplied by masks. This is the formal pruning process.
- on_step_begin(local_step)¶
Functions called at the beginning of each step.
Determine whether the current step should execute a pruning process. If so, use the scores and criteria to update the masks and prune the model; otherwise, simply train the model with its original structure.
- on_step_end()¶
Functions called at the end of each step.
- on_epoch_end()¶
Functions called at the end of each epoch.
- on_before_optimizer_step()¶
Functions called before the optimizer.step().
- on_after_optimizer_step()¶
Functions called after the optimizer.step().
Prune the model after optimization.
- on_train_begin(dataloader=None)¶
Functions called at the beginning of training.
- on_train_end()¶
Functions called at the end of training.
- on_before_eval()¶
Functions called at the beginning of evaluation.
- on_after_eval()¶
Functions called at the end of evaluation.
- check_is_pruned_step(step)¶
Decide whether the current step should execute a pruning process.
- update_scores()¶
Update self.scores.
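To show where these hooks sit in an ordinary PyTorch training loop, here is a minimal sketch. Only the hook names come from the documented API; the toy model and data are placeholders, and a no-op stand-in is used for the pruner so the sketch runs on its own (in real use the pruner comes from get_pruner()):

    import torch
    from torch import nn

    class _NoOpPruner:
        """Stand-in exposing the documented hook names; replace with a real Pruner."""
        def on_train_begin(self, dataloader=None): pass
        def on_epoch_begin(self, epoch): pass
        def on_step_begin(self, step): pass
        def on_before_optimizer_step(self): pass
        def on_after_optimizer_step(self): pass
        def on_step_end(self): pass
        def on_epoch_end(self): pass
        def on_train_end(self): pass

    model = nn.Linear(16, 2)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    criterion = nn.CrossEntropyLoss()
    data = [(torch.randn(4, 16), torch.randint(0, 2, (4,))) for _ in range(8)]
    pruner = _NoOpPruner()

    pruner.on_train_begin()
    for epoch in range(2):
        pruner.on_epoch_begin(epoch)
        for step, (x, y) in enumerate(data):
            pruner.on_step_begin(step)            # masks may be updated if this is a pruning step
            loss = criterion(model(x), y)
            loss.backward()
            pruner.on_before_optimizer_step()
            optimizer.step()
            pruner.on_after_optimizer_step()      # model is pruned again after the weight update
            optimizer.zero_grad()
            pruner.on_step_end()
        pruner.on_epoch_end()
    pruner.on_train_end()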
- class neural_compressor.experimental.pytorch_pruner.pruner.MagnitudePruner(modules, config)¶
Bases: Pruner
Pruning Pruner.
A Pruner class derived from Pruner. In this pruner, the scores are calculated based on weights.
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- Inherits from the parent class Pruner.
- update_scores()¶
Update self.scores.
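The magnitude criterion can be pictured as scoring each weight by its absolute value, so the smallest-magnitude weights are masked first; the following is a minimal sketch of that idea, not the class's actual update_scores() code:

    import torch

    def magnitude_scores(modules):
        # Score each module's weight by its absolute value (larger score = keep).
        return {name: mod.weight.detach().abs() for name, mod in modules.items()}

    layer = torch.nn.Linear(4, 4)
    scores = magnitude_scores({"fc": layer})
    # Illustration: mask the 8 smallest of the 16 weights.
    threshold = scores["fc"].flatten().kthvalue(8).values
    mask = (scores["fc"] > threshold).float()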
- class neural_compressor.experimental.pytorch_pruner.pruner.SnipPruner(modules, config)¶
Bases: Pruner
Pruning Pruner.
A Pruner class derived from Pruner. In this pruner, the scores are calculated based on SNIP. Please refer to SNIP: Single-shot Network Pruning based on Connection Sensitivity (https://arxiv.org/abs/1810.02340)
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- Inherits from the parent class Pruner.
- on_after_optimizer_step()¶
Functions called after the optimizer.step().
Prune the model after optimization and update the scores based on weights and gradients.
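The SNIP connection-sensitivity score can be sketched as the absolute value of weight times gradient, computed after a backward pass; the snippet below illustrates that computation and does not reproduce the class's on_after_optimizer_step() internals:

    import torch
    from torch import nn

    layer = nn.Linear(8, 2)
    x, y = torch.randn(4, 8), torch.randint(0, 2, (4,))
    loss = nn.functional.cross_entropy(layer(x), y)
    loss.backward()
    # SNIP-style sensitivity: |weight * gradient| per connection.
    snip_score = (layer.weight * layer.weight.grad).abs().detach()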
- class neural_compressor.experimental.pytorch_pruner.pruner.SnipMomentumPruner(modules, config)¶
Bases: Pruner
Pruning Pruner.
A Pruner class derived from Pruner. In this pruner, the scores are calculated based on SNIP. Moreover, the score map is updated with a momentum-like process.
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- Inherits from the parent class Pruner.
- on_after_optimizer_step()¶
Functions called after the optimizer.step().
Prune the model after optimization and update the scores based on weights and gradients.
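The momentum-like update of the score map can be sketched as an exponential moving average of per-step SNIP scores; the coefficient and exact formula below are assumptions for illustration only:

    import torch

    def update_score_map(score_map, snip_score, momentum=0.9):
        # Blend the running score map with the latest |weight * grad| observation.
        return momentum * score_map + (1.0 - momentum) * snip_score

    score_map = torch.zeros(2, 8)
    for _ in range(3):                       # pretend three training steps
        score_map = update_score_map(score_map, torch.rand(2, 8))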
- class neural_compressor.experimental.pytorch_pruner.pruner.PatternLockPruner(modules, config)¶
Bases: Pruner
Pruning Pruner.
A Pruner class derived from Pruner. In this pruner, the original model’s sparsity pattern is fixed during training. This pruner is useful when you want to train a sparse model without changing its original sparsity structure.
- Parameters:
modules – A dict {“module_name”: Tensor}. Store the pruning modules’ weights.
config – A config dict object. Contains the pruner information.
- Inherits from the parent class Pruner.
- on_step_begin(local_step)¶
Functions called at the beginning of each step.
- on_after_optimizer_step()¶
Functions called after the optimizer.step().
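The pattern-lock behaviour can be sketched as capturing the existing zero pattern once and re-applying it after every optimizer step, so training cannot regrow pruned weights; the snippet below illustrates that idea and is not the class's actual implementation:

    import torch
    from torch import nn

    layer = nn.Linear(8, 4)
    with torch.no_grad():
        layer.weight[layer.weight.abs() < 0.1] = 0.0   # pretend the model is already sparse
    mask = (layer.weight != 0).float()                 # locked sparsity pattern

    optimizer = torch.optim.SGD(layer.parameters(), lr=0.1)
    loss = layer(torch.randn(2, 8)).sum()
    loss.backward()
    optimizer.step()
    with torch.no_grad():
        layer.weight.mul_(mask)                        # re-apply the locked pattern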