neural_compressor.experimental.pytorch_pruner.prune_utils

Prune utils.

Module Contents

Functions

check_config(prune_config)

Check that the configuration's key-value pairs are valid for running a Pruning object.

reset_non_value_to_default(obj, key, default)

Fill in undefined configuration entries with default values.

process_and_check_config(val)

Convert an initial configuration object into a Pruning configuration.

process_config(config)

Obtain a config dict object from a config file.

parse_to_prune(model, config)

Keep the layers targeted for pruning.

parse_not_to_prune(modules, config)

Drop the layers that are excluded from pruning.

neural_compressor.experimental.pytorch_pruner.prune_utils.check_config(prune_config)[source]

Check that the configuration's key-value pairs are valid for running a Pruning object.

Parameters:

prune_config – A config dict object. Contains Pruning parameters and configurations.

Returns:

None if everything is correct.

Raises:

AssertionError – If a key or value in prune_config is invalid.
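
Example:

A minimal usage sketch. The configuration keys shown (target_sparsity, start_step, end_step) are illustrative assumptions, not an authoritative schema.

    from neural_compressor.experimental.pytorch_pruner.prune_utils import check_config

    # Hypothetical pruning configuration; the exact set of required keys
    # depends on the Pruning object being configured.
    prune_config = {
        "target_sparsity": 0.9,  # assumed key: overall sparsity goal
        "start_step": 0,         # assumed key: step at which pruning starts
        "end_step": 1000,        # assumed key: step at which pruning ends
    }

    try:
        check_config(prune_config)  # returns None when every key-value pair is valid
    except AssertionError as err:
        print(f"Invalid pruning configuration: {err}")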

neural_compressor.experimental.pytorch_pruner.prune_utils.reset_non_value_to_default(obj, key, default)[source]

Fill in undefined configuration entries with default values.

If a configuration item is not defined, set it to the given default value.

Parameters:
  • obj – A dict of {key: value} items.

  • key – A string. Key in obj.

  • default – The default value. When key is not present in obj, the item key: default is added to the original obj.
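
Example:

A small sketch of backing a missing entry with a default value; the exact mutation versus return behavior is assumed from the description above.

    from neural_compressor.experimental.pytorch_pruner.prune_utils import (
        reset_non_value_to_default,
    )

    cfg = {"target_sparsity": 0.9}  # "start_step" is deliberately left undefined

    # Resolve the undefined "start_step" entry to the provided default (0).
    start_step = reset_non_value_to_default(cfg, "start_step", 0)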

neural_compressor.experimental.pytorch_pruner.prune_utils.process_and_check_config(val)[source]

Convert an initial configuration object into a Pruning configuration.

Copy the defined parameters and add default values for undefined parameters to a new Pruning configuration object.

Parameters:

val – A dict directly read from a config file.

Returns:

A dict whose contents are regularized for a Pruning object.
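
Example:

A sketch that pairs this function with process_config (documented below); the file name is a placeholder for your own pruning configuration file.

    from neural_compressor.experimental.pytorch_pruner.prune_utils import (
        process_and_check_config,
        process_config,
    )

    # "prune_conf.yaml" is a placeholder path to a pruning configuration file.
    raw_config = process_config("prune_conf.yaml")

    # Regularize the raw dict into the form expected by a Pruning object.
    pruning_config = process_and_check_config(raw_config)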

neural_compressor.experimental.pytorch_pruner.prune_utils.process_config(config)[source]

Obtain a config dict object from a config file.

Parameters:

config – A string. The path to the configuration file.

Returns:

A config dict object.
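
Example:

A minimal sketch; the path is a placeholder, and it is assumed the file is a configuration file this helper can parse.

    from neural_compressor.experimental.pytorch_pruner.prune_utils import process_config

    # Placeholder path; point it at your own pruning configuration file.
    config = process_config("./conf/prune_conf.yaml")
    print(config.keys())  # a config dict object parsed from the file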

neural_compressor.experimental.pytorch_pruner.prune_utils.parse_to_prune(model, config)[source]

Keep the layers targeted for pruning.
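
Example:

A hedged sketch. The toy model is illustrative, and it is an assumption of this sketch that the regularized configuration can be iterated into per-pruner entries accepted by parse_to_prune.

    import torch.nn as nn

    from neural_compressor.experimental.pytorch_pruner.prune_utils import (
        parse_to_prune,
        process_and_check_config,
        process_config,
    )

    model = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))

    # Placeholder path; see process_config above.
    for pruner_config in process_and_check_config(process_config("prune_conf.yaml")):
        # Assumed to return the target layers (e.g. a {name: module} mapping).
        modules = parse_to_prune(model, pruner_config)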

neural_compressor.experimental.pytorch_pruner.prune_utils.parse_not_to_prune(modules, config)[source]

Drop the layers that are excluded from pruning.
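
Example:

A sketch that continues the parse_to_prune example above; whether the excluded layers are filtered in place or returned as a new mapping is an assumption noted in the comment.

    from neural_compressor.experimental.pytorch_pruner.prune_utils import parse_not_to_prune

    # `modules` is the {name: module} mapping from parse_to_prune and
    # `pruner_config` the same configuration entry (assumptions of this sketch).
    modules = parse_not_to_prune(modules, pruner_config)  # keep only layers to prune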