neural_compressor.compression.pruner
====================================

.. py:module:: neural_compressor.compression.pruner

.. autoapi-nested-parse::

   Pruning init.


Subpackages
-----------

.. toctree::
   :maxdepth: 1

   /autoapi/neural_compressor/compression/pruner/model_slim/index
   /autoapi/neural_compressor/compression/pruner/patterns/index
   /autoapi/neural_compressor/compression/pruner/pruners/index
   /autoapi/neural_compressor/compression/pruner/wanda/index


Submodules
----------

.. toctree::
   :maxdepth: 1

   /autoapi/neural_compressor/compression/pruner/criteria/index
   /autoapi/neural_compressor/compression/pruner/pruning/index
   /autoapi/neural_compressor/compression/pruner/regs/index
   /autoapi/neural_compressor/compression/pruner/schedulers/index
   /autoapi/neural_compressor/compression/pruner/tf_criteria/index
   /autoapi/neural_compressor/compression/pruner/utils/index


Functions
---------

.. autoapisummary::

   neural_compressor.compression.pruner.save
   neural_compressor.compression.pruner.prepare_pruning


Package Contents
----------------

.. py:function:: save(obj: object, f, pickle_module=None, pickle_protocol=None, _use_new_zipfile_serialization=None)

   A rewrite of ``torch.save`` used by the pruning package.

   :param obj: The object to be saved.
   :param f: A file-like object or a string/path naming the output file.
   :param pickle_module: The module used for pickling, as in ``torch.save``.
   :param pickle_protocol: The pickle protocol version to use.
   :param _use_new_zipfile_serialization: Whether to use the new zipfile serialization format.
   :return: None.


.. py:function:: prepare_pruning(model, config, optimizer=None, dataloader=None, loss_func=None, framework='pytorch', device: str = None)

   Get a registered pruning class and wrap the model and optimizer to support all pruning functionality.

   Retrieves a pruning object from PRUNINGS.

   :param model: The model to be pruned.
   :param config: A config dict object that contains the pruners' information.
   :param optimizer: An optional optimizer to be wrapped alongside the model.
   :param dataloader: An optional dataloader providing data for pruning.
   :param loss_func: An optional loss function.
   :param framework: The framework name; defaults to ``'pytorch'``.
   :param device: An optional device string.
   :returns: A pruning object.
   :raises AssertionError: Currently only prunings that have been registered in PRUNINGS are supported.
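
A minimal usage sketch of ``prepare_pruning`` and ``save`` is shown below. It assumes a ``WeightPruningConfig`` object from ``neural_compressor.config`` as the config argument and uses illustrative field values (``target_sparsity``, ``pattern``, ``start_step``, ``end_step``); adjust these to the installed version and to your own model and data.

.. code-block:: python

   import torch
   from torch.utils.data import DataLoader, TensorDataset

   # WeightPruningConfig is assumed here as the pruning config object.
   from neural_compressor.config import WeightPruningConfig
   from neural_compressor.compression.pruner import prepare_pruning, save

   # Toy model, optimizer, and dataloader standing in for a real training setup.
   model = torch.nn.Sequential(
       torch.nn.Linear(16, 32),
       torch.nn.ReLU(),
       torch.nn.Linear(32, 2),
   )
   optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
   dataset = TensorDataset(torch.randn(64, 16), torch.randint(0, 2, (64,)))
   dataloader = DataLoader(dataset, batch_size=8)

   # Illustrative pruning configuration; field values are placeholders.
   config = WeightPruningConfig(
       target_sparsity=0.5, pattern="4x1", start_step=0, end_step=10
   )

   # Wrap the model and optimizer so the pruning hooks can run during training.
   pruning = prepare_pruning(model, config, optimizer=optimizer, dataloader=dataloader)

   criterion = torch.nn.CrossEntropyLoss()
   for epoch in range(2):
       for inputs, labels in dataloader:
           optimizer.zero_grad()
           loss = criterion(model(inputs), labels)
           loss.backward()
           optimizer.step()  # pruning updates are expected to piggyback on optimizer steps

   # Persist the pruned model with this package's torch-save wrapper.
   save(model, "pruned_model.pt")

The sketch follows the signature documented above: the model and optimizer are handed to ``prepare_pruning`` before training, and the ordinary training loop is otherwise unchanged.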