neural_compressor.adaptor.torch_utils.waq.utils

Module Contents

Functions

get_module(model, key)

Get module from model by key name.

set_module(model, key, new_module)

Set new module into model by key name.

reshape_in_channel_to_last(layer_name, model)

Move the input channel to the last dim.

reshape_scale_as_input(layer, scale)

Reshape the scale to match the input feature's channel dimension.

reshape_scale_as_weight(layer, scale)

Reshape the scale to match the weight's input channel (output channel for depthwise layers).

register_autotune(name)

Class decorator to register a smoothquant auto-tune subclass.

neural_compressor.adaptor.torch_utils.waq.utils.get_module(model, key)[source]

Get module from model by key name.

Parameters:
  • model (torch.nn.Module) – original model

  • key (str) – name of the module to retrieve
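
A minimal usage sketch (the TinyNet model below is illustrative), assuming keys are the dotted submodule names reported by model.named_modules():

    import torch
    from neural_compressor.adaptor.torch_utils.waq.utils import get_module

    class TinyNet(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.block = torch.nn.Sequential(torch.nn.Linear(4, 4), torch.nn.ReLU())
            self.head = torch.nn.Linear(4, 2)

    model = TinyNet()
    head = get_module(model, "head")             # top-level submodule
    first_linear = get_module(model, "block.0")  # nested submodule via a dotted key
    print(type(first_linear))                    # expected: torch.nn.Linear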

neural_compressor.adaptor.torch_utils.waq.utils.set_module(model, key, new_module)[source]

Set new module into model by key name.

Parameters:
  • model (torch.nn.Module) – original model

  • key (str) – module name to be replaced

  • new_module (torch.nn.Module) – new module to be inserted
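
A minimal usage sketch, assuming the same dotted key convention; the toy model and the replacement layer are illustrative:

    import torch
    from neural_compressor.adaptor.torch_utils.waq.utils import get_module, set_module

    class TinyNet(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.head = torch.nn.Linear(4, 2)

    model = TinyNet()
    # Replace the head with a wider layer under the same key, then read it back.
    set_module(model, "head", torch.nn.Linear(4, 8))
    print(get_module(model, "head").out_features)  # 8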

neural_compressor.adaptor.torch_utils.waq.utils.reshape_in_channel_to_last(layer_name, model)[source]

Move the input channel to the last dim.

Parameters:
  • layer_name (str) – layer name

  • model (torch.nn.Module) – original model

Returns:

The reshaped weight.
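
A minimal usage sketch (the model and layer name are illustrative); the exact shape of the returned tensor is an assumption that depends on the layer type:

    import torch
    from neural_compressor.adaptor.torch_utils.waq.utils import reshape_in_channel_to_last

    class TinyNet(torch.nn.Module):
        def __init__(self):
            super().__init__()
            self.fc = torch.nn.Linear(16, 32)

    model = TinyNet()
    # For a Linear layer the weight is (out_features, in_features), so the input
    # channel already sits in the last dimension; other layer types are rearranged
    # so that the input channel ends up last.
    weight = reshape_in_channel_to_last("fc", model)
    print(weight.shape)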

neural_compressor.adaptor.torch_utils.waq.utils.reshape_scale_as_input(layer, scale)[source]

Reshape the scale to match the input feature's channel dimension.

Parameters:
  • layer (torch.nn.Module) – torch module

  • scale (torch.Tensor) – original scale

Returns:

The reshaped scale.
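
A minimal usage sketch; the layer, the per-channel scale, and the final divide are illustrative (whether activations are divided or multiplied by the scale depends on the smoothing convention):

    import torch
    from neural_compressor.adaptor.torch_utils.waq.utils import reshape_scale_as_input

    layer = torch.nn.Linear(16, 32)
    scale = torch.rand(16)  # one smoothing scale per input channel
    # Reshape the 1-D scale so it broadcasts over the layer's input activation.
    input_scale = reshape_scale_as_input(layer, scale)
    x = torch.rand(8, 16)
    smoothed_x = x / input_scale  # broadcasts per input channel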

neural_compressor.adaptor.torch_utils.waq.utils.reshape_scale_as_weight(layer, scale)[source]

Reshape the scale to match the weight's input channel (output channel for depthwise layers).

Parameters:
  • layer (torch.nn.Module) – torch module

  • scale (torch.Tensor) – original scale

Returns:

The reshaped scale.
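
A minimal usage sketch; the Linear layer and the final multiply are illustrative, and the broadcast shape mentioned in the comment is an assumption:

    import torch
    from neural_compressor.adaptor.torch_utils.waq.utils import reshape_scale_as_weight

    layer = torch.nn.Linear(16, 32)
    scale = torch.rand(16)  # one smoothing scale per weight input channel
    # Reshape the 1-D scale so it broadcasts over the weight tensor,
    # e.g. (1, 16) against a Linear weight of shape (32, 16).
    weight_scale = reshape_scale_as_weight(layer, scale)
    scaled_weight = layer.weight.data * weight_scale  # fold the smoothing scale into the weight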

neural_compressor.adaptor.torch_utils.waq.utils.register_autotune(name)[source]

Class decorator to register a smoothquant auto-tune subclass.

Returns:

The registered class.
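
A minimal sketch of the decorator usage; the registration name and the class body are hypothetical, and a real subclass would implement the tuning interface expected by the smoothquant auto-tune flow:

    from neural_compressor.adaptor.torch_utils.waq.utils import register_autotune

    # Register a custom smoothquant auto-tune strategy under a chosen name.
    # "my_strategy" and the empty class body are placeholders only.
    @register_autotune("my_strategy")
    class MyAutoTune:
        pass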