neural_compressor.training

The configuration of the training loop.

Module Contents

Classes

CompressionManager

CompressionManager wraps the model and the callbacks that handle the additional compression work the user performs inside the training loop.

CallBacks

Define the basic callback hooks for the training loop.

Functions

fit(compression_manager, train_func[, eval_func, ...])

Compress the model with accuracy-aware tuning for quantization.

prepare_compression(model, confs, **kwargs)

Prepare the model for compression and return a CompressionManager.

class neural_compressor.training.CompressionManager(model: Callable, confs: Callable | List, **kwargs)[source]

CompressionManager wraps the model and the callbacks that handle the additional compression work the user performs inside the training loop.

Parameters:
  • model – A model to be compressed.

  • confs – An instance of QuantizationAwareTrainingConfig, PruningConfig, or DistillationConfig, or a list of these configs for orchestration optimization.

Examples:

from neural_compressor.training import prepare_compression
compression_manager = prepare_compression(model, confs)
compression_manager.callbacks.on_train_begin()
model = compression_manager.model
# train_loop:
for epoch in range(epochs):
    compression_manager.callbacks.on_epoch_begin(epoch)
    for i, (batch, label) in enumerate(dataloader):
        compression_manager.callbacks.on_step_begin(i)
        # ... user training code for the step, e.g., optimizer.zero_grad() ...
        output = model(batch)
        loss = criterion(output, label)  # user-defined loss function
        loss = compression_manager.callbacks.on_after_compute_loss(batch, output, loss)
        loss.backward()
        compression_manager.callbacks.on_before_optimizer_step()
        optimizer.step()
        compression_manager.callbacks.on_step_end()
    compression_manager.callbacks.on_epoch_end()
compression_manager.callbacks.on_train_end()
compression_manager.save("path_to_save")
neural_compressor.training.fit(compression_manager, train_func, eval_func=None, eval_dataloader=None, eval_metric=None, **kwargs)[source]

Compress the model with accuracy-aware tuning for quantization.

Parameters:
  • compression_manager (CompressionManager) – The compression manager that contains the model and callbacks.

  • train_func (function) – Training function for quantization-aware training. This function takes the model as its input parameter and executes the entire training process.

  • eval_func (function, optional) –

    The evaluation function provided by the user. This function takes the model as its parameter; the evaluation dataset and metrics should be encapsulated in the function implementation, and it should return a higher-is-better scalar accuracy value. The pseudo code should be something like:

    def eval_func(model):
        input, label = dataloader()
        output = model(input)
        accuracy = metric(output, label)
        return accuracy

  • eval_dataloader (generator, optional) – Data loader for evaluation. It is iterable and should yield a tuple of (input, label). The input can be an object, list, tuple, or dict, depending on the user's implementation, as long as it can be fed to the model as input. The label should be usable as input to the supported metrics. If this parameter is not None, the user needs to provide a pre-defined evaluation metric object and set "eval_func" to None; the tuner will then combine the model, eval_dataloader, and pre-defined metrics to run the evaluation process (see the sketch after the Examples below).

  • eval_metric (dict or obj) – A metric class or a dict of built-in metric configurations; neural_compressor will initialize this class during evaluation.

Returns:

An optimized model.

Examples:

from neural_compressor.training import fit, prepare_compression

compression_manager = prepare_compression(model, conf)

def train_func(model):
    compression_manager.callbacks.on_train_begin()
    for epoch in range(epochs):
        compression_manager.callbacks.on_epoch_begin(epoch)
        for i, (batch, label) in enumerate(dataloader):
            compression_manager.callbacks.on_step_begin(i)
            # ... user training code for the step, e.g., optimizer.zero_grad() ...
            output = model(batch)
            loss = criterion(output, label)  # user-defined loss function
            loss = compression_manager.callbacks.on_after_compute_loss(batch, output, loss)
            loss.backward()
            compression_manager.callbacks.on_before_optimizer_step()
            optimizer.step()
            compression_manager.callbacks.on_step_end()
        compression_manager.callbacks.on_epoch_end()
    compression_manager.callbacks.on_train_end()
    return model

def eval_func(model):
    for i, (batch, label) in enumerate(dataloader):
        output = model(batch)
        # accumulate accuracy with a user-defined metric object (e.g., top-1 accuracy)
        top1.update(output, label)
    return top1.result()

model = fit(compression_manager, train_func=train_func, eval_func=eval_func)
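
If eval_func is not provided, the tuner can run the evaluation itself from a dataloader and a metric, per the eval_dataloader and eval_metric parameters above. A minimal sketch, assuming a user-defined eval_loader that yields (input, label) tuples and a user-defined top1_metric object (both names are illustrative, not part of the API):

model = fit(
    compression_manager,
    train_func=train_func,
    eval_dataloader=eval_loader,  # iterable yielding (input, label) tuples
    eval_metric=top1_metric,      # metric object or dict of built-in metric configs
)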
neural_compressor.training.prepare_compression(model: Callable, confs: Callable | List, **kwargs)[source]

Prepare the model for compression and return a CompressionManager.

Parameters:
  • model (Callable) – The model to optimize.

  • confs (Union[Callable, List]) – An instance of QuantizationAwareTrainingConfig, PruningConfig, or DistillationConfig, or a list of these configs for orchestration optimization.

Returns:

An object of CompressionManager.

Examples:

from neural_compressor.training import prepare_compression

compression_manager = prepare_compression(model, conf)
model = compression_manager.model
# train_loop:
compression_manager.callbacks.on_train_begin()
for epoch in range(epochs):
    compression_manager.callbacks.on_epoch_begin(epoch)
    for i, (batch, label) in enumerate(dataloader):
        compression_manager.callbacks.on_step_begin(i)
        # ... user training code for the step, e.g., optimizer.zero_grad() ...
        output = model(batch)
        loss = criterion(output, label)  # user-defined loss function
        loss = compression_manager.callbacks.on_after_compute_loss(batch, output, loss)
        loss.backward()
        compression_manager.callbacks.on_before_optimizer_step()
        optimizer.step()
        compression_manager.callbacks.on_step_end()
    compression_manager.callbacks.on_epoch_end()
compression_manager.callbacks.on_train_end()
class neural_compressor.training.CallBacks(callbacks_list)[source]

Define the basic callback hooks for the training loop.
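
In the examples above, a CallBacks instance is not constructed directly; prepare_compression returns a CompressionManager whose callbacks attribute exposes these hooks. A minimal sketch of where each hook fires relative to the user's training loop (epoch, step, batch, output, and loss are the user's own variables):

compression_manager.callbacks.on_train_begin()           # once, before training starts
compression_manager.callbacks.on_epoch_begin(epoch)      # at the start of each epoch
compression_manager.callbacks.on_step_begin(step)        # at the start of each step
# user forward pass and loss computation happen here
loss = compression_manager.callbacks.on_after_compute_loss(batch, output, loss)  # may adjust the loss
compression_manager.callbacks.on_before_optimizer_step() # just before optimizer.step()
compression_manager.callbacks.on_step_end()              # at the end of each step
compression_manager.callbacks.on_epoch_end()             # at the end of each epoch
compression_manager.callbacks.on_train_end()             # once, after training finishes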