neural_compressor.training¶
The configuration of the training loop.
Module Contents¶
Classes¶
CompressionManager: used in the training loop to handle any additional processing the user wants. |
Functions¶
prepare_compression: prepare a compression manager for the training loop. |
- class neural_compressor.training.CompressionManager(component)¶
CompressionManager is used in the training loop to handle any additional processing the user wants.
- Parameters:
component – one instance of Distillation, Quantization, Pruning, or Scheduler
Examples
from neural_compressor.training import prepare_compression

compression_manager = prepare_compression(conf, model)
compression_manager.callbacks.on_train_begin()
model = compression_manager.model
for epoch in range(epochs):
    compression_manager.callbacks.on_epoch_begin(epoch)
    for i, batch in enumerate(dataloader):
        output = model(batch)
        loss = ...
        loss = compression_manager.callbacks.on_after_compute_loss(batch, output, loss)
        loss.backward()
        compression_manager.callbacks.on_before_optimizer_step()
        optimizer.step()
        compression_manager.callbacks.on_step_end()
    compression_manager.callbacks.on_epoch_end()
compression_manager.callbacks.on_train_end()
compression_manager.save("path_to_save")
- class CallBacks(component)¶
Define the basic command for the training loop.
- on_train_begin(dataloader=None)¶
Called before the first epoch begins.
- on_train_end()¶
Called after the last epoch ends.
- on_epoch_begin(epoch)¶
Called at the beginning of each epoch.
- on_step_begin(batch_id)¶
Called at the beginning of each batch.
- on_after_compute_loss(input, student_output, student_loss, teacher_output=None)¶
Called at the end of loss computation.
- on_before_optimizer_step()¶
Called at the end of the backward pass, before the optimizer step.
- on_after_optimizer_step()¶
Called after the optimizer step.
- on_step_end()¶
Called at the end of each batch.
- on_epoch_end()¶
Called at the end of each epoch.
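The order in which a training loop is expected to invoke these hooks can be sketched with a minimal stand-in class that records each call. Note that `RecordingCallbacks` and `train` below are hypothetical stubs written only to illustrate the call ordering; they are not part of neural_compressor.

```python
# Hypothetical stub that mimics the CallBacks hook names, recording
# each on_* invocation so the expected ordering can be inspected.
class RecordingCallbacks:
    def __init__(self):
        self.calls = []

    def __getattr__(self, name):
        # Any on_* attribute lookup yields a hook that logs its own name.
        if name.startswith("on_"):
            def hook(*args, **kwargs):
                self.calls.append(name)
            return hook
        raise AttributeError(name)


def train(callbacks, epochs=1, batches=1):
    # Skeleton training loop showing where each hook fires; the actual
    # forward pass, loss computation, and optimizer work are elided.
    callbacks.on_train_begin()
    for epoch in range(epochs):
        callbacks.on_epoch_begin(epoch)
        for batch_id in range(batches):
            callbacks.on_step_begin(batch_id)
            # ... forward pass and loss computation ...
            callbacks.on_after_compute_loss(None, None, None)
            # ... loss.backward() ...
            callbacks.on_before_optimizer_step()
            # ... optimizer.step() ...
            callbacks.on_after_optimizer_step()
            callbacks.on_step_end()
        callbacks.on_epoch_end()
    callbacks.on_train_end()


cb = RecordingCallbacks()
train(cb, epochs=1, batches=1)
print(cb.calls)
```

Running this prints the nine hook names in the order a single-epoch, single-batch loop would invoke them, from `on_train_begin` through `on_train_end`.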
- save(root=None)¶
Save the compressed model.
- Parameters:
root (str) – path to save the model
- export(save_path: str, conf)¶
Convert the model to another format, such as an ONNX model.
- Parameters:
save_path (str) – The path to save the model
conf (Union[Callable, List]) – The configuration for ONNX exportation.
- neural_compressor.training.prepare_compression(model: Callable, confs: Callable | List, **kwargs)¶
Prepare a compression manager for the training loop.
- Parameters:
model (Callable, optional) – The model to optimize.
confs (Union[Callable, List]) – Config of Distillation, Quantization, or Pruning, or a list of configs for orchestration optimization
options (Options, optional) – The configuration for random_seed, workspace, resume path, and the TensorBoard flag.
- Returns:
CompressionManager
Examples
from neural_compressor.training import prepare_compression

compression_manager = prepare_compression(conf, model)
compression_manager.on_train_begin()
for epoch in range(epochs):
    compression_manager.on_epoch_begin(epoch)
    for i, batch in enumerate(dataloader):
        output = model(batch)
        loss = ...
        loss = compression_manager.on_after_compute_loss(batch, output, loss)
        loss.backward()
        compression_manager.on_before_optimizer_step()
        optimizer.step()
        compression_manager.on_step_end()
    compression_manager.on_epoch_end()
compression_manager.on_train_end()