neural_compressor.utils.kl_divergence
KL Divergence: measures the difference between probability distributions to determine the threshold for each quantized op.
Module Contents
Classes
KL_Divergence | The class supporting the KL divergence calibration algorithm.
- class neural_compressor.utils.kl_divergence.KL_Divergence
Bases: object
The class supporting the KL divergence calibration algorithm.
- expand_quantized_bins(quantized_bins, reference_bins)
Expand quantized bins.
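A minimal illustrative sketch of the idea behind bin expansion, assuming the usual approach of stretching a coarse quantized histogram back to the reference histogram's length so the two distributions can be compared bin by bin. The helper name and the even-spreading rule below are assumptions, not the library's implementation:

```python
import numpy as np

def expand_quantized_bins_sketch(quantized_bins, reference_bins):
    """Hypothetical illustration: spread each coarse bin's mass evenly
    over the non-empty reference bins it covers."""
    expanded = np.zeros(len(reference_bins), dtype=np.float64)
    merge = len(reference_bins) // len(quantized_bins)
    for j, coarse_val in enumerate(quantized_bins):
        start = j * merge
        # The last coarse bin absorbs any remainder bins.
        stop = len(reference_bins) if j == len(quantized_bins) - 1 else start + merge
        window = reference_bins[start:stop]
        nonzero = np.count_nonzero(window)
        if nonzero == 0:
            continue  # nothing to spread the mass over
        share = coarse_val / nonzero
        for k in range(start, stop):
            if reference_bins[k] != 0:
                expanded[k] = share
    return expanded
```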
- safe_entropy(reference_distr_P, P_sum, candidate_distr_Q, Q_sum)
Safe entropy.
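The sketch below is a hypothetical illustration of what a "safe" entropy computation typically guards against when comparing a reference distribution P with a candidate distribution Q: bins with zero reference mass are skipped, and a candidate that assigns zero mass where the reference does not is treated as infinitely divergent. The function name and exact zero handling are assumptions, not the library's code:

```python
import math

def safe_entropy_sketch(reference_distr_P, P_sum, candidate_distr_Q, Q_sum):
    """Hypothetical KL(P || Q) over histogram bins with zero-bin guards."""
    divergence = 0.0
    for p, q in zip(reference_distr_P, candidate_distr_Q):
        if p == 0:
            continue               # zero reference mass contributes nothing
        if q == 0:
            return float("inf")    # candidate misses mass the reference has
        p_norm = p / P_sum         # normalize raw counts to probabilities
        q_norm = q / Q_sum
        divergence += p_norm * math.log(p_norm / q_norm)
    return divergence
```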
- get_threshold(hist, hist_edges, min_val, max_val, num_bins, quantized_type, num_quantized_bins=255)
The interface for getting the threshold per op using the KL divergence algorithm.
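A minimal usage sketch for get_threshold, assuming a NumPy histogram of calibration activations and a no-argument KL_Divergence constructor. The value passed for quantized_type is an assumption, since the accepted strings are not listed on this page:

```python
import numpy as np
from neural_compressor.utils.kl_divergence import KL_Divergence

# Dummy calibration activations for a single op.
activations = np.random.randn(10000).astype(np.float32)
min_val, max_val = float(activations.min()), float(activations.max())

# Histogram over the observed activation range.
num_bins = 2048
hist, hist_edges = np.histogram(activations, bins=num_bins, range=(min_val, max_val))

algo = KL_Divergence()
threshold = algo.get_threshold(
    hist,
    hist_edges,
    min_val,
    max_val,
    num_bins,
    quantized_type="asym",   # assumed value; check the accepted options for your version
    num_quantized_bins=255,
)
print("KL-calibrated threshold for this op:", threshold)
```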