neural_compressor.torch.utils.environ

Intel Neural Compressor PyTorch environment check.

Functions

is_ipex_imported() → bool

Check whether intel_extension_for_pytorch is imported.

is_transformers_imported() → bool

Check whether transformers is imported.

is_package_available(package_name)

Check if the package exists in the environment without importing.

is_hpex_available()

Returns whether hpex is available.

is_optimum_available()

Return whether optimum is available.

is_optimum_habana_available()

Return whether optimum-habana is available.

is_ipex_available()

Return whether ipex is available.

get_ipex_version()

Return ipex version if ipex exists.

get_torch_version()

Return torch version if torch exists.

get_accelerator([device_name])

Return the recommended accelerator based on device priority.

device_synchronize(raw_func)

Function decorator that calls accelerator.synchronize before and after a function call.

can_pack_with_numba()

Check if Numba and TBB are available for packing.

is_numba_available()

Check if Numba is available.

is_tbb_available()

Check if TBB is available.

get_used_hpu_mem_MB()

Get the amount of HPU memory used, in MiB (Mebibytes).

get_used_cpu_mem_MB()

Get the amount of CPU memory used by the current process in MiB (Mebibytes).

Module Contents

neural_compressor.torch.utils.environ.is_ipex_imported() → bool

Check whether intel_extension_for_pytorch is imported.

neural_compressor.torch.utils.environ.is_transformers_imported() → bool

Check whether transformers is imported.

neural_compressor.torch.utils.environ.is_package_available(package_name)

Check if the package exists in the environment without importing.

Parameters:

package_name (str) – package name

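A typical use is guarding an optional dependency before importing it, so the import only happens when the package is known to be installed. A minimal sketch; the "datasets" package name is only an illustrative choice:

    from neural_compressor.torch.utils.environ import is_package_available

    # Check for the package without importing it at module load time.
    # "datasets" is only an illustrative package name.
    if is_package_available("datasets"):
        import datasets  # safe: the package is installed
    else:
        datasets = None  # take a code path that does not need it
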
neural_compressor.torch.utils.environ.is_hpex_available()

Returns whether hpex is available.

neural_compressor.torch.utils.environ.is_optimum_available()

Return whether optimum is available.

neural_compressor.torch.utils.environ.is_optimum_habana_available()

Return whether optimum-habana is available.

neural_compressor.torch.utils.environ.is_ipex_available()

Return whether ipex is available.

neural_compressor.torch.utils.environ.get_ipex_version()

Return ipex version if ipex exists.

neural_compressor.torch.utils.environ.get_torch_version()

Return torch version if torch exists.

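A common pattern is to gate IPEX-specific code on the availability check and then branch on the reported versions. A minimal sketch, assuming the version helpers return values that packaging.version can parse (verify the exact return type against the implementation):

    from packaging.version import Version

    from neural_compressor.torch.utils.environ import (
        get_ipex_version,
        get_torch_version,
        is_ipex_available,
    )

    # Only touch IPEX-specific code paths when the extension is installed.
    if is_ipex_available():
        torch_version = get_torch_version()
        ipex_version = get_ipex_version()
        # Wrap in Version(...) in case the helpers return plain strings.
        if Version(str(torch_version)) >= Version("2.0.0"):
            print(f"torch {torch_version} with ipex {ipex_version}")
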
neural_compressor.torch.utils.environ.get_accelerator(device_name='auto')

Return the recommended accelerator based on device priority.

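With device_name='auto' the helper walks its internal device priority list and returns the best available accelerator; passing an explicit name pins the choice. A minimal sketch; the interface of the returned object beyond synchronize() is an assumption to verify against the accelerator implementation:

    from neural_compressor.torch.utils.environ import get_accelerator

    # Let the library pick the highest-priority available device...
    accelerator = get_accelerator()
    # ...or pin the choice explicitly, e.g. to CPU.
    cpu_accelerator = get_accelerator("cpu")

    # The returned object exposes device-management helpers such as
    # synchronize(); confirm the full interface in the accelerator class.
    accelerator.synchronize()
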
neural_compressor.torch.utils.environ.device_synchronize(raw_func)

Function decorator that calls accelerator.synchronize before and after a function call.

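This is useful around functions that launch work on an accelerator, because pending device work is flushed before and after the call, which makes wall-clock timing of the wrapped function meaningful. A minimal sketch:

    from neural_compressor.torch.utils.environ import device_synchronize

    @device_synchronize
    def run_step(model, inputs):
        # Device work launched inside this call is synchronized before
        # and after, so timing run_step reflects the real execution time.
        return model(inputs)
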
neural_compressor.torch.utils.environ.can_pack_with_numba()

Check if Numba and TBB are available for packing.

To pack tensor with Numba, both Numba and TBB are required, and TBB should be configured correctly.

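Since both prerequisites are needed, this combined check is the one to call before choosing the packed code path; the individual checks documented below can then pinpoint which prerequisite is missing. A small sketch:

    from neural_compressor.torch.utils.environ import (
        can_pack_with_numba,
        is_numba_available,
        is_tbb_available,
    )

    if not can_pack_with_numba():
        # Report which prerequisite is missing; if both packages are
        # installed, the TBB threading layer is likely misconfigured.
        missing = [name for name, ok in (("numba", is_numba_available()),
                                         ("tbb", is_tbb_available())) if not ok]
        print("Numba packing disabled, missing:", missing or "TBB configuration")
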
neural_compressor.torch.utils.environ.is_numba_available()

Check if Numba is available.

neural_compressor.torch.utils.environ.is_tbb_available()

Check if TBB is available.

neural_compressor.torch.utils.environ.get_used_hpu_mem_MB()

Get the amount of HPU memory used, in MiB (Mebibytes).

neural_compressor.torch.utils.environ.get_used_cpu_mem_MB()

Get the amount of CPU memory used by the current process in MiB (Mebibytes).
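
Together with get_used_hpu_mem_MB() above, this helper supports coarse memory tracking around a quantization or conversion step; HPU memory should only be queried when HPEX is present. A minimal sketch, assuming both helpers return numeric MiB values:

    from neural_compressor.torch.utils.environ import (
        get_used_cpu_mem_MB,
        get_used_hpu_mem_MB,
        is_hpex_available,
    )

    cpu_before = get_used_cpu_mem_MB()
    hpu_before = get_used_hpu_mem_MB() if is_hpex_available() else None

    # ... run the memory-heavy step here ...

    print(f"CPU delta: {get_used_cpu_mem_MB() - cpu_before:.1f} MiB")
    if hpu_before is not None:
        print(f"HPU delta: {get_used_hpu_mem_MB() - hpu_before:.1f} MiB")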