:py:mod:`neural_compressor.adaptor.torch_utils.bf16_convert`
============================================================

.. py:module:: neural_compressor.adaptor.torch_utils.bf16_convert

.. autoapi-nested-parse::

   BF16 conversion utilities for Torch.

Module Contents
---------------

Classes
~~~~~~~

.. autoapisummary::

   neural_compressor.adaptor.torch_utils.bf16_convert.BF16ModuleWrapper

Functions
~~~~~~~~~

.. autoapisummary::

   neural_compressor.adaptor.torch_utils.bf16_convert.Convert
   neural_compressor.adaptor.torch_utils.bf16_convert.bf16_symbolic_trace

.. py:class:: BF16ModuleWrapper(module)

   Wrapper class that runs a wrapped module in BF16 precision.

.. py:function:: Convert(model, tune_cfg)

   Convert the input model to a mixed-precision (BF16) model.

   :param model: the input model.
   :type model: object
   :param tune_cfg: dictionary of quantization configuration.
   :type tune_cfg: dict
   :returns: model with mixed precision.
   :rtype: mixed_precision_model (object)

.. py:function:: bf16_symbolic_trace(model, fx_sub_module_list, prefix='')

   Symbolic trace for BF16 models.

   :param model: the input model.
   :type model: object
   :param fx_sub_module_list: list of FX sub-module names to trace.
   :type fx_sub_module_list: list
   :param prefix: prefix of op name.
   :type prefix: str
   :returns: model (object)
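The sketch below illustrates the general pattern a BF16 module wrapper follows: cast the wrapped module's parameters to ``bfloat16``, cast inputs down before the forward pass, and cast outputs back to ``float32`` so surrounding FP32 ops are unaffected. This is a minimal, hypothetical illustration of the technique, not the actual ``BF16ModuleWrapper`` implementation; the class name and behavior here are assumptions for demonstration only.

.. code-block:: python

   import torch
   import torch.nn as nn


   class BF16WrapperSketch(nn.Module):
       """Hypothetical sketch: run the wrapped module in bfloat16."""

       def __init__(self, module):
           super().__init__()
           # Cast the wrapped module's parameters/buffers to bfloat16.
           self.module = module.bfloat16()

       def forward(self, x):
           # Cast the input down to bfloat16 for the wrapped op...
           out = self.module(x.to(torch.bfloat16))
           # ...and cast the result back to float32 for downstream FP32 ops.
           return out.float()


   # Usage: wrap a single Linear layer and feed it an FP32 tensor.
   wrapped = BF16WrapperSketch(nn.Linear(4, 2))
   y = wrapped(torch.randn(3, 4))
   assert y.dtype == torch.float32
   assert wrapped.module.weight.dtype == torch.bfloat16

The output dtype stays ``float32`` at the wrapper boundary, which is what lets a converter mix BF16 and FP32 modules inside one model without touching the rest of the graph.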