Deep Neural Network Library (DNNL)  1.2.0
Performance library for Deep Learning
Inspecting JIT Code

DNNL uses just-in-time (JIT) compilation to generate optimized code for some functions based on the input parameters and the instruction set supported by the system. The library provides a mechanism to save the generated code into a file for inspection.

This behavior can be enabled with the DNNL_JIT_DUMP environment variable or the dnnl_set_jit_dump function.

Value            Behavior
0                JIT dump is disabled (default)
any other value  JIT dump is enabled

The function setting takes precedence over the environment variable.
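A minimal sketch of the programmatic route, assuming DNNL is installed and the program is linked against it (for example with -ldnnl); dnnl::set_jit_dump is the C++ wrapper over the C function dnnl_set_jit_dump:

```cpp
// Sketch: enabling JIT code dumping from code rather than the environment.
// Assumes a working DNNL installation; not runnable without the library.
#include <dnnl.hpp>

int main() {
    // Enable dumping before any primitives are created. Because the
    // function setting takes precedence, this overrides whatever value
    // DNNL_JIT_DUMP holds in the environment.
    dnnl::set_jit_dump(1);

    // ... create and execute primitives here; each JIT-generated kernel
    // is written to a file in the current working directory ...

    // Dumping can be switched off again once the kernels of interest
    // have been generated.
    dnnl::set_jit_dump(0);
    return 0;
}
```

Calling the function early matters: kernels are generated (and therefore dumped) when primitives are first created, so enabling the dump afterwards misses them.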


$ DNNL_JIT_DUMP=1 ./simple-net-cpp

When run on a CPU supporting Intel(R) AVX2, this produces one output file per JIT-generated kernel in the current working directory.


Use any disassembler to view the code. For example, XED is a decoder tool available as part of the Intel(R) Software Development Emulator (Intel(R) SDE).
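A sketch of decoding a dump file with the xed command-line tool; the file name below is hypothetical (actual names depend on the kernel and run):

```shell
# -64 : decode in 64-bit mode
# -ir : treat the input file as raw instruction bytes
# The dump file name is illustrative only.
xed -64 -ir dnnl_dump_kernel.bin
```

This prints the decoded instruction stream, which is useful for checking that the expected instruction set (for example AVX2) was actually used.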