Accelerate Deep Learning Training and Inference for Model Zoo Workloads on Intel GPU
Introduction
This guide explains how to run Model Zoo workloads with the TensorFlow* framework on Intel GPUs, using the optimizations provided by Intel® Extension for TensorFlow*.
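Before running any workload, it can help to confirm that Intel® Extension for TensorFlow* has registered the GPU with the TensorFlow runtime. The sketch below is a minimal, hedged check: it assumes the extension exposes Intel GPUs as "XPU" devices (the device type Intel® Extension for TensorFlow* uses), and the helper name `detect_xpu_devices` is illustrative, not part of any official API. It degrades gracefully when TensorFlow is not installed.

```python
def detect_xpu_devices():
    """Return the names of visible XPU devices, or [] if TensorFlow
    (or the Intel extension) is unavailable. Hypothetical helper for
    illustration; "XPU" is the device type Intel's extension registers."""
    try:
        import tensorflow as tf
    except ImportError:
        # TensorFlow not installed in this environment.
        return []
    return [d.name for d in tf.config.list_physical_devices("XPU")]


if __name__ == "__main__":
    names = detect_xpu_devices()
    if names:
        print("Intel GPU(s) visible to TensorFlow:", names)
    else:
        print("No XPU devices found; check the driver and extension install.")
```

If the list comes back empty on a machine with an Intel GPU, the usual suspects are the GPU driver, the oneAPI runtime libraries, or a TensorFlow build that the extension plugin does not match.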
Quick Start Guide
Run Models in the Docker Container
For Intel® Data Center GPU Flex Series
Refer to AI Model Zoo Containers on Flex Series to run optimized Deep Learning inference workloads.
For Intel® Data Center GPU Max Series
Refer to AI Model Zoo Containers on Max Series to run optimized Deep Learning training and inference workloads.
Run Models on Bare Metal
Refer to AI Model Zoo Examples on Intel® Data Center GPU to run optimized Deep Learning training and inference workloads on bare metal.
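Whichever path you choose, the workloads ultimately run ordinary TensorFlow code on the Intel GPU. As a sanity check on a bare-metal setup, the hedged sketch below places a small matmul on the first XPU device when one is visible and falls back to the CPU otherwise. The device string `"/XPU:0"` reflects how Intel® Extension for TensorFlow* names its devices; the function name is illustrative, not from Model Zoo.

```python
def run_sample_matmul(device_hint="/XPU:0"):
    """Run a small matmul on the Intel GPU if present, else on CPU.
    Returns the result shape, or None if TensorFlow is unavailable.
    Illustrative sketch; not a Model Zoo script."""
    try:
        import tensorflow as tf
    except ImportError:
        return None
    # Fall back to CPU when no XPU device is registered.
    device = device_hint if tf.config.list_physical_devices("XPU") else "/CPU:0"
    with tf.device(device):
        a = tf.random.normal([256, 256])
        b = tf.random.normal([256, 256])
        c = tf.matmul(a, b)
    return tuple(c.shape)
```

If this runs on the XPU without errors, the environment is ready for the full training and inference workloads linked above.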