Deep Neural Network Library (DNNL)  1.2.0
Performance library for Deep Learning
# Resampling


The resampling primitive computes a forward or backward resampling operation on 1D, 2D, or 3D spatial data. Resampling performs spatial scaling of the original tensor using one of the supported interpolation algorithms:

• Nearest Neighbor
• Linear (Bilinear for a 2D spatial tensor, Trilinear for a 3D spatial tensor).

The resampling operation is defined by the source tensor and the scaling factors in each spatial dimension. Upsampling and downsampling are alternative terms for resampling, used when all scaling factors are greater than one (upsampling) or less than one (downsampling).

The resampling operation is defined by the following formulas. We show formulas only for 2D spatial data, which are straightforward to generalize to cases of higher and lower dimensions. Variable names follow the standard Naming Conventions.

Let $$src$$ and $$dst$$ be $$N \times C \times IH \times IW$$ and $$N \times C \times OH \times OW$$ tensors respectively. Let $$F_h = \frac{OH}{IH}$$ and $$F_w = \frac{OW}{IW}$$ define scaling factors in each spatial dimension.

The following formulas show how DNNL computes resampling for nearest neighbor and bilinear interpolation methods. To further simplify the formulas, we assume the following:

• $$src(n, ic, ih, iw) = src(n, ic, 0, iw)$$ if $$ih < 0$$,
• $$src(n, ic, ih, iw) = src(n, ic, ih, 0)$$ if $$iw < 0$$,
• $$src(n, ic, ih, iw) = src(n, ic, IH - 1, iw)$$ if $$ih \geq IH$$,
• $$src(n, ic, ih, iw) = src(n, ic, ih, IW - 1)$$ if $$iw \geq IW$$.

### Forward

#### Nearest Neighbor Resampling

$dst(n, c, oh, ow) = src(n, c, ih, iw)$

where $$[\cdot]$$ denotes rounding to the nearest integer and

• $$ih = \left[\frac{oh + 0.5}{F_h} - 0.5\right]$$,
• $$iw = \left[\frac{ow + 0.5}{F_w} - 0.5\right]$$.
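As an illustrative sketch (not the DNNL implementation), the nearest-neighbor formula can be written directly in NumPy. The function name `resample_nearest` and the clamping of indices to the valid range are assumptions made for this example:

```python
import numpy as np

def resample_nearest(src, OH, OW):
    """Nearest-neighbor resampling of an (N, C, IH, IW) tensor (illustrative sketch)."""
    N, C, IH, IW = src.shape
    Fh, Fw = OH / IH, OW / IW  # F_h = OH/IH, F_w = OW/IW
    dst = np.empty((N, C, OH, OW), dtype=src.dtype)
    for oh in range(OH):
        # ih = round((oh + 0.5) / F_h - 0.5), clamped to the valid index range
        ih = min(max(int(round((oh + 0.5) / Fh - 0.5)), 0), IH - 1)
        for ow in range(OW):
            iw = min(max(int(round((ow + 0.5) / Fw - 0.5)), 0), IW - 1)
            dst[:, :, oh, ow] = src[:, :, ih, iw]
    return dst

# 2x upsampling of a 1x1x2x2 tensor duplicates each source pixel into a 2x2 block
src = np.arange(4, dtype=np.float32).reshape(1, 1, 2, 2)
dst = resample_nearest(src, 4, 4)
```

With the half-pixel offset in the index formula, each destination pixel maps to the nearest source sample center, so 2x upsampling replicates every source pixel.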

#### Bilinear Resampling

$dst(n, c, oh, ow) = src(n, c, ih_0, iw_0) \cdot (1 - W_{ih}) \cdot (1 - W_{iw}) + \\ src(n, c, ih_1, iw_0) \cdot W_{ih} \cdot (1 - W_{iw}) + \\ src(n, c, ih_0, iw_1) \cdot (1 - W_{ih}) \cdot W_{iw} + \\ src(n, c, ih_1, iw_1) \cdot W_{ih} \cdot W_{iw} \\$

where

• $$ih_0 = \left\lfloor{\frac {oh + 0.5} {F_h} - 0.5}\right\rfloor$$,
• $$ih_1 = \left\lceil {\frac {oh + 0.5} {F_h} - 0.5}\right\rceil$$,
• $$iw_0 = \left\lfloor{\frac {ow + 0.5} {F_w} - 0.5}\right\rfloor$$,
• $$iw_1 = \left\lceil {\frac {ow + 0.5} {F_w} - 0.5}\right\rceil$$,
• $$W_{ih} = \frac{oh + 0.5}{F_h} - 0.5 - ih_0$$,
• $$W_{iw} = \frac{ow + 0.5}{F_w} - 0.5 - iw_0$$.
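The bilinear formulas above can be sketched in NumPy as well (again an illustrative example, not the DNNL implementation; out-of-range indices are replicated from the edge, per the boundary assumptions stated earlier):

```python
import numpy as np

def resample_bilinear(src, OH, OW):
    """Bilinear resampling of an (N, C, IH, IW) tensor (illustrative sketch)."""
    N, C, IH, IW = src.shape
    Fh, Fw = OH / IH, OW / IW
    dst = np.empty((N, C, OH, OW), dtype=np.float32)
    for oh in range(OH):
        h = (oh + 0.5) / Fh - 0.5            # continuous source coordinate
        ih0, ih1 = int(np.floor(h)), int(np.ceil(h))
        Wih = h - ih0                        # fractional weight toward ih1
        ih0 = min(max(ih0, 0), IH - 1)       # edge replication for out-of-range
        ih1 = min(max(ih1, 0), IH - 1)
        for ow in range(OW):
            w = (ow + 0.5) / Fw - 0.5
            iw0, iw1 = int(np.floor(w)), int(np.ceil(w))
            Wiw = w - iw0
            iw0 = min(max(iw0, 0), IW - 1)
            iw1 = min(max(iw1, 0), IW - 1)
            dst[:, :, oh, ow] = (
                src[:, :, ih0, iw0] * (1 - Wih) * (1 - Wiw)
                + src[:, :, ih1, iw0] * Wih * (1 - Wiw)
                + src[:, :, ih0, iw1] * (1 - Wih) * Wiw
                + src[:, :, ih1, iw1] * Wih * Wiw
            )
    return dst

# 2x upsampling of [[0, 1], [2, 3]]: interior pixels blend four neighbors,
# corner pixels reproduce the source corners exactly
src = np.array([[[[0.0, 1.0], [2.0, 3.0]]]], dtype=np.float32)
dst = resample_bilinear(src, 4, 4)
```

Note that the fractional weights $$W_{ih}$$ and $$W_{iw}$$ are computed before clamping, matching their definition as the distance from the floored coordinate.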

#### Difference Between Forward Training and Forward Inference

There is no difference between the dnnl_forward_training and dnnl_forward_inference propagation kinds.

### Backward

The backward propagation computes $$diff\_src$$ based on $$diff\_dst$$.
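For the nearest-neighbor case, the backward pass is the mathematical adjoint of the forward gather: each $$diff\_dst$$ element is accumulated into the source location it was read from. A hedged sketch (illustrative only; `resample_nearest_backward` is a name invented for this example):

```python
import numpy as np

def resample_nearest_backward(diff_dst, IH, IW):
    """Adjoint of forward nearest-neighbor resampling (illustrative sketch):
    scatter-add each diff_dst element into its source index."""
    N, C, OH, OW = diff_dst.shape
    Fh, Fw = OH / IH, OW / IW
    diff_src = np.zeros((N, C, IH, IW), dtype=np.float32)
    for oh in range(OH):
        ih = min(max(int(round((oh + 0.5) / Fh - 0.5)), 0), IH - 1)
        for ow in range(OW):
            iw = min(max(int(round((ow + 0.5) / Fw - 0.5)), 0), IW - 1)
            diff_src[:, :, ih, iw] += diff_dst[:, :, oh, ow]
    return diff_src

# With 2x upsampling, each source pixel receives gradients from a 2x2 dst block
g = np.ones((1, 1, 4, 4), dtype=np.float32)
ds = resample_nearest_backward(g, 2, 2)
```

The bilinear backward pass is analogous, scattering each gradient to the four neighbors weighted by $$W_{ih}$$ and $$W_{iw}$$.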

## Implementation Details

### General Notes

1. The resampling implementation supports data with an arbitrary memory format tag (nchw, nhwc, nChw16c, etc.), but the memory format tags for src and dst are expected to be the same. The primitive supports the dnnl::memory::format_tag::any value for the dst and diff_src memory format tags and can derive the destination format from the source format.
2. The resampling descriptor can be created by specifying the source and destination memory descriptors, only the source descriptor and floating-point scaling factors, or the source and destination memory descriptors together with the factors. If the user does not provide a destination descriptor, the destination dimensions are deduced from the factors: $$output\_spatial\_size = \left\lfloor input\_spatial\_size \cdot F \right\rfloor$$.
Note
The implementation uses factors defined by the relation $$F = \frac{output\_spatial\_size}{input\_spatial\_size}$$, which do not necessarily equal the factors passed by the user.
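Using the relation $$F = \frac{output\_spatial\_size}{input\_spatial\_size}$$ from the note above, the shape deduction and the resulting effective factor can be sketched as follows (function names are illustrative, not part of the DNNL API):

```python
import math

def deduce_dst_spatial(input_size, factor):
    """Destination spatial size deduced from a user-provided factor (sketch):
    output = floor(input * F)."""
    return int(math.floor(input_size * factor))

def effective_factor(input_size, factor):
    """Factor actually used by the implementation: F = output / input,
    which may differ from the user-provided value after the floor."""
    return deduce_dst_spatial(input_size, factor) / input_size

# A user factor of 2.5 on an input of 3 gives an output of 7,
# so the effective factor becomes 7/3 rather than 2.5
oh = deduce_dst_spatial(3, 2.5)
f = effective_factor(3, 2.5)
```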

### Data Types

The resampling primitive supports the following combinations of data types for source and destination memory objects:

| Propagation | Source | Destination |
| --- | --- | --- |
| forward / backward | f32 | f32 |
| forward / backward | bf16 | bf16 |

### Post-ops and Attributes

The resampling primitive doesn't support any post-ops or attributes.

## Implementation Limitations

1. No primitive-specific limitations. Refer to Data Types for limitations related to data type support.
