models.position_encoding

Various positional encodings for the transformer.

Classes

PositionEmbeddingSine

This is a more standard version of the position embedding, very similar to the one used in the "Attention Is All You Need" paper, generalized to work on images.

PositionEmbeddingLearned

Absolute position embedding, learned.

Module Contents

class models.position_encoding.PositionEmbeddingSine(num_pos_feats=64, temperature=10000, normalize=False, scale=None)[source]

This is a more standard version of the position embedding, very similar to the one used in the "Attention Is All You Need" paper, generalized to work on images: a fixed sinusoidal code is computed independently for the row and column coordinate of each pixel, and the two halves are concatenated along the channel dimension.
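
A minimal sketch of such a sinusoidal 2D embedding, assuming a DETR-style interface in which the forward pass takes a boolean padding mask of shape (batch, H, W) and returns a (batch, 2 * num_pos_feats, H, W) tensor; the mask-based signature and the exact tensor layout are assumptions made for illustration, not the documented API:

    import math
    import torch
    from torch import nn

    class PositionEmbeddingSine(nn.Module):
        """Sinusoidal 2D position embedding (illustrative sketch)."""

        def __init__(self, num_pos_feats=64, temperature=10000, normalize=False, scale=None):
            super().__init__()
            self.num_pos_feats = num_pos_feats
            self.temperature = temperature
            self.normalize = normalize
            self.scale = 2 * math.pi if scale is None else scale

        def forward(self, mask):
            # mask: (batch, H, W) bool tensor, True at padded pixels (assumed interface).
            not_mask = ~mask
            # Cumulative sums give each valid pixel its 1-based row/column coordinate.
            y_embed = not_mask.cumsum(1, dtype=torch.float32)
            x_embed = not_mask.cumsum(2, dtype=torch.float32)
            if self.normalize:
                eps = 1e-6
                y_embed = y_embed / (y_embed[:, -1:, :] + eps) * self.scale
                x_embed = x_embed / (x_embed[:, :, -1:] + eps) * self.scale
            # Geometric frequency schedule, paired so each frequency gets a sin and a cos slot.
            dim_t = torch.arange(self.num_pos_feats, dtype=torch.float32, device=mask.device)
            dim_t = self.temperature ** (2 * torch.div(dim_t, 2, rounding_mode="floor") / self.num_pos_feats)
            pos_x = x_embed[:, :, :, None] / dim_t
            pos_y = y_embed[:, :, :, None] / dim_t
            # Interleave sin/cos over the feature dimension.
            pos_x = torch.stack((pos_x[..., 0::2].sin(), pos_x[..., 1::2].cos()), dim=4).flatten(3)
            pos_y = torch.stack((pos_y[..., 0::2].sin(), pos_y[..., 1::2].cos()), dim=4).flatten(3)
            # Concatenate row and column codes: (batch, 2 * num_pos_feats, H, W).
            return torch.cat((pos_y, pos_x), dim=3).permute(0, 3, 1, 2)

Under these assumptions, PositionEmbeddingSine(num_pos_feats=64, normalize=True) applied to an all-False mask of shape (2, 32, 32) yields a (2, 128, 32, 32) position map that can be added to the backbone feature map before it enters the transformer.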

class models.position_encoding.PositionEmbeddingLearned(num_pos_feats=256)[source]

Absolute position embedding, learned.
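
A minimal sketch of a learned absolute 2D embedding, assuming one trainable embedding table per spatial axis whose row and column vectors are concatenated per pixel; the cap of 50 positions per axis and the feature-map-shaped input are assumptions made for illustration:

    import torch
    from torch import nn

    class PositionEmbeddingLearned(nn.Module):
        """Learned absolute 2D position embedding (illustrative sketch)."""

        def __init__(self, num_pos_feats=256):
            super().__init__()
            # 50 is an assumed upper bound on the feature map height/width.
            self.row_embed = nn.Embedding(50, num_pos_feats)
            self.col_embed = nn.Embedding(50, num_pos_feats)
            nn.init.uniform_(self.row_embed.weight)
            nn.init.uniform_(self.col_embed.weight)

        def forward(self, x):
            # x: (batch, C, H, W) feature map; only its spatial size and device are used.
            h, w = x.shape[-2:]
            i = torch.arange(w, device=x.device)
            j = torch.arange(h, device=x.device)
            x_emb = self.col_embed(i)  # (W, num_pos_feats)
            y_emb = self.row_embed(j)  # (H, num_pos_feats)
            # Broadcast the per-row and per-column vectors over the grid and concatenate.
            pos = torch.cat([
                x_emb.unsqueeze(0).repeat(h, 1, 1),  # (H, W, num_pos_feats)
                y_emb.unsqueeze(1).repeat(1, w, 1),  # (H, W, num_pos_feats)
            ], dim=-1)                               # (H, W, 2 * num_pos_feats)
            return pos.permute(2, 0, 1).unsqueeze(0).repeat(x.shape[0], 1, 1, 1)

Unlike the sinusoidal variant, these embeddings are trained parameters, so they cannot extrapolate beyond the largest spatial extent provided for at construction time (here the assumed cap of 50 positions per axis).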