neural_compressor.tensorflow.utils.data

BaseDataloader of all dataloaders.

Classes

IterableFetcher

Iterate to get next batch-size samples as a batch.

IndexFetcher

Take a single index or a batch of indices to fetch samples as a batch.

IterableSampler

Internally samples elements.

SequentialSampler

Sequentially samples elements, used for datasets retrieved element by index.

BatchSampler

Yield batches of indices and expose the number of batches.

BaseDataLoader

Base class for TF DataLoaders.

DummyDataset

Dataset used for dummy data generation.

DummyDatasetV2

Dataset used for dummy_v2 data generation.

Functions

default_collate(batch)

Merge a batch of samples so that the outer dimension is the batch size.

Module Contents

neural_compressor.tensorflow.utils.data.default_collate(batch)[source]

Merge a batch of samples so that the outer dimension is the batch size.
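To illustrate what merging along an outer batch dimension means, here is a minimal sketch (not the library's actual implementation): per-sample arrays are stacked along a new leading axis, recursing into tuples such as (data, label) pairs.

```python
import numpy as np

def collate_sketch(batch):
    """Illustrative collate: stack samples along a new outer batch axis."""
    if isinstance(batch[0], tuple):
        # Transpose [(x1, y1), (x2, y2)] -> ([x1, x2], [y1, y2]) and collate each part.
        return tuple(collate_sketch(list(samples)) for samples in zip(*batch))
    return np.stack(batch)

samples = [(np.ones((3,)), 0), (np.zeros((3,)), 1)]
data, labels = collate_sketch(samples)
# data has shape (2, 3); labels has shape (2,)
```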

class neural_compressor.tensorflow.utils.data.IterableFetcher(dataset, collate_fn, drop_last, distributed)[source]

Iterate to get next batch-size samples as a batch.

class neural_compressor.tensorflow.utils.data.IndexFetcher(dataset, collate_fn, drop_last, distributed)[source]

Take a single index or a batch of indices to fetch samples as a batch.
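The idea behind index-based fetching can be sketched in plain Python (this is an illustrative re-implementation, not the class's code): map a batch of indices to samples via `__getitem__`, then merge them with a collate function.

```python
def index_fetch_sketch(dataset, indices, collate_fn=list):
    """Fetch dataset[i] for each index, then collate the results."""
    return collate_fn([dataset[i] for i in indices])

data = [10, 20, 30, 40]
batch = index_fetch_sketch(data, [1, 3])
# batch == [20, 40]
```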

class neural_compressor.tensorflow.utils.data.IterableSampler(dataset)[source]

Internally samples elements.

Used for datasets retrieved element by iterator. Yield None to act as a placeholder for each iteration.
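The placeholder behavior described above can be sketched as follows (an illustrative re-implementation, not the class's code): since an iterator-style dataset exposes no usable indices, the sampler yields `None` once per element.

```python
def iterable_sampler_sketch(dataset):
    """Yield None once per element as an index placeholder."""
    for _ in dataset:
        yield None

placeholders = list(iterable_sampler_sketch(range(4)))
# placeholders == [None, None, None, None]
```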

class neural_compressor.tensorflow.utils.data.SequentialSampler(dataset, distributed)[source]

Sequentially samples elements, used for datasets retrieved element by index.

class neural_compressor.tensorflow.utils.data.BatchSampler(sampler, batch_size, drop_last=True)[source]

Yield batches of indices and expose the number of batches.
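The batching behavior can be sketched in plain Python (an illustrative re-implementation, not the class's code): group a stream of indices into fixed-size batches, dropping the final partial batch when `drop_last` is true.

```python
def batch_sampler_sketch(sampler, batch_size, drop_last=True):
    """Group indices from `sampler` into lists of length `batch_size`."""
    batch = []
    for idx in sampler:
        batch.append(idx)
        if len(batch) == batch_size:
            yield batch
            batch = []
    if batch and not drop_last:
        yield batch  # emit the trailing partial batch only when kept

batches = list(batch_sampler_sketch(range(7), batch_size=3, drop_last=True))
# batches == [[0, 1, 2], [3, 4, 5]]
```

With `drop_last=False`, the same input would additionally yield the partial batch `[6]`.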

class neural_compressor.tensorflow.utils.data.BaseDataLoader(dataset, batch_size=1, last_batch='rollover', collate_fn=None, sampler=None, batch_sampler=None, num_workers=0, pin_memory=False, shuffle=False, distributed=False)[source]

Base class for TF DataLoaders.

Subclasses must implement _generate_dataloader, which creates a dataloader object from general parameters such as batch_size and sampler. Dynamic batching simply generates a new dataloader by resetting batch_size and last_batch.
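The subclassing contract implied above can be sketched in plain Python (class names and the slicing logic here are illustrative, not the library's code): the base class defers construction to `_generate_dataloader`, and a concrete subclass supplies the actual batching.

```python
class BaseLoaderSketch:
    """Defers iteration to a subclass-provided _generate_dataloader."""

    def __init__(self, dataset, batch_size=1):
        self.dataset = dataset
        self.batch_size = batch_size

    def __iter__(self):
        return self._generate_dataloader(self.dataset, self.batch_size)

    def _generate_dataloader(self, dataset, batch_size):
        raise NotImplementedError

class ListLoaderSketch(BaseLoaderSketch):
    def _generate_dataloader(self, dataset, batch_size):
        # Yield consecutive fixed-size slices; drop any trailing remainder.
        for i in range(0, len(dataset) - batch_size + 1, batch_size):
            yield dataset[i:i + batch_size]

loader = ListLoaderSketch(list(range(6)), batch_size=2)
# list(loader) == [[0, 1], [2, 3], [4, 5]]
```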

class neural_compressor.tensorflow.utils.data.DummyDataset(shape, low=-128.0, high=127.0, dtype='float32', label=True, transform=None, filter=None)[source]

Dataset used for dummy data generation.

This dataset constructs samples of a given shape; values are generated as low * stand_normal(0, 1) + high. (TODO: construct dummy data from a real dataset or an iteration of data.)
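A numpy sketch of the documented value generation, assuming "stand_normal(0, 1)" denotes a standard-normal draw (this is illustrative; the library may generate values differently):

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (4, 2)
low, high = -128.0, 127.0  # the documented defaults

# Each value is low * N(0, 1) + high, cast to the requested dtype.
values = (low * rng.standard_normal(shape) + high).astype("float32")
# With label=True, each sample is paired with a dummy label.
labels = np.ones(shape[0], dtype="float32")
# values.shape == (4, 2); labels.shape == (4,)
```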

class neural_compressor.tensorflow.utils.data.DummyDatasetV2(input_shape, label_shape=None, low=-128.0, high=127.0, dtype='float32', transform=None, filter=None)[source]

Dataset used for dummy_v2 data generation.

This dataset constructs samples from an input shape and a label shape; values are generated as low * stand_normal(0, 1) + high.
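A numpy sketch of the documented generation, again assuming "stand_normal(0, 1)" denotes a standard-normal draw (illustrative only): each sample is an (input, label) pair built from the two shapes.

```python
import numpy as np

rng = np.random.default_rng(0)
input_shape, label_shape = (3, 2), (1,)  # example shapes, not defaults
low, high = -128.0, 127.0

def make_sample():
    """Draw one (input, label) pair using low * N(0, 1) + high."""
    x = (low * rng.standard_normal(input_shape) + high).astype("float32")
    y = (low * rng.standard_normal(label_shape) + high).astype("float32")
    return x, y

x, y = make_sample()
# x.shape == (3, 2); y.shape == (1,)
```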