util.misc
=========

.. py:module:: util.misc

.. autoapi-nested-parse::

   Misc functions, including distributed helpers.

   Mostly copy-paste from torchvision references.

Classes
-------

.. autoapisummary::

   util.misc.SmoothedValue

Functions
---------

.. autoapisummary::

   util.misc.all_gather
   util.misc.reduce_dict
   util.misc.setup_for_distributed
   util.misc.accuracy
   util.misc.interpolate

Module Contents
---------------

.. py:class:: SmoothedValue(window_size=20, fmt=None)

   Track a series of values and provide access to smoothed values over a
   window or the global series average.

   .. py:method:: synchronize_between_processes()

      Warning: does not synchronize the deque!

.. py:function:: all_gather(data)

   Run all_gather on arbitrary picklable data (not necessarily tensors).

   :param data: any picklable object
   :returns: list of data gathered from each rank
   :rtype: list[data]

.. py:function:: reduce_dict(input_dict, average=True)

   Reduce the values in the dictionary from all processes so that all
   processes have the averaged results. Returns a dict with the same fields
   as input_dict, after reduction.

   :param input_dict: all the values will be reduced
   :type input_dict: dict
   :param average: whether to do average or sum
   :type average: bool

.. py:function:: setup_for_distributed(is_master)

   Disables printing when not in the master process.

.. py:function:: accuracy(output, target, topk=(1, ))

   Computes the precision@k for the specified values of k.

.. py:function:: interpolate(input: torch.Tensor, size: Optional[List[int]] = None, scale_factor: Optional[float] = None, mode: str = 'nearest', align_corners: Optional[bool] = None) -> torch.Tensor

   Equivalent to nn.functional.interpolate, but with support for empty batch
   sizes. This will eventually be supported natively by PyTorch, at which
   point this function can go away.
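For orientation, a minimal pure-Python sketch of the windowed-smoothing idea behind ``SmoothedValue``. This is not the actual implementation: the real class also supports a ``fmt`` string and cross-process synchronization, and the names below are illustrative.

```python
from collections import deque


class SmoothedValueSketch:
    """Illustrative sketch: values in a fixed window plus global totals."""

    def __init__(self, window_size=20):
        self.deque = deque(maxlen=window_size)  # only the last window_size values
        self.total = 0.0  # global running sum
        self.count = 0    # global number of updates

    def update(self, value):
        self.deque.append(value)
        self.total += value
        self.count += 1

    @property
    def avg(self):
        # smoothed value over the window
        return sum(self.deque) / len(self.deque)

    @property
    def global_avg(self):
        # average over the entire series
        return self.total / self.count
```

Because the deque has a fixed ``maxlen``, old values fall out of the window automatically while ``total``/``count`` keep the global series average.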
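``all_gather`` accepts picklable objects rather than tensors: conceptually, each rank serializes its object and receives every rank's bytes back as a list. A single-process mock of that contract (no ``torch.distributed`` involved; ``world_size`` here just simulates the number of ranks, all holding the same object):

```python
import pickle


def all_gather_mock(data, world_size=1):
    """Mock of the all_gather contract in a single process: every rank
    contributes `data` and gets back a list with one entry per rank
    (here: world_size copies of the same object)."""
    payload = pickle.dumps(data)       # arbitrary picklable data -> bytes
    gathered = [payload] * world_size  # stand-in for the collective exchange
    return [pickle.loads(p) for p in gathered]
```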
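The semantics of ``reduce_dict`` can be illustrated without a process group: reduce each key's value across ranks by sum, optionally dividing by the world size. In this mock, ``dicts_per_rank`` stands in for the per-process ``input_dict`` held by each rank:

```python
def reduce_dict_mock(dicts_per_rank, average=True):
    """Mock of reduce_dict semantics: element-wise sum (or average)
    of the values under each key, across all ranks."""
    world_size = len(dicts_per_rank)
    out = {}
    for key in sorted(dicts_per_rank[0]):
        total = sum(d[key] for d in dicts_per_rank)
        out[key] = total / world_size if average else total
    return out
```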
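A sketch of the common approach behind ``setup_for_distributed``: replace ``builtins.print`` with a wrapper that stays silent on non-master processes. The ``force`` keyword shown here is an assumed escape hatch for when a worker really must print:

```python
import builtins


def setup_for_distributed_sketch(is_master):
    """Sketch: swap builtins.print for a wrapper that only emits output
    on the master process, unless force=True is passed."""
    builtin_print = builtins.print

    def print_wrapper(*args, **kwargs):
        force = kwargs.pop("force", False)
        if is_master or force:
            builtin_print(*args, **kwargs)

    builtins.print = print_wrapper
```

Overriding the builtin means all existing ``print`` call sites are silenced at once, with no changes to the rest of the codebase.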
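The precision@k logic computed by ``accuracy`` can be sketched in pure Python (the real function operates on torch tensors): a sample counts as correct at ``k`` if its true class index appears among the ``k`` highest-scoring classes.

```python
def topk_accuracy_sketch(scores, targets, topk=(1,)):
    """Pure-Python sketch of precision@k: the fraction of samples whose
    true class is among the k highest-scoring classes, as a percentage."""
    results = []
    for k in topk:
        correct = 0
        for row, target in zip(scores, targets):
            # indices of the k largest scores in this row
            top = sorted(range(len(row)), key=lambda i: row[i], reverse=True)[:k]
            correct += target in top
        results.append(100.0 * correct / len(targets))
    return results
```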
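The empty-batch workaround in ``interpolate`` only needs the output spatial shape, which follows the same rule ``nn.functional.interpolate`` uses: either take the explicit ``size``, or scale each spatial dimension and take the floor. A sketch of that size computation (helper name is illustrative):

```python
import math


def output_spatial_size(spatial_shape, size=None, scale_factor=None):
    """Sketch: compute the output spatial dims as interpolate would, so an
    empty-batch tensor can be reshaped directly without calling the kernel.
    Expects either an explicit size or a scale_factor."""
    if size is not None:
        return list(size)
    return [math.floor(dim * scale_factor) for dim in spatial_shape]
```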