ShapeNormalizationBlock

class maze.perception.blocks.shape_normalization.ShapeNormalizationBlock(*args: Any, **kwargs: Any)

Perception block that normalizes the input tensor dimensions and de-normalizes the output tensor dimensions.

Examples of blocks that need to implement this interface are dense layers (batch-dim, feature-dim) and convolution blocks (batch-dim, feature-dim, row-dim, column-dim).

Parameters
  • in_keys – Keys identifying the input tensors.

  • out_keys – Keys identifying the output tensors.

  • in_shapes – List of input shapes.

  • in_num_dims – Required number of dimensions for the corresponding input.

  • out_num_dims – Required number of dimensions for the corresponding output.
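
For intuition, the plain PyTorch snippet below (illustrative only, not Maze code; the tensor names and sizes are assumptions) shows the reshaping a dense layer needs when its input carries an extra leading dimension:

    import torch
    from torch import nn

    # A dense layer expects a 2-D input: (batch-dim, feature-dim).
    dense = nn.Linear(in_features=16, out_features=32)

    # The observation, however, may carry an extra leading dimension,
    # e.g. (batch-dim, time-dim, feature-dim).
    x = torch.randn(8, 5, 16)

    # "Normalize" the shape: flatten the leading dimensions into one batch dimension.
    x_flat = x.reshape(-1, x.shape[-1])       # (40, 16)
    y_flat = dense(x_flat)                    # (40, 32)

    # "De-normalize": restore the original leading dimensions on the output.
    y = y_flat.reshape(*x.shape[:-1], -1)     # (8, 5, 32)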

forward(block_input: Dict[str, torch.Tensor]) → Dict[str, torch.Tensor]

(overrides PerceptionBlock)

Implementation of the PerceptionBlock interface: normalizes the input tensor shapes, delegates the computation to normalized_forward(), and de-normalizes the output tensor shapes.
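
Conceptually, forward() wraps the abstract normalized_forward() between the normalization and de-normalization steps. The sketch below only illustrates this contract under the assumption of a single input/output key and that extra leading dimensions are flattened into the batch dimension; it is not the actual Maze implementation:

    from typing import Dict

    import torch


    def forward_contract_sketch(block: "ShapeNormalizationBlock",
                                block_input: Dict[str, torch.Tensor],
                                in_key: str, out_key: str,
                                in_num_dims: int) -> Dict[str, torch.Tensor]:
        """Illustration of the assumed forward() contract, not the Maze source."""
        tensor = block_input[in_key]

        # Normalize: flatten extra leading dimensions until the tensor has in_num_dims dims.
        lead_shape = tensor.shape[:tensor.ndim - in_num_dims + 1]
        normalized = tensor.reshape(-1, *tensor.shape[tensor.ndim - in_num_dims + 1:])

        # Block-specific computation on the shape-normalized tensor.
        out = block.normalized_forward({in_key: normalized})[out_key]

        # De-normalize: restore the original leading dimensions on the output tensor.
        return {out_key: out.reshape(*lead_shape, *out.shape[1:])}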

abstract normalized_forward(block_input: Dict[str, torch.Tensor]) → Dict[str, torch.Tensor]

Shape-normalized forward pass, called within the actual forward pass of this block.

Parameters

block_input – The block’s shape-normalized input dictionary.

Returns

The block’s shape-normalized output dictionary.
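
A minimal sketch of a concrete subclass, assuming the constructor parameters listed above can be forwarded to super().__init__() under the same names and that in_keys/out_keys are stored as attributes; the class name, keys, and layer sizes are hypothetical:

    from typing import Dict, List, Sequence

    import torch
    from torch import nn

    from maze.perception.blocks.shape_normalization import ShapeNormalizationBlock


    class DenseNormalizedBlock(ShapeNormalizationBlock):
        """Hypothetical dense block operating on shape-normalized 2-D tensors
        (batch-dim, feature-dim); the inherited forward() handles any extra
        leading dimensions."""

        def __init__(self, in_keys: List[str], out_keys: List[str],
                     in_shapes: List[Sequence[int]], out_features: int):
            # Passing in_num_dims/out_num_dims as plain ints is an assumption of this sketch.
            super().__init__(in_keys=in_keys, out_keys=out_keys, in_shapes=in_shapes,
                             in_num_dims=2, out_num_dims=2)
            self.dense = nn.Linear(in_features=in_shapes[0][-1], out_features=out_features)

        def normalized_forward(self, block_input: Dict[str, torch.Tensor]) -> Dict[str, torch.Tensor]:
            # block_input is already shape normalized to (batch-dim, feature-dim).
            return {self.out_keys[0]: self.dense(block_input[self.in_keys[0]])}

With in_keys=['observation'], out_keys=['latent'], and in_shapes=[(16,)] (again hypothetical), the inherited forward() would accept {'observation': torch.randn(8, 5, 16)} and return the 'latent' tensor with the leading (8, 5) dimensions restored.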