Relay Core Tensor Operators

This page lists the core tensor operator primitives pre-defined in tvm.relay. These primitives cover typical deep learning workloads: they can represent workloads imported from front-end frameworks and provide basic building blocks for optimization. Because deep learning is a fast-evolving field, some operators may not yet be listed here.

Note

This document lists the function signatures of these operators as exposed in the Python frontend.

Overview of Operators

Level 1: Basic Operators

This level enables fully connected multi-layer perceptrons.

tvm.relay.log

Compute elementwise log of data.

tvm.relay.sqrt

Compute elementwise sqrt of data.

tvm.relay.rsqrt

Compute elementwise rsqrt of data.

tvm.relay.exp

Compute elementwise exp of data.

tvm.relay.sigmoid

Compute elementwise sigmoid of data.

tvm.relay.add

Addition with numpy-style broadcasting.

tvm.relay.subtract

Subtraction with numpy-style broadcasting.

tvm.relay.multiply

Multiplication with numpy-style broadcasting.

tvm.relay.divide

Division with numpy-style broadcasting.

tvm.relay.mod

Mod with numpy-style broadcasting.

tvm.relay.tanh

Compute element-wise tanh of data.

tvm.relay.concatenate

Concatenate the input tensors along the given axis.

tvm.relay.expand_dims

Insert num_newaxis axes at the position given by axis.

tvm.relay.nn.softmax

Computes softmax.

tvm.relay.nn.log_softmax

Computes log softmax.

tvm.relay.nn.relu

Rectified linear unit.

tvm.relay.nn.dropout

Applies the dropout operation to the input array.

tvm.relay.nn.batch_norm

Batch normalization layer (Ioffe and Szegedy, 2015).

tvm.relay.nn.bias_add

bias_add operator.
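
As a minimal sketch, the Level 1 operators can be composed directly from the Python frontend; the shapes below are illustrative:

    import tvm
    from tvm import relay

    # Two input tensors with broadcast-compatible shapes (shapes are illustrative).
    x = relay.var("x", shape=(4, 8), dtype="float32")
    b = relay.var("b", shape=(8,), dtype="float32")

    # Broadcasting add followed by elementwise activations from Level 1.
    y = relay.nn.relu(relay.add(x, b))
    z = relay.nn.softmax(y, axis=-1)

    func = relay.Function([x, b], z)
    mod = tvm.IRModule.from_expr(func)
    print(mod)  # prints the Relay IR for the expression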

Level 2: Convolutions

This level enables typical convnet models.

tvm.relay.nn.conv2d

2D convolution.

tvm.relay.nn.conv2d_transpose

2D transposed convolution operator.

tvm.relay.nn.conv3d

3D convolution.

tvm.relay.nn.conv3d_transpose

3D transposed convolution operator.

tvm.relay.nn.dense

Dense operator.

tvm.relay.nn.max_pool2d

2D maximum pooling operator.

tvm.relay.nn.max_pool3d

3D maximum pooling operator.

tvm.relay.nn.avg_pool2d

2D average pooling operator.

tvm.relay.nn.avg_pool3d

3D average pooling operator.

tvm.relay.nn.global_max_pool2d

2D global maximum pooling operator.

tvm.relay.nn.global_avg_pool2d

2D global average pooling operator.

tvm.relay.nn.upsampling

Upsampling.

tvm.relay.nn.upsampling3d

3D Upsampling.

tvm.relay.nn.batch_flatten

Flattens the input into a 2-D array by collapsing all axes except the batch axis.

tvm.relay.nn.pad

Pads each axis of the input tensor by the specified widths.

tvm.relay.nn.lrn

This operator takes data as input and does local response normalization.

tvm.relay.nn.l2_normalize

Perform L2 normalization on the input data.

tvm.relay.nn.bitpack

Tensor packing for bitserial operations.

tvm.relay.nn.bitserial_dense

Bitserial Dense operator.

tvm.relay.nn.bitserial_conv2d

2D convolution using bitserial computation.

tvm.relay.nn.contrib_conv2d_winograd_without_weight_transform

2D convolution with winograd algorithm.

tvm.relay.nn.contrib_conv2d_winograd_weight_transform

Weight Transformation part for 2D convolution with winograd algorithm.

tvm.relay.nn.contrib_conv3d_winograd_without_weight_transform

3D convolution with winograd algorithm.

tvm.relay.nn.contrib_conv3d_winograd_weight_transform

Weight Transformation part for 3D convolution with winograd algorithm.
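
A small convolutional block built from Level 2 operators might look as follows; all shapes, layouts, and hyperparameters below are illustrative:

    import tvm
    from tvm import relay

    data = relay.var("data", shape=(1, 3, 32, 32), dtype="float32")     # NCHW input
    weight = relay.var("weight", shape=(16, 3, 3, 3), dtype="float32")  # OIHW kernel

    # conv2d -> relu -> max_pool2d -> batch_flatten
    conv = relay.nn.conv2d(data, weight, strides=(1, 1), padding=(1, 1),
                           channels=16, kernel_size=(3, 3))
    act = relay.nn.relu(conv)
    pool = relay.nn.max_pool2d(act, pool_size=(2, 2), strides=(2, 2))
    flat = relay.nn.batch_flatten(pool)

    func = relay.Function(relay.analysis.free_vars(flat), flat)
    mod = tvm.IRModule.from_expr(func)
    print(mod)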

Level 3: Additional Math And Transform Operators

This level enables additional math and transform operators.

tvm.relay.nn.leaky_relu

This operator takes data as input and applies the leaky version of a Rectified Linear Unit.

tvm.relay.nn.prelu

This operator takes data as input and applies the parametric version of a Rectified Linear Unit, where the slope for negative inputs is a learnable parameter.

tvm.relay.reshape

Reshape the input array.

tvm.relay.reshape_like

Reshapes the input tensor by the size of another tensor.

tvm.relay.copy

Copy a tensor.

tvm.relay.transpose

Permutes the dimensions of an array.

tvm.relay.squeeze

Squeeze axes in the array.

tvm.relay.floor

Compute element-wise floor of data.

tvm.relay.ceil

Compute element-wise ceil of data.

tvm.relay.sign

Compute element-wise sign of data.

tvm.relay.trunc

Compute element-wise trunc of data.

tvm.relay.clip

Clip the elements of the input between a_min and a_max.

tvm.relay.round

Compute element-wise round of data.

tvm.relay.abs

Compute element-wise absolute of data.

tvm.relay.negative

Compute element-wise negative of data.

tvm.relay.take

Take elements from an array along an axis.

tvm.relay.zeros

Fill array with zeros.

tvm.relay.zeros_like

Returns an array of zeros, with same type and shape as the input.

tvm.relay.ones

Fill array with ones.

tvm.relay.ones_like

Returns an array of ones, with same type and shape as the input.

tvm.relay.gather

Gather values along given axis from given indices.

tvm.relay.gather_nd

Gather elements or slices from data and store to a tensor whose shape is defined by indices.

tvm.relay.full

Fill array with scalar value.

tvm.relay.full_like

Return an array filled with a scalar value, with the same shape and type as the input array.

tvm.relay.cast

Cast input tensor to data type.

tvm.relay.reinterpret

Reinterpret input tensor to data type.

tvm.relay.split

Split input tensor along axis by sections or indices.

tvm.relay.arange

Return evenly spaced values within a given interval.

tvm.relay.meshgrid

Create coordinate matrices from coordinate vectors.

tvm.relay.stack

Join a sequence of arrays along a new axis.

tvm.relay.repeat

Repeats elements of an array.

tvm.relay.tile

Repeats the whole array multiple times.

tvm.relay.reverse

Reverses the order of elements along given axis while preserving array shape.

tvm.relay.reverse_sequence

Reverse the tensor for variable length slices.

tvm.relay.unravel_index

Convert a flat index or array of flat indices into a tuple of coordinate arrays.

tvm.relay.sparse_to_dense

Converts a sparse representation into a dense tensor.
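
A short sketch combining a few of the transform operators above; shapes and dtypes are illustrative:

    import tvm
    from tvm import relay

    x = relay.var("x", shape=(2, 3, 4), dtype="float32")

    # Rearrange and reshape the tensor, then cast to a different dtype.
    t = relay.transpose(x, axes=(0, 2, 1))   # (2, 3, 4) -> (2, 4, 3)
    r = relay.reshape(t, newshape=(2, 12))   # flatten the last two axes
    c = relay.cast(r, dtype="float16")

    mod = tvm.IRModule.from_expr(relay.Function([x], c))
    print(mod)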

Level 4: Broadcast and Reductions

tvm.relay.right_shift

Right shift with numpy-style broadcasting.

tvm.relay.left_shift

Left shift with numpy-style broadcasting.

tvm.relay.equal

Broadcasted elementwise test for (lhs == rhs).

tvm.relay.not_equal

Broadcasted elementwise test for (lhs != rhs).

tvm.relay.greater

Broadcasted elementwise test for (lhs > rhs).

tvm.relay.greater_equal

Broadcasted elementwise test for (lhs >= rhs).

tvm.relay.less

Broadcasted elementwise test for (lhs < rhs).

tvm.relay.less_equal

Broadcasted elementwise test for (lhs <= rhs).

tvm.relay.all

Computes the logical AND of boolean array elements over given axes.

tvm.relay.any

Computes the logical OR of boolean array elements over given axes.

tvm.relay.logical_and

Logical AND with numpy-style broadcasting.

tvm.relay.logical_or

Logical OR with numpy-style broadcasting.

tvm.relay.logical_not

Compute element-wise logical not of data.

tvm.relay.logical_xor

Logical XOR with numpy-style broadcasting.

tvm.relay.maximum

Maximum with numpy-style broadcasting.

tvm.relay.minimum

Minimum with numpy-style broadcasting.

tvm.relay.power

Power with numpy-style broadcasting.

tvm.relay.where

Select elements from either x or y depending on the condition.

tvm.relay.argmax

Returns the indices of the maximum values along an axis.

tvm.relay.argmin

Returns the indices of the minimum values along an axis.

tvm.relay.sum

Computes the sum of array elements over given axes.

tvm.relay.max

Computes the max of array elements over given axes.

tvm.relay.min

Computes the min of array elements over given axes.

tvm.relay.mean

Computes the mean of array elements over given axes.

tvm.relay.variance

Computes the variance of data over given axes.

tvm.relay.std

Computes the standard deviation of data over given axes.

tvm.relay.mean_variance

Computes the mean and variance of data over given axes.

tvm.relay.mean_std

Computes the mean and standard deviation of data over given axes.

tvm.relay.prod

Computes the products of array elements over given axes.

tvm.relay.strided_slice

Strided slice of an array.

tvm.relay.broadcast_to

Broadcast the input array to the provided shape, keeping its type.
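
A sketch combining a broadcasted comparison, selection, and a reduction; shapes are illustrative:

    import tvm
    from tvm import relay

    x = relay.var("x", shape=(2, 3), dtype="float32")
    y = relay.var("y", shape=(2, 3), dtype="float32")

    # Elementwise comparison, selection, and a reduction over an axis.
    cond = relay.greater(x, y)                     # boolean mask, shape (2, 3)
    sel = relay.where(cond, x, y)                  # elementwise maximum via where
    total = relay.sum(sel, axis=1, keepdims=True)  # row sums, shape (2, 1)

    mod = tvm.IRModule.from_expr(relay.Function([x, y], total))
    print(mod)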

Level 5: Vision/Image Operators

tvm.relay.image.resize1d

Image resize1d operator.

tvm.relay.image.resize2d

Image resize2d operator.

tvm.relay.image.resize3d

Image resize3d operator.

tvm.relay.image.crop_and_resize

Crop input images and resize them.

tvm.relay.image.dilation2d

Morphological Dilation 2D.

tvm.relay.vision.multibox_prior

Generate prior (anchor) boxes from data, sizes, and ratios.

tvm.relay.vision.multibox_transform_loc

Location transformation for multibox detection.

tvm.relay.vision.nms

Non-maximum suppression operations.

tvm.relay.vision.yolo_reorg

Yolo reorg operation used in darknet models.
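
A minimal sketch using the image dialect to resize a feature map; the shape, layout, and resize method are illustrative choices:

    import tvm
    from tvm import relay

    # Upscale a 16x16 feature map to 32x32 with bilinear interpolation.
    data = relay.var("data", shape=(1, 3, 16, 16), dtype="float32")  # NCHW
    resized = relay.image.resize2d(data, size=(32, 32), layout="NCHW",
                                   method="linear")

    mod = tvm.IRModule.from_expr(relay.Function([data], resized))
    print(mod)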

Level 6: Algorithm Operators

tvm.relay.argsort

Performs sorting along the given axis and returns an array of indices, with the same shape as the input, that index the data in sorted order.

tvm.relay.topk

Get the top k elements in an input tensor along the given axis.
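
A sketch of sorting and top-k selection; the input shape and k are illustrative:

    import tvm
    from tvm import relay

    scores = relay.var("scores", shape=(1, 100), dtype="float32")

    # Indices that would sort the scores in descending order.
    order = relay.argsort(scores, axis=-1, is_ascend=False)

    # Top-5 values and their indices; ret_type="both" returns a tuple wrapper.
    out = relay.topk(scores, k=5, axis=-1, ret_type="both")
    values, indices = out[0], out[1]

    mod = tvm.IRModule.from_expr(
        relay.Function([scores], relay.Tuple([order, values, indices])))
    print(mod)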

Level 10: Temporary Operators

This level supports backpropagation of broadcast operators. It is temporary.

tvm.relay.broadcast_to_like

Broadcast the input to the shape and type of another array.

tvm.relay.collapse_sum_like

Reduce the input by summation so that the result matches the shape and type of another array.

tvm.relay.slice_like

Slice the first input with respect to the second input.

tvm.relay.shape_of

Get shape of a tensor.

tvm.relay.ndarray_size

Get number of elements of input tensor.

tvm.relay.layout_transform

Transform the layout of a tensor.

tvm.relay.device_copy

Copy data from the source device to the destination device.

tvm.relay.annotation.on_device

Annotates a body expression with device constraints.

tvm.relay.reverse_reshape

Reshapes the input array where the special values are inferred from right to left.

tvm.relay.sequence_mask

Sets all elements outside the expected length of the sequence to a constant value.

tvm.relay.nn.batch_matmul

Compute batch matrix multiplication of tensor_a and tensor_b.

tvm.relay.nn.adaptive_max_pool2d

2D adaptive max pooling operator.

tvm.relay.nn.adaptive_avg_pool2d

2D adaptive average pooling operator.

tvm.relay.one_hot

Returns a one-hot tensor where the locations represented by indices take value on_value, while other locations take value off_value.
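
A sketch of the broadcast/collapse pair these operators support in gradient computation; shapes are illustrative:

    import tvm
    from tvm import relay

    # A gradient-style pattern: broadcast a bias to the output shape, then
    # collapse the upstream gradient back down to the bias shape.
    grad = relay.var("grad", shape=(8, 4), dtype="float32")
    bias = relay.var("bias", shape=(4,), dtype="float32")

    bcast = relay.broadcast_to_like(bias, grad)      # (4,)   -> (8, 4)
    bias_grad = relay.collapse_sum_like(grad, bias)  # (8, 4) -> (4,)

    mod = tvm.IRModule.from_expr(
        relay.Function([grad, bias], relay.Tuple([bcast, bias_grad])))
    print(mod)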

Level 11: Dialect Operators

This level supports dialect operators.

tvm.relay.qnn.op.add

Quantized addition with numpy-style broadcasting.

tvm.relay.qnn.op.batch_matmul

Computes quantized batch matrix multiplication of x and y, where x and y are batched tensors.

tvm.relay.qnn.op.concatenate

Concatenate the quantized input tensors along the given axis.

tvm.relay.qnn.op.conv2d

Quantized 2D convolution.

tvm.relay.qnn.op.conv2d_transpose

This operator deconvolves quantized data with a quantized kernel.

tvm.relay.qnn.op.dense

Quantized dense operator.

tvm.relay.qnn.op.dequantize

Dequantize operator. Takes quantized int8 or uint8 data as input and produces dequantized float32 output.

tvm.relay.qnn.op.mul

Quantized multiplication with numpy-style broadcasting.

tvm.relay.qnn.op.quantize

Quantize operator. Takes float32 input and produces quantized int8 or uint8 output.

tvm.relay.qnn.op.requantize

Requantize operator.

tvm.relay.qnn.op.rsqrt

Quantized reciprocal square root.

tvm.relay.qnn.op.simulated_dequantize

Simulated dequantize operator. Mimics the dequantize op but allows more flexibility in valid inputs and always outputs the same type as the input.

tvm.relay.qnn.op.simulated_quantize

Simulated quantize operator. Mimics the quantize op but allows more flexibility in valid inputs and always outputs the same type as the input.

tvm.relay.qnn.op.subtract

Quantized subtraction with numpy-style broadcasting.
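
A minimal sketch that round-trips a tensor through the quantize/dequantize pair; the scale and zero point below are illustrative constants, not values from real calibration:

    import tvm
    from tvm import relay

    data = relay.var("data", shape=(1, 64), dtype="float32")
    scale = relay.const(0.05, "float32")     # illustrative quantization scale
    zero_point = relay.const(0, "int32")     # illustrative zero point

    # Quantize to int8, then dequantize back to float32.
    q = relay.qnn.op.quantize(data, scale, zero_point, out_dtype="int8")
    dq = relay.qnn.op.dequantize(q, scale, zero_point)

    mod = tvm.IRModule.from_expr(relay.Function([data], dq))
    print(mod)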