Layout expression to describe the data organization of a tensor, and BijectiveLayout to map between two data layouts. More...
#include <tvm/tir/expr.h>
#include <tvm/tir/op.h>
#include <algorithm>
#include <sstream>
#include <string>
#include <utility>
#include <vector>
Classes
class tvm::tir::LayoutAxis
class tvm::tir::LayoutNode
 Layout describes how data is organized within an N-dimensional tensor. A layout string is composed of uppercase letters, lowercase letters, and numbers, where an uppercase letter indicates a primal axis and the corresponding lowercase letter, together with a factor size, indicates a subordinate axis. For example, NCHW16c describes a 5-D tensor of [batch_size, channel, height, width, channel_block]. Here the subordinate axis channel_block=16 is the factor size of the primal axis C (channel). A layout is also defined for scalars: both its name and its axes have size 0. More...
class tvm::tir::Layout
 Managed reference to LayoutNode. More...
class tvm::tir::BijectiveLayoutNode
class tvm::tir::BijectiveLayout
 Bijective function mapping for data layout transformation. Given two Layouts, BijectiveLayout builds and stores the mapping rules and provides an API to transform an N-dimensional tensor from source indices (i0, i1, ..., im) to destination indices (j0, j1, ..., jm). More...
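The layout-string convention above can be illustrated with a small standalone parser. This is a sketch in plain C++, not the TVM API: the `Axis` struct and `ParseLayout` function are hypothetical names, written only to show how a string such as NCHW16c decomposes into primal axes (uppercase, no factor) and subordinate axes (lowercase, preceded by their factor size).

```cpp
#include <cctype>
#include <string>
#include <vector>

// One axis of a layout: an uppercase primal axis (factor == 0), or a
// lowercase subordinate axis carrying the split factor (e.g. 16 in "16c").
struct Axis {
  char name;    // 'N', 'C', 'H', 'W', ... or 'c' for a subordinate axis
  int factor;   // 0 for primal axes, the block size for subordinate axes
};

// Minimal parser for layout strings such as "NCHW16c": digits accumulate
// into a factor that is attached to the next lowercase letter.
std::vector<Axis> ParseLayout(const std::string& layout) {
  std::vector<Axis> axes;
  int factor = 0;
  for (char ch : layout) {
    if (std::isdigit(static_cast<unsigned char>(ch))) {
      factor = factor * 10 + (ch - '0');
    } else if (std::isupper(static_cast<unsigned char>(ch))) {
      axes.push_back({ch, 0});   // primal axis, no factor
      factor = 0;
    } else {
      axes.push_back({ch, factor});  // subordinate axis with its factor
      factor = 0;
    }
  }
  return axes;
}
```

Parsing "NCHW16c" this way yields four primal axes N, C, H, W and one subordinate axis c with factor 16, matching the 5-D [batch_size, channel, height, width, channel_block] interpretation described above.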
Namespaces
tvm
 The top-level TVM namespace.
tvm::tir
 Layout expression to describe the data organization of a tensor, and BijectiveLayout to map between two data layouts.
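To make the index mapping concrete, here is a self-contained C++ sketch of the transformation a bijective mapping between "NCHW" and "NCHW16c" performs. The function names `ForwardIndex` and `BackwardIndex` echo, but do not call, the TVM API: going forward, the primal channel axis splits into a block index c / 16 and a subordinate index c % 16; going backward they fuse again as cb * 16 + ci.

```cpp
#include <array>

// Forward: NCHW indices -> NCHW16c indices. The channel index c splits
// into a channel-block index (c / factor) and an in-block index
// (c % factor).
std::array<long, 5> ForwardIndex(long n, long c, long h, long w,
                                 long factor = 16) {
  return {n, c / factor, h, w, c % factor};
}

// Backward: NCHW16c indices -> NCHW indices. Fuses the channel-block
// index cb and the in-block index ci back into the primal channel axis.
std::array<long, 4> BackwardIndex(long n, long cb, long h, long w,
                                  long ci, long factor = 16) {
  return {n, cb * factor + ci, h, w};
}
```

Because the two functions are inverses of each other on valid indices, round-tripping an index through forward then backward recovers the original coordinates, which is the bijectivity property the class name refers to.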