Einsum
Please read this post for a better understanding of einsum.
If you've ever written PyTorch code to build a neural network, then you have already written tensor contraction code. Tensor contraction is a fancy term for combining tensors to build a new tensor.
Dot product, cross product, matrix multiplication, element-wise matrix multiplication, and so on: all of these familiar operations are actually special cases of tensor contraction.
In this post, I will give you a powerful tool for expressing any tensor contraction in a single line (without any dirty unsqueeze, transpose, or axis swap).
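For example, a batched matrix-vector product normally needs an unsqueeze before the multiply and a squeeze after it; with einsum the whole contraction fits on one line. (This is a hypothetical sketch with made-up shapes, not code from the post.)

```python
import torch

A = torch.randn(32, 4, 5)   # batch of matrices, shape (batch, n, m)
x = torch.randn(32, 5)      # batch of vectors, shape (batch, m)

# The "dirty" way: reshape x so torch.bmm accepts it, then undo the reshape.
y_manual = torch.bmm(A, x.unsqueeze(-1)).squeeze(-1)   # shape (batch, n)

# The einsum way: describe the contraction directly.
y_einsum = torch.einsum('bnm,bm->bn', A, x)            # shape (batch, n)

assert torch.allclose(y_manual, y_einsum, atol=1e-6)
```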
Let's say we have two tensors $A$ and $B$, where the notation $A_{ij}$ means the element of $A$ at index $(i, j)$. Then we can express their contraction with einsum in a single expression.
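For instance, ordinary matrix multiplication, $C_{ij} = \sum_k A_{ik} B_{kj}$, becomes a single einsum call. (A minimal PyTorch sketch, assuming $A$ and $B$ are 2-D matrices with a shared inner dimension.)

```python
import torch

A = torch.randn(3, 4)
B = torch.randn(4, 5)

# C_ij = sum_k A_ik * B_kj: the repeated index k is summed over.
C = torch.einsum('ik,kj->ij', A, B)

assert torch.allclose(C, A @ B, atol=1e-6)
```

Indices that appear in the inputs but not in the output (here, k) are summed over; that single rule covers every operation listed above.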
Einsum may also be easier to optimize: it compresses several operations into one compact expression, which gives the compiler more opportunities to optimize.
Also, it is beautiful!
You can try out some examples!
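Here are a few common operations written as einsum expressions that you can paste into a Python session and play with (the shapes are arbitrary choices for illustration):

```python
import torch

a = torch.randn(5)
b = torch.randn(5)
M = torch.randn(5, 5)
X = torch.randn(2, 3, 4)
Y = torch.randn(2, 4, 5)

torch.einsum('i,i->', a, b)         # dot product: sum_i a_i * b_i
torch.einsum('i,j->ij', a, b)       # outer product: C_ij = a_i * b_j
torch.einsum('ij,ij->ij', M, M)     # element-wise multiplication (nothing summed)
torch.einsum('ij->ji', M)           # transpose
torch.einsum('ii->', M)             # trace: sum_i M_ii
torch.einsum('bik,bkj->bij', X, Y)  # batch matrix multiplication
```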