>>13896329
There is no reason to learn a particular rank of tensor. There is way more insight in understanding the general concept.
The Wikipedia page should literally suffice for this.
There are only a few key concepts you need to understand in order to work with them.
The abstract definition via multilinear maps and universal properties is not needed.
At the end of the day you will come to understand (unless you're a mathematician) that "tensors are
objects which transform as tensors". This phrasing is usually frowned upon, but in the end it really captures the essence.
You probably already know vectors. Vectors are a special case of tensor, namely rank (1,0).
And the point is: given a basis $\{e_i\}$,
a vector is represented by its components with respect to that basis, $v = v^i e_i$.
Given a new basis $\tilde e_j = A^i{}_j e_i$ with an invertible change-of-basis matrix $A$,
we get $v = \tilde v^j \tilde e_j$ with $\tilde v^j = (A^{-1})^j{}_i v^i$.
So we see that the components transform with the inverse of the basis transformation: "contravariant".
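To make that concrete, here is a small numerical sketch (the basis and matrix below are made up for illustration, they are not from the post): build a new basis from an old one with an invertible matrix A and check that the components pick up A^{-1} while the geometric vector itself stays the same.

```python
import numpy as np

# Old basis vectors are the columns of E; new basis e'_j = A^i_j e_i,
# i.e. the columns of E @ A.  E and A are arbitrary invertible examples.
E = np.array([[1., 0., 1.],
              [0., 1., 1.],
              [0., 0., 1.]])
A = np.array([[2., 1., 0.],
              [0., 1., 0.],
              [1., 0., 3.]])
E_new = E @ A

v_old = np.array([1., 2., 3.])        # components v^i in the old basis
v_new = np.linalg.solve(A, v_old)     # contravariant: components get A^{-1}

# Both component sets describe the same geometric vector:
assert np.allclose(E @ v_old, E_new @ v_new)
```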
In the same way a one-form (covector) has the opposite transformation law for its components, $\tilde\omega_j = A^i{}_j \omega_i$: "covariant".
Now if you have a tensor product (or a linear combination of tensor products) of forms and vectors, each index of the components transforms like a vector or form component, e.g. for a rank (1,1) tensor $\tilde T^a{}_b = (A^{-1})^a{}_i A^j{}_b T^i{}_j$,
precisely because you can pull scalars and sums through all the tensor products (hence multilinear).
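A sketch of the same check for a rank (1,1) tensor, reusing the same kind of change-of-basis matrix A as in the example above (again, the numbers are mine): the upper index gets A^{-1}, the lower index gets A, and the basis-independent action on a vector comes out the same either way.

```python
import numpy as np

A = np.array([[2., 1., 0.],
              [0., 1., 0.],
              [1., 0., 3.]])
A_inv = np.linalg.inv(A)

T_old = np.arange(9.).reshape(3, 3)   # components T^i_j in the old basis

# One factor per index: A^{-1} for the contravariant slot, A for the covariant one.
T_new = np.einsum('ai,ij,jb->ab', A_inv, T_old, A)

# Basis independence: apply T and then change basis, or change basis first
# and apply the transformed components; the result must agree.
v_old = np.array([1., 2., 3.])
v_new = A_inv @ v_old
assert np.allclose(A_inv @ (T_old @ v_old), T_new @ v_new)
```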
This is the basic concept. Notable things beyond that: 1) The tensor product of two vector spaces is much larger than the direct sum, since a basis of the product consists of all tensor products of basis vectors of the separate spaces, so the dimensions multiply instead of adding.
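For the dimension count, a quick sketch (the dimensions 3 and 4 are just an example I picked): the direct sum has 3 + 4 = 7 basis vectors, the tensor product has 3 * 4 = 12, one e_i ⊗ f_j for every pair, and np.kron realizes exactly those products.

```python
import numpy as np

e = np.eye(3)   # basis of V, dim 3
f = np.eye(4)   # basis of W, dim 4

# All pairwise tensor products e_i (x) f_j form a basis of V (x) W.
product_basis = np.array([np.kron(e[i], f[j])
                          for i in range(3) for j in range(4)])

print(product_basis.shape[0])                 # 12 = 3 * 4 basis elements
print(np.linalg.matrix_rank(product_basis))   # 12, they are linearly independent
# compare with dim(V ⊕ W) = 3 + 4 = 7
```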