# Tensors, Physics Approach
### Key ideas
* Geometric vectors are tensors. They have one physical axis, so they are said to be tensors of *rank 1*.
* A scalar is said to be a rank-0 tensor. Note that rank is different from dimension.
### Change of basis rules (forward and backward transformations)
**Forward Transformation**
We can transform between old and new basis via:
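The original image is missing; in two dimensions the transformation presumably reads (using the standard convention of this series, where the columns of $F$ hold the new basis vectors expressed in the old basis):

```latex
\begin{aligned}
\vec{\tilde{e}}_1 &= F_{11}\,\vec{e}_1 + F_{21}\,\vec{e}_2 \\
\vec{\tilde{e}}_2 &= F_{12}\,\vec{e}_1 + F_{22}\,\vec{e}_2
\end{aligned}
```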

Note: in the video, the author mistakenly writes $F^T$ where $F$ is intended; the forward transformation uses $F$ itself.
**Backward Transformation**
We can transform the other direction via:
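Reconstructed in the same convention, with $B$ the backward-transform matrix:

```latex
\begin{aligned}
\vec{e}_1 &= B_{11}\,\vec{\tilde{e}}_1 + B_{21}\,\vec{\tilde{e}}_2 \\
\vec{e}_2 &= B_{12}\,\vec{\tilde{e}}_1 + B_{22}\,\vec{\tilde{e}}_2
\end{aligned}
```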

**Multiply F and B**
We see multiplying together simply yields the identity matrix:

In other words, our transformations are inverses.
**Generalize to n dimensions**
We start with $n$ old basis vectors, $\vec{e}_1, \dots, \vec{e}_n$, and we have $n$ new basis vectors $\vec{\tilde{e}}_1, \dots, \vec{\tilde{e}}_n$:

To avoid needing to write out all of these equations, we can come up with the general formula:


So, the final forward and backward transforms are:
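In the convention above, the general $n$-dimensional formulas are presumably:

```latex
\vec{\tilde{e}}_i = \sum_{j=1}^{n} F_{ji}\,\vec{e}_j,
\qquad
\vec{e}_i = \sum_{j=1}^{n} B_{ji}\,\vec{\tilde{e}}_j
```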

### Vectors
Vectors are our first example of a tensor.

Above, the list of numbers is actually the vector's *components*.


This third definition is very abstract; we are really just left with a set of rules.
How are the below two sets of components related to each other?

Recall the forward and backward transformations:

Maybe the forward transformation will take our vector from the old CS to the new CS?

Nope! That doesn't work! What if we try the backward transformation?
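In matrix form (reconstructing the missing computation; $v$ and $\tilde{v}$ are the component columns in the old and new coordinate systems):

```latex
\tilde{v} = B\,v,
\qquad\text{i.e.}\qquad
\tilde{v}_i = \sum_{j=1}^{n} B_{ij}\,v_j
```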


That does work! Why exactly does it work? We need to recall [Change of Basis Physics](Change%20of%20Basis%20Physics.md).
Our rules can be summarized as follows. To move from old components to new components:

And to move from new components to old components:
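Reconstructed, the two component-transformation rules are presumably:

```latex
\tilde{v}_i = \sum_{j=1}^{n} B_{ij}\,v_j,
\qquad
v_i = \sum_{j=1}^{n} F_{ij}\,\tilde{v}_j
```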

**Summary: How basis vectors transform (left) and how vectors transform (right)**

Because vector components behave *contrary* to the basis vectors, we say that vector components are **contravariant**. To remind ourselves that the components are contravariant, we place their indices as superscripts (while the basis indices remain subscripts):
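A quick numerical sanity check of contravariance (a sketch with hypothetical matrix values, not taken from the video):

```python
# 2x2 forward transform F (hypothetical example values): its columns are
# the new basis vectors written in the old basis.
F = [[2.0, 1.0],
     [0.0, 1.0]]

def inverse_2x2(m):
    """Backward transform B = F^{-1} for a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat_vec(m, v):
    """Matrix-vector product for 2x2 m and length-2 v."""
    return [m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1]]

B = inverse_2x2(F)

# A fixed geometric vector, written in the old basis.
v_old = [3.0, 4.0]

# Components transform with B (contravariant), opposite to the basis
# vectors, which transform with F:
v_new = mat_vec(B, v_old)

# Sanity check: rebuilding the vector from the new components and the
# new basis vectors (columns of F) recovers the old components.
rebuilt = mat_vec(F, v_new)
assert rebuilt == v_old
```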

### Covectors
A covector (row vector) is simply a linear function (see [here](https://www.youtube.com/watch?v=LNoQ_Q5JQMY&list=PLJHszsWbB6hrkmmq57lX8BV-o-YIOFsiG&index=7)).

We can visualize our covector as:

This is very similar to a topographic map. We have a location on the surface of the earth (a 2D coordinate pair: latitude and longitude), and we use contours to represent the height along a specific curve:

One useful way to visualize a covector acting on a vector is shown below:

We see that we only need to count the number of lines that $\vec{v}$ pierces. Why is this? Remember, each of these lines represents a *constant* result (the scalar output of the covector) for *any* input vector that falls on that line.
Worth keeping in mind: this contour plot represents a surface above the x-y plane. Any (x, y) pair maps to a particular scalar on that surface. The benefit of linearity is that the contours are straight, evenly spaced lines.
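A covector is just a linear function from vectors to scalars, so it is easy to sketch in code (the component values below are hypothetical, chosen only for illustration):

```python
def covector(a, b):
    """Return the linear function alpha(v) = a*v_x + b*v_y.
    The contour lines alpha(v) = const form the 'stack' in the picture."""
    def alpha(v):
        return a * v[0] + b * v[1]
    return alpha

alpha = covector(1.0, 2.0)   # hypothetical components
v = [3.0, 1.0]
w = [1.0, 1.0]

# Linearity: the two defining rules of a linear function.
assert alpha([v[0] + w[0], v[1] + w[1]]) == alpha(v) + alpha(w)
assert alpha([2 * v[0], 2 * v[1]]) == 2 * alpha(v)

# alpha(v) = 5.0: geometrically, v "pierces" five unit-spaced
# contour lines of alpha.
print(alpha(v))
```

Scaling the covector (making the stack denser or sparser) just scales `a` and `b`, which is why the contour-density picture works.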

Now if we wanted to increase the size of a covector, we would want to make our stack (contours) denser:

And to decrease the size of our covector:

We can think about adding covectors as follows:

Okay, so we have shown that covectors have sensible scaling and adding rules. That means we have a [vector space](Abstract%20Vector%20Spaces.md). Given an ordinary vector space $V$ with its scaling and adding rules, the set of all covectors that act on $V$ forms a new vector space called the **dual vector space**, written $V^{*}$. It has its own set of adding and scaling rules:
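The missing rules are presumably the pointwise definitions of addition and scaling in $V^{*}$:

```latex
(\alpha + \beta)(\vec{v}) = \alpha(\vec{v}) + \beta(\vec{v}),
\qquad
(c\,\alpha)(\vec{v}) = c\,\alpha(\vec{v})
```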

In summary:

We can also summarize and state:
* **Covectors** are **invariant** (they are purely geometric objects and they do not depend on a coordinate system)
* **Covector components** are **not invariant** (a covector will be represented by different row vectors with different components depending on which coordinate system we are using)
We know that a column vector represents a vector's components in a given basis. Now we may want to ask: what exactly do row vectors represent? Do they represent covector components? Yes!


So we can see that our $\epsilon