# Tensors, Math Approach
We can define tensors simply as:
> A **tensor** is any **multilinear map** from a **vector space** to a **scalar field**.
This definition as a multilinear map is another reason people think tensors are generalizations of matrices: matrices are linear maps, just like tensors are multilinear maps. But the distinction is that a matrix takes a vector space to another vector space, while a tensor takes a vector space to a scalar field. So a matrix is not, strictly speaking, a tensor.
### Example - The [Dot Product](Dot%20Product.md)
The dot product itself is an example of a tensor. It is a [Bilinear map](Bilinear%20map.md) (a type of multilinear map) from a vector space to a scalar field (the real numbers).
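To make the bilinearity concrete, here is a quick numerical sketch (using NumPy purely for illustration) checking that the dot product is linear in each argument when the other is held fixed:

```python
import numpy as np

# Bilinearity check: the dot product is linear in each slot separately.
rng = np.random.default_rng(0)
x, y, z = rng.standard_normal((3, 4))  # three random 4-dimensional vectors
a, b = 2.0, -3.0                       # arbitrary scalars

# Linear in the first argument: (a*x + b*z) . y == a*(x . y) + b*(z . y)
assert np.isclose(np.dot(a * x + b * z, y),
                  a * np.dot(x, y) + b * np.dot(z, y))

# Linear in the second argument: x . (a*y + b*z) == a*(x . y) + b*(x . z)
assert np.isclose(np.dot(x, a * y + b * z),
                  a * np.dot(x, y) + b * np.dot(x, z))
```

Two vector inputs, one scalar output, linear in each input: that is exactly what "bilinear map from a vector space to a scalar field" means.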
There are two things that dot products and matrices have in common. The first, and most important, is that **they are both n-linear maps**. *This* is why tensors are *almost* generalizations of matrices. The second, and more misleading, is that both can be represented as a 2D array of numbers. This second point is a huge, and I mean HUGE, red herring, and has undoubtedly confused countless people.
### TLDR
A few key points to remember:
1. **Tensors are not generalizations or formalizations of vectors or matrices.**
   Oh sorry, you didn’t hear? I’ll say it again. Come closer. No no, closer. TENSORS ARE NOT GENERALIZATIONS OR FORMALIZATIONS OF VECTORS OR MATRICES.
2. Some tensors can be represented as 2D arrays, but these 2D arrays do not necessarily work anything like matrices. The numerical values in a matrix’s representation mean entirely different things than the numerical values in a tensor’s representation.
3. Vectors can be Nicolas Cage DVDs, cats, or strands of Donald Trump’s toupee if you choose addition and scalar multiplication the right way.
4. Vectors are generalizations of tensors (multilinear maps can be added and scaled, so tensors themselves form a vector space).
5. The fundamental definition of the dot product of two vectors $x$ and $y$ is not $x_1 y_1 + x_2 y_2$, it is $\|x\| \|y\| \cos(\theta)$. The former is just a convenient computational shortcut when working in Cartesian coordinates.
6. Covariant and contravariant tensors don’t exist. Physicists don’t know this because they have only watched the anime, they haven’t read the manga.
7. The definition of a tensor: a tensor is any multilinear map from a vector space to a scalar field.
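Point 5 can be sanity-checked numerically. The sketch below (NumPy again, just for illustration) computes the angle between two vectors independently, so the check is not circular, and confirms the geometric definition matches the Cartesian shortcut:

```python
import numpy as np

x = np.array([3.0, 4.0])
y = np.array([2.0, 1.0])

# Cartesian shortcut: x1*y1 + x2*y2 = 3*2 + 4*1 = 10
component_sum = x[0] * y[0] + x[1] * y[1]

# Geometric definition: ||x|| ||y|| cos(theta), with theta measured
# directly from the vectors' polar angles (no dot product involved).
theta = np.arctan2(y[1], y[0]) - np.arctan2(x[1], x[0])
geometric = np.linalg.norm(x) * np.linalg.norm(y) * np.cos(theta)

assert np.isclose(component_sum, geometric)  # both equal 10.0
```

Same number either way; the component formula is just what the geometric definition collapses to in Cartesian coordinates.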
### In Machine Learning
TLDR: in machine learning, "tensor" is really just used to refer to a multidimensional array. See more [here](https://stats.stackexchange.com/questions/198061/why-the-sudden-fascination-with-tensors/198127). The term is not used in the pure mathematics sense.
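For example, in NumPy (and the same idea holds for PyTorch or TensorFlow tensors), a "tensor" is nothing more than an n-dimensional array of numbers; no multilinear map is implied:

```python
import numpy as np

# An ML "rank-3 tensor" is simply a 3-dimensional array,
# e.g. a (batch, height, width) block of numbers.
t = np.arange(24).reshape(2, 3, 4)

print(t.ndim)   # 3 -- what ML folks call the tensor's "rank" or "order"
print(t.shape)  # (2, 3, 4)
```

Note that the ML "rank" (number of axes) has nothing to do with matrix rank, which is yet another source of confusion.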
---
Date: 20211214
Links to: [Linear Transformations](Linear%20Transformations.md) [Bilinear map](Bilinear%20map.md)
Tags:
References:
* [Read and reread this quora answer](https://www.quora.com/What-is-a-tensor/answer/William-Oliver-1?ch=10&share=764c7f16&srid=srcU)