# Embeddings

When our target space is a subset of the real line, $X \subseteq \mathbb{R}$, there is a natural **embedding** of $X$ into $\mathbb{R}$: the inclusion map $\iota: X \hookrightarrow \mathbb{R}$, $\iota(x) = x$.

![](Screen%20Shot%202022-05-22%20at%206.52.57%20PM.png)
![](Screen%20Shot%202022-05-22%20at%206.53.34%20PM.png)
![](Screen%20Shot%202022-05-22%20at%206.53.44%20PM.png)

### Another Intuition

An embedding is a map that sends each of $N$ objects to a vector $x \in \mathbb{R}^d$, usually restricted to the unit sphere. The objects might be words, sentences, nodes in a graph, products at Amazon, or elements of any other well-defined collection. The key property is that our notion of **similarity** between objects corresponds to the **distance** between their embeddings (see the sketch at the end of this note). See more [here](https://randorithms.com/2020/11/17/Adding-Embeddings.html).

---
Date: 20220522
Links to: [Mathematics MOC](Mathematics%20MOC.md) [Projection vs Embedding](Projection%20vs%20Embedding.md)
Tags: #review

References:
* Notability: Probability theory, 2.2.1 (Betancourt)
* [Why is it Okay to Average Embeddings? - Randorithms](https://randorithms.com/2020/11/17/Adding-Embeddings.html)
* [Shape Analysis (Lecture 10): Metric spaces and embeddings - YouTube](https://www.youtube.com/watch?v=gxOohqi2l0g&list=PLQ3UicqQtfNtUcdTMLgKSTTOiEsCw2VBW&index=16&t=1207s)
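
---

A minimal sketch of the "similarity ↔ distance" intuition, assuming a hypothetical toy vocabulary with hand-picked 3-dimensional vectors (not taken from any real model): each object is mapped to a unit vector, and more similar objects end up at smaller Euclidean distance / higher cosine similarity.

```python
import numpy as np

# Hypothetical toy embedding table: each object maps to a vector in R^3.
# The vectors are hand-picked for illustration, not from a trained model.
embeddings = {
    "cat":   np.array([0.9, 0.4, 0.1]),
    "tiger": np.array([0.8, 0.5, 0.2]),
    "car":   np.array([0.1, 0.2, 0.95]),
}

# Restrict to the unit sphere by normalizing each vector to length 1.
embeddings = {word: v / np.linalg.norm(v) for word, v in embeddings.items()}

def distance(a, b):
    """Euclidean distance between the embeddings of two objects."""
    return float(np.linalg.norm(embeddings[a] - embeddings[b]))

def cosine_similarity(a, b):
    """Cosine similarity; on the unit sphere this is just the dot product."""
    return float(embeddings[a] @ embeddings[b])

# Similar objects should be close; dissimilar ones far apart.
print(distance("cat", "tiger"), cosine_similarity("cat", "tiger"))  # small distance, high similarity
print(distance("cat", "car"),   cosine_similarity("cat", "car"))    # large distance, low similarity
```

On the unit sphere, Euclidean distance and cosine similarity are monotonically related ($\|u - v\|^2 = 2 - 2\,u \cdot v$), so either one can play the role of "distance between object embeddings" above.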