# Encoder Bottleneck

AK outlines this nicely [here](https://youtu.be/XfpMkf4rD6E?t=991). In short, the encoder bottleneck is that in sequence-to-sequence learning (e.g. translation), we end up packing the entire English sentence we are trying to condition on into a **single vector** that is passed from the encoder to the decoder. No matter how long the source sentence is, everything the decoder gets to see about it has to squeeze through that one fixed-size vector (a minimal sketch of this handoff is at the end of this note).

---
Date: 20230601
Links to:
Tags:
References:
* []()
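
A minimal sketch of the bottleneck in code, assuming a simple GRU encoder/decoder (the names, sizes, and GRU choice are my own illustration, not taken from AK's video): the encoder reads the whole source sentence but hands the decoder only its final hidden state.

```python
# Sketch (assumed architecture): the entire source sentence gets compressed
# into one context vector, which is all the decoder is conditioned on.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)

    def forward(self, src_tokens):                 # (batch, src_len)
        _, h_n = self.gru(self.embed(src_tokens))  # h_n: (1, batch, hidden)
        return h_n                                 # the *single vector* per sentence

class Decoder(nn.Module):
    def __init__(self, vocab_size, hidden_size):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.gru = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, tgt_tokens, context):        # context: (1, batch, hidden)
        # Everything the decoder knows about the source sentence
        # has to fit inside `context` -- that is the bottleneck.
        output, _ = self.gru(self.embed(tgt_tokens), context)
        return self.out(output)                    # (batch, tgt_len, vocab)

# Usage: the context vector has the same size whether the source is 5 or 500 tokens.
enc = Encoder(vocab_size=1000, hidden_size=256)
dec = Decoder(vocab_size=1000, hidden_size=256)
src = torch.randint(0, 1000, (2, 17))              # two 17-token source sentences
tgt = torch.randint(0, 1000, (2, 9))               # two 9-token target prefixes
context = enc(src)                                 # shape (1, 2, 256), regardless of src length
logits = dec(tgt, context)                         # shape (2, 9, 1000)
```

This fixed-size handoff is exactly what attention later removes, by letting the decoder look back at every encoder state instead of just the last one.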