Latent Positional Information is in the Self-Attention Variance of Transformer Language Models Without Positional Embeddings. arXiv:2305.13571, published May 23, 2023.