"All You Need" is Not All You Need for a Paper Title: On the Origins of a Scientific Meme
Abstract
The 2017 paper "Attention Is All You Need" introduced the Transformer architecture, and inadvertently spawned one of machine learning's most persistent naming conventions. We analyze 717 arXiv preprints containing "All You Need" in their titles (2009-2025), finding exponential growth (R² > 0.994) following the original paper, with 200 titles in 2025 alone. Among papers following the canonical "X [Is] All You Need" structure, "Attention" remains the most frequently claimed necessity (28 occurrences). Situating this phenomenon within memetic theory, we argue the pattern's success reflects competitive pressures in scientific communication that increasingly favor memorability over precision. Whether this trend represents harmless academic whimsy or symptomatic sensationalism, we leave, with appropriate self-awareness, to the reader.
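For readers curious how a fit like the reported R² > 0.994 is typically computed, here is a minimal sketch of log-linear regression on yearly title counts. The counts below are made up for illustration; the paper's actual per-year data are not reproduced here.

```python
import numpy as np

# Hypothetical yearly counts of "All You Need" titles (illustrative only;
# not the paper's data).
years = np.array([2017, 2018, 2019, 2020, 2021, 2022, 2023, 2024, 2025])
counts = np.array([5, 11, 20, 36, 60, 95, 140, 180, 200])

# Fit an exponential model counts ≈ a * exp(b * year) by ordinary
# least squares on log(counts), i.e. log-linear regression.
log_counts = np.log(counts)
b, log_a = np.polyfit(years, log_counts, 1)

# Coefficient of determination (R²) of the fit in log space.
pred = b * years + log_a
ss_res = np.sum((log_counts - pred) ** 2)
ss_tot = np.sum((log_counts - log_counts.mean()) ** 2)
r2 = 1 - ss_res / ss_tot
print(f"growth rate b = {b:.3f} per year, R² = {r2:.4f}")
```

An R² near 1 under this model indicates the yearly counts are well described by a constant multiplicative growth rate, which is what the abstract's "exponential growth" claim asserts.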