Here's something that might bother you: the number 1 is not prime. It seems like it should be. It's only divisible by 1 and itself. That's the rule, right? Divisible only by 1 and itself?
Well, yes. Except when "1" and "itself" are the same number. Then things get weird.
For a long time, mathematicians did call 1 prime. Legendre included it. So did many others through the 19th century. It wasn't some obvious oversight — serious people looked at 1 and said, "Yeah, that counts." The definition seemed to permit it.
So what changed? The Fundamental Theorem of Arithmetic. This is the theorem that says every integer greater than 1 can be expressed as a product of primes in exactly one way. The number 12 is 2 × 2 × 3. The number 45 is 3 × 3 × 5. There's only one way to break each number down into its prime ingredients, if you ignore the order. One recipe per number. That's it.
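If you'd rather see that than take it on faith, here's a small Python sketch (the name `prime_factors` is mine, not anything standard) that factors a number by trial division. Run it on 12 and 45 and you recover exactly the recipes above, and only those:

```python
def prime_factors(n):
    """Factor an integer n > 1 into primes, smallest first, by trial division."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:   # divide out each prime as many times as it appears
            factors.append(d)
            n //= d
        d += 1
    if n > 1:               # whatever remains at the end is itself prime
        factors.append(n)
    return factors

print(prime_factors(12))  # [2, 2, 3]
print(prime_factors(45))  # [3, 3, 5]
```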
This theorem is gorgeous. It's the reason prime numbers are called the "atoms" of arithmetic. But watch what happens if you let 1 be prime.
Suddenly 12 = 2 × 2 × 3. But also 12 = 1 × 2 × 2 × 3. And 12 = 1 × 1 × 2 × 2 × 3. And 12 = 1 × 1 × 1 × 1 × 1 × 1 × 2 × 2 × 3. You can sprinkle in as many 1s as you want. Unique factorization — that beautiful, clean, one-recipe-per-number idea — shatters into infinity.
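You can verify the damage mechanically. Every list below multiplies back to 12, so if 1 counted as prime, each would be a perfectly legal "prime factorization":

```python
from math import prod

factorizations = [
    [2, 2, 3],
    [1, 2, 2, 3],
    [1, 1, 2, 2, 3],
    [1, 1, 1, 1, 1, 1, 2, 2, 3],
]
# All distinct as lists, all equal to 12 as products. Uniqueness is gone.
assert all(prod(f) == 12 for f in factorizations)
print(len(factorizations), "different 'factorizations' of 12, with no end in sight")
```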
Now, you could save the theorem by rewriting it: "every integer greater than 1 has a unique prime factorization, ignoring any factors of 1." But mathematicians noticed they were writing that exception everywhere. In theorem after theorem, proof after proof, qualifiers like "for all primes greater than 1" kept appearing like an uninvited guest. It was clutter. It was noise.
So they made a choice. Instead of patching every theorem with an asterisk, they simply refined the definition. A prime number is a natural number greater than 1 that has no positive divisors other than 1 and itself. Two conditions now. The "greater than 1" part isn't arbitrary — it's load-bearing.
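The refined definition translates into code almost word for word. A minimal sketch (the helper name `is_prime` is mine), with the greater-than-1 check sitting exactly where the definition puts it:

```python
def is_prime(n):
    """A natural number is prime iff it is greater than 1 and has
    no positive divisors other than 1 and itself."""
    if n <= 1:   # the load-bearing condition: 1 (and everything below) is out
        return False
    return all(n % d != 0 for d in range(2, int(n**0.5) + 1))

print([k for k in range(1, 20) if is_prime(k)])  # [2, 3, 5, 7, 11, 13, 17, 19]
```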
This happens more often in math than you'd think. Definitions aren't handed down from the universe on stone tablets. They're tools, and mathematicians sharpen them over time to make the machinery run cleanly. The number 1 wasn't "wrong" — it was just making everything harder for no good reason.
There's a term for what 1 is: a unit. In number theory, a unit is a number that has a multiplicative inverse within the integers, meaning you can multiply it by another integer and get 1. In the full integers that means 1 and −1; among the positive integers, only 1 qualifies. It's not prime, and it's not composite. It occupies its own category entirely.
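The trichotomy is easy to make concrete. Here's a tiny sketch (the `classify` helper is hypothetical, just for illustration) that sorts positive integers into the three bins:

```python
def classify(n):
    """Sort a positive integer into unit / prime / composite."""
    if n == 1:
        return "unit"   # the only positive integer with an inverse: 1 * 1 == 1
    if all(n % d != 0 for d in range(2, int(n**0.5) + 1)):
        return "prime"
    return "composite"

for n in range(1, 8):
    print(n, classify(n))
# 1 unit, 2 prime, 3 prime, 4 composite, 5 prime, 6 composite, 7 prime
```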
So the loneliest number really is 1. Not because of the old song, but because mathematics itself looked at every other positive integer, sorted them into "prime" or "composite," and left 1 standing by itself. A category of one, for the number that is one.
Which raises a question worth sitting with: how many other things do we define a certain way not because the universe demands it, but because the alternative would be exhaustingly inconvenient?