A few days ago, we tweeted this:
ELEVEN literally means "one left above ten".
— HaggardHawks Words (@HaggardHawks) March 20, 2015
Which led to this:
@HaggardHawks I'd love to see the evolution of that if you have it. I've always wondered why we use eleven instead of "one-teen" or something!
— Brian Conor (@iambriam) March 20, 2015
It’s a good question—why do we say eleven and twelve, but then thirteen and fourteen? Why not oneteen and twoteen? Or threelve and fourlve?
Unsurprisingly, the –teen suffix is a derivative of ten. Thirteen is literally “three and ten”, fourteen is “four and ten”, and so on. It’s a fairly ancient formation: thirteen was þreotene right back in Old English, a straightforward compound of þreo, “three”, and tene, a form of “ten”. The same goes for fourteen (derived from Old English feowertyne) and fifteen (Old English fiftene), all the way up to twenty, which was twentig, literally “two groups of ten”.
But eleven was enleofan in Old English, which took its initial en– from the Old English word for “one”, ane. Twelve, likewise, was twelf, with its initial twe– taken from the Old English “two”, twa. The remaining –leofan and –elf parts have nothing to do with the “teen” suffix, but instead represent hangovers from some ancient, pre-Old English word, probably meaning “to leave over” or “to omit”. So eleven was literally “one left over” after you’d counted up to ten, and twelve was literally “two left over” after ten.
But why were eleven and twelve given different names from all the other teens? Why weren’t they just ane-tene and twa-tene?
The problem is that we’re now hardwired to think of our numbers decimally, in 10s, 100s and 1000s. There’s a good reason for doing so, of course, because 10 is such an easy number to work with. You can count to 10 using your fingers (which is called dactylonomy, by the way), and calculations involving 10 are effortlessly simple. 79 multiplied by 10, you say? 790. Easy. But this decimal way of thinking only came to dominate relatively recently, helped along by the spread of decimal numerals in the Middle Ages and, much later, by the metric system. Historically, many of our numbering and measuring systems were based around 12, not 10, which is why there are twelve inches in a foot, and two sets of twelve hours in a day.
It’s a much more complicated number to deal with arithmetically, of course (79 multiplied by 12? Give me a minute...), but there’s a very practical reason for counting in terms of 12 rather than 10: 12 is a much more mathematically productive number.
A set of 10, for instance, can only be split equally into two sets of five, or five sets of two. A set of 12, however, can be split into equal groups of 2, 3, 4 or 6. Likewise a set of 20 can only be divided into 2, 4, 5 or 10, but a set of 24 can be divided into 2, 3, 4, 6, 8 or 12. And even 100 has barely half as many factors (2, 4, 5, 10, 20, 25, 50) as 144 (2, 3, 4, 6, 8, 9, 12, 16, 18, 24, 36, 48, 72).
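If you’d rather not count divisors by hand, here is a minimal Python sketch (not part of the original post; the function name proper_divisors is purely illustrative) that lists the ways each of the numbers above can be split into equal groups:

```python
def proper_divisors(n):
    """Return the divisors of n other than 1 and n itself."""
    return [d for d in range(2, n) if n % d == 0]

# Compare how many ways each number can be split into equal groups.
for n in (10, 12, 20, 24, 100, 144):
    divs = proper_divisors(n)
    print(f"{n}: {len(divs)} ways -> {divs}")
```

Run as written, it confirms the tallies quoted above: 12 and 144 each allow roughly twice as many even splits as 10 and 100.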
12 homemade cookies. Soon to be 0 homemade cookies.
The fact that 12 could be so conveniently divided in so many different ways made it particularly useful, in everyday terms, for dealing with fractions, proportions, allocations and measurements. It even gave our language separate words for a set of twelve (dozen) and a set of twelve twelves (gross), and led many ancient counting and measuring systems to be based on 12 rather than 10. Ultimately twelve, and with it eleven, earned names distinct from all the numbers above them, and it’s only our modern, decimal-based perspective that makes this seem strange.
Oh—948! Got there eventually...
Thank you oneteen times for this very useful explanation.