Various tools and units of measurement have served ancient and modern civilizations alike in helping to build, travel, and develop land. The earliest measurements were often based on what was directly available to people: namely, body parts and nature. But discrepancies arose from this approach. One person's hand might be larger than another's, so the measurements would vary. As more precision was needed, these units were standardized, yet many of the words we use for distance and area stayed the same.
Using the Body as a Ruler
The human body was an easily accessible measuring tool, but over time, some standard measurements were added to the definitions.
- “Foot” was the length of the average man’s foot, standardized at 12 inches (originally about 11.42 inches).
- “Inch” was the width of a man’s thumb.
- “Span” was the length of a spread hand, roughly 9 inches.
- “Yard” was the width of a man’s waistline. In the 12th century, King Henry I determined that a yard was the distance from his nose to his thumb when his arm was outstretched.
- “Handbreadth” is the length of the average hand, generally accepted to be 4 inches.
- “Pace” is the length of a single step, while a “double pace” covers one step with each foot.
The “cubit” was a unit of measurement used by many ancient civilizations, originating in ancient Egypt around 3000 BCE. It, too, was based on the human body, with 1 cubit equaling the distance from the tip of the elbow to the tip of the middle finger (roughly 2 spans, or 18 inches). The first known measurement tool was the ancient Egyptian royal cubit, a rod marked off in 7 palms, each divided into 4 fingers.
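The cubit relationships above reduce to simple arithmetic; here is a minimal sketch in Python, using only the values stated in the text (the function name is illustrative, not a standard):

```python
# Cubit arithmetic from the text: 1 cubit = 2 spans, 1 span = 9 inches,
# and the Egyptian royal cubit rod marked 7 palms of 4 fingers each.
SPAN_INCHES = 9   # a spread hand, roughly 9 inches
CUBIT_SPANS = 2   # elbow to middle fingertip

def cubits_to_inches(cubits):
    """Convert cubits to inches using the common 18-inch cubit."""
    return cubits * CUBIT_SPANS * SPAN_INCHES

# The royal cubit rod: 7 palms x 4 fingers = 28 finger divisions.
ROYAL_CUBIT_FINGERS = 7 * 4

print(cubits_to_inches(1))   # 18 inches
print(ROYAL_CUBIT_FINGERS)   # 28 fingers
```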
Fun fact: The Bible includes cubits, known as biblical cubits, measured by handbreadths, which have largely been translated into modern measurements.
Measuring From Farm to Table
Outside the United States, the metric system is now widely used, but before its spread, Britain formalized the imperial system through the Weights and Measures Act of 1824. It replaced the Winchester system, which had been in use since about the 15th century. Additional legislation refined the imperial system, and it’s still used in the U.K. (alongside the metric system) and in some countries that were once part of the British Empire.
Imperial measurements were based on nature, everyday activities, and, like ancient measurement systems, on the human body. As agriculture expanded in England, larger measurements were needed.
The imperial system established standard terms and units of measurement for length and area in particular.
- 1 thou = 1 thousandth of an inch
- 1 barleycorn = ⅓ of an inch
- 1 chain = 66 feet
- 1 furlong = 10 chains
- 1 league = 3 miles
- 1 perch = 272.25 square feet
- 1 rood = 40 perches
- 1 acre = 4 roods
- 1 square mile = 640 acres, or 27,878,400 square feet
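The chain of units above can be verified with a few lines of arithmetic. A quick sketch in Python, noting that the 16.5-foot linear perch (rod) and the 8-furlong mile are standard imperial values not listed above:

```python
# Imperial length units from the list above.
CHAIN_FEET = 66
FURLONG_FEET = 10 * CHAIN_FEET    # 660 feet
MILE_FEET = 8 * FURLONG_FEET      # 5,280 feet (a mile is 8 furlongs)

# Area units: a square perch is a 16.5-foot rod squared.
PERCH_FEET = 16.5
PERCH_SQFT = PERCH_FEET ** 2      # 272.25 square feet
ROOD_SQFT = 40 * PERCH_SQFT       # 10,890 square feet
ACRE_SQFT = 4 * ROOD_SQFT         # 43,560 square feet
SQMILE_SQFT = 640 * ACRE_SQFT     # 27,878,400 square feet

print(MILE_FEET, ACRE_SQFT, SQMILE_SQFT)
```

Every figure in the list falls out of the two base values: the 66-foot chain and the 16.5-foot perch (a chain is exactly 4 perches).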
How Long Is a Mile?
Before the metric system was adopted in most European countries, the mile went by similar names across many languages. The Old English mil became the English “mile” and was echoed in the Old Norse mila. The Germanic root milja produced the Dutch mijl, the Middle Dutch mile, the German meile, and the Old High German mila. Romance languages drew on the Latin milia, which became mille in French, miglio in Italian, and milla in Spanish.
While the words may have been (almost) the same, the actual distance was not standardized. In ancient Rome, a mile equaled 1,000 double paces, roughly 4,860 feet. Old European interpretations of the mile varied, with a medieval English mile measuring 6,610 feet and the Old London mile measuring 5,000 feet. The Latin word milia was also applied to the Germanic rasta, which ranged from 3.25 to 6 English miles. In England, a statute under Queen Elizabeth I set the mile at 320 perches, or 5,280 feet.
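The Elizabethan figure checks out arithmetically, since a perch is 16.5 feet (a standard value implied but not stated above); a short sketch:

```python
# Statute mile under Elizabeth I: 320 perches of 16.5 feet each.
PERCH_FEET = 16.5
STATUTE_MILE_FEET = 320 * PERCH_FEET   # 5,280 feet

# Roman mile: 1,000 double paces totaling roughly 4,860 feet,
# implying a double pace of about 4.86 feet.
ROMAN_MILE_FEET = 4860
DOUBLE_PACE_FEET = ROMAN_MILE_FEET / 1000

print(STATUTE_MILE_FEET, DOUBLE_PACE_FEET)
```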
Fun fact: In Middle English, a mile was also a measurement of time of about 20 minutes, which was how long it took to walk a mile.