Three interesting food numbers
A dietary recommendation for servicemen in the 1940s led to modern-day nutrition labels, and the five-second rule may not be a safe bet when it comes to food on the floor…

1. 2.3 milligrams of B1: The recommendation that won a war
Food nutrition labels were originally designed to do a lot more than make you feel guilty about eating Cheetos. The dietary recommendations were created in the 1940s to help America accomplish one of the most important missions in its history – defeating Hitler.
On the brink of entering World War II, US military leaders discovered an unexpected problem. Soldiers weren’t only hungry for victory; they were just plain hungry!
After screening some one million young men for potential service in the armed forces, the Selective Service discovered that about one in seven candidates suffered from “disabilities directly or indirectly connected with nutrition”.
The recruits were unfit for duty, and the nation needed a way to turn these malnourished men into Axis-pummeling Captain Americas.
The administration pounced on the problem. President Franklin Roosevelt gathered a committee of nutrition experts to create a practical diet that would keep Americans in shape – both at home and while fighting abroad.
Within months, the committee released its “Recommended Dietary Allowances” for each nutrient. For example, a “very active” man would need 2.3 mg of vitamin B1 per day, while a “very active” woman would need about 1.8 mg.
The system worked, and today, the recommendations have morphed into the nutrition labels now standard on packaged foods. Every few years, the numbers are revised and expanded to reflect new developments in nutrition science, and they’ve picked up the snazzy name “Dietary Reference Intakes.”
But don’t be fooled by the new name. At their core, they’re still the same recommendations that helped a nutrient-starved nation defeat the Nazis.
2. 100 proof: The measurement that gets you drunk
Proof labels on alcohol bottles were born from the needs of sailors, who wanted assurances about the quality of their booze at sea. Beginning in 1731, members of the British Royal Navy were given an alcohol ration of half a pint of rum per day. (That practice continued, albeit with reduced quantities, until 1970.)
The men loved their rum, but they often became suspicious that their superiors were watering down the goods. To test the rum’s potency, sailors would douse a small pile of gunpowder with the liquor and attempt to set it on fire.
If the powder lit instantly, the sailors took it as “proof” that the rum was strong enough. But if the powder fizzled, the booze was deemed unfit to drink. Because spirits need to be at least 57.06 percent alcohol to combust, that threshold became known as “100 degrees proof.”
The British system eventually made it across the pond, where Americans simplified the idea by redefining “proof” as twice the percentage of alcohol by volume.
Sure, it’s not as visually impressive as the sailors’ method, but it beats having to take a handful of gunpowder into a bar with you.
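The American definition really is just arithmetic: proof equals double the ABV percentage. As a minimal sketch (the `us_proof` helper name is ours, purely illustrative):

```python
def us_proof(abv_percent: float) -> float:
    """US proof is simply twice the alcohol by volume (ABV) percentage."""
    return 2 * abv_percent

print(us_proof(50))     # a 50% ABV spirit is 100 proof
print(us_proof(57.06))  # the old British "100 degrees proof" threshold, roughly 114 US proof
```

Note that under the American system a spirit can exceed 100 proof without any gunpowder theatrics: pure alcohol would simply be 200 proof.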
3. Five seconds: The rule that can make you sick
At some time or another, with or without witnesses present, we’ve all used the five-second rule to justify eating a cookie that’s touched the floor. After all, everyone knows that if a tasty treat spends less than five seconds on the ground, it doesn’t collect germs.
Well, not exactly. In 2003, high school student Jillian Clarke performed the first known scientific tests on the five-second rule. While interning at the food science laboratory at the University of Illinois at Urbana-Champaign, Clarke tested the theory by placing gummy bears and cookies on ceramic tiles contaminated with E. coli.
Her results revealed bad news for clumsy snackers: The munchies picked up the bacteria within the five-second window. Clarke’s quirky experiment inspired other food researchers to further investigate the matter.
One such scientist, Dr Paul Dawson of Clemson University, showed that food actually follows a “zero-second rule”, meaning that bacteria such as salmonella transfer onto food instantly upon contact.
Thankfully, the news isn’t as dire as it sounds. In a follow-up set of experiments, Clarke tested the bacteria levels of the university’s floors.
Her team found very little contamination, even in the most highly trafficked areas of campus. As it turns out, most floors at the University of Illinois are so clean you can eat off of them.
Source: CNN; mentalfloss.com