Gluten intolerance

Getting to grips with gluten: the modern immune system on the fritz?

The human default response to gluten is to treat it as the harmless protein it is — to not respond. So the real mystery of coeliac disease is what breaks that tolerance, and whatever that agent is, why has it become more common in recent decades? This article investigates…

As many as one in three Americans tries to avoid gluten, a protein found in wheat, barley and rye. Gluten-free menus, gluten-free labels and gluten-free guests at summer dinners have proliferated.

Some of the anti-glutenists argue that we haven’t eaten wheat for long enough to adapt to it as a species. Agriculture began just 12,000 years ago, not enough time for our bodies, which evolved over millions of years, primarily in Africa, to adjust. According to this theory, we’re intrinsically hunter-gatherers, not bread-eaters. If exposed to gluten, some of us will develop coeliac disease or gluten intolerance, or we’ll simply feel lousy.

Most of these assertions, however, are contradicted by significant evidence, and distract us from our actual problem: an immune system that has become overly sensitive.

Wheat was first domesticated in southeastern Anatolia perhaps 11,000 years ago. (An archaeological site in Israel, called Ohalo II, indicates that people have eaten wild grains, like barley and wheat, for much longer — about 23,000 years.)

Is this enough time to adapt? To answer that question, consider how some populations have adapted to milk consumption. We can digest lactose, a sugar in milk, as infants, but many of us stop producing the enzyme that breaks it down, called lactase, in adulthood. For these "lactose intolerant" people, drinking milk can cause bloating and diarrhea. To cope, milk-drinking populations have evolved a trait called "lactase persistence": the lactase gene stays active into adulthood, allowing them to digest milk.

Milk-producing animals were first domesticated at about the same time as wheat in the Middle East. As the custom of dairying spread, so did lactase persistence. What surprises scientists today, though, is just how recently, and how completely, that trait has spread in some populations. Few Scandinavian hunter-gatherers living 5,400 years ago had lactase persistence genes, for example. Today, most Scandinavians do.

Here’s the lesson: adaptation to a new foodstuff can occur quickly, within a few millenniums in this case. So if it happened with milk, why not with wheat?

“If eating wheat was so bad for us, it’s hard to imagine that populations that ate it would have tolerated it for 10,000 years,” Sarah A Tishkoff, a geneticist at the University of Pennsylvania who studies lactase persistence, told me.

For Dr Bana Jabri, director of research at the University of Chicago Coeliac Disease Center, it’s the genetics of coeliac disease that contradict the argument that wheat is intrinsically toxic.

Active coeliac disease can cause severe health problems, from stunting and osteoporosis to miscarriage. It strikes a relatively small number of people, just around 1 percent of the population. Yet given the significant costs to fitness, you’d expect natural selection to gradually remove coeliac-associated genes from the gene pool of wheat-eating populations.

A few years ago, Jabri and the population geneticist Luis B Barreiro tested that assumption and discovered precisely the opposite. Not only were coeliac-associated genes abundant in the Middle Eastern populations whose ancestors first domesticated wheat; some coeliac-linked variants showed evidence of having spread in recent millenniums.

People who had them, in other words, had some advantage compared with those who didn’t.

Barreiro, who’s at the University of Montreal, has observed this pattern in many genes associated with autoimmune disorders…

New York Times: Read the full article