Showing posts with label diseases of civilization.

Atherosclerosis in Ancient Mummies Revisited

Many of you are already aware of the recent study that examined atherosclerosis in 137 ancient mummies from four different cultures (1).  Investigators used computed tomography (CT; a form of X-ray) to examine artery calcification in mummies from ancient Egypt and Peru, from ancestral Puebloans, and from Arctic Unangan hunter-gatherers.  Artery calcification is the accumulation of calcium in the vessel wall, and it is a marker of severe atherosclerosis.  Where there is calcification, the artery wall is thickened and extensively damaged.  Not surprisingly, this is a risk factor for heart attack.  Pockets of calcification are typical as people age.

I'm not going to re-hash the paper in detail because that has been done elsewhere.  However, I do want to make a few key points about the study and its interpretation.  First, all groups had atherosclerosis to a similar degree, and it increased with advancing age.  This suggests that atherosclerosis may be part of the human condition, and not a modern disease.  Although it's interesting to have this confirmed in ancient mummies, we already knew this from cardiac autopsy data in a variety of non-industrial cultures (2, 3, 4, 5).

Lessons From Ötzi, the Tyrolean Ice Man. Part II

Ötzi's Diet

Ötzi's digestive tract contains the remains of three meals.  They were composed of cooked grains (wheat bread and wheat grains), meat, roots, fruit and seeds (1, 2).  The meat came from three different animals-- chamois, red deer and ibex.  The "wheat" was actually not what we would think of as modern wheat, but an ancestral variety called einkorn.

Isotope analysis indicates that Ötzi's habitual diet was centered on plant foods, likely heavily dependent on grains but also incorporating a variety of other plants (3).  He died in the spring with a belly full of einkorn wheat.  Since wheat is harvested in the fall, this suggests that his culture stored grain and was dependent on it for most if not all of the year.  However, he also clearly ate meat and used leather made from his prey.  Researchers are still debating the quantity of meat in his diet, but it was probably secondary to grains and other plant foods.  It isn't known whether or not he consumed dairy.


Paleo Diet Article in Sound Consumer

I recently wrote an article for my local natural foods grocery store, PCC, about the "Paleolithic" diet.  You can read it online here.  I explain the basic rationale for Paleo diets, some of the scientific support behind them, and how they can be helpful for people with certain health problems.  I focused in particular on the research of Dr. Staffan Lindeberg at the University of Lund, who has studied non-industrial populations using modern medical techniques and has also conducted clinical diet trials of the Paleo diet.

Junk Free January

Last year, Matt Lentzner organized a project called Gluten Free January, in which 546 people from around the world gave up gluten for one month.  The results were striking: a surprisingly large proportion of participants lost weight and experienced improved energy, better digestion and other benefits (1, 2).  This January, Lentzner organized a similar project called Junk Free January.  Participants can choose between four different diet styles:
  1. Gluten free
  2. Seed oil free (soybean, sunflower, corn oil, etc.)
  3. Sugar free
  4. Gluten, seed oil and sugar free
Wheat, seed oils and added sugar are three factors that, in my opinion, are probably linked to the modern "diseases of affluence" such as obesity, diabetes and coronary heart disease.  This is particularly true if the wheat is eaten in the form of white flour products, and the seed oils are industrially refined and used in high-heat cooking applications.

If you've been waiting for an excuse to improve your diet, why not join Junk Free January?

New Ancestral Diet Review Paper

Pedro Carrera-Bastos and his colleagues Maelan Fontes-Villalba, James H. O'Keefe, Staffan Lindeberg and Loren Cordain have published an excellent new review article titled "The Western Diet and Lifestyle and Diseases of Civilization" (1). The paper reviews the health consequences of transitioning from a traditional to a modern Western diet and lifestyle. Pedro is a knowledgeable and tireless advocate of ancestral, primarily paleolithic-style nutrition, and it has been my privilege to correspond with him regularly. His new paper is the best review of the underlying causes of the "diseases of civilization" that I've encountered. Here's the abstract:
It is increasingly recognized that certain fundamental changes in diet and lifestyle that occurred after the Neolithic Revolution, and especially after the Industrial Revolution and the Modern Age, are too recent, on an evolutionary time scale, for the human genome to have completely adapted. This mismatch between our ancient physiology and the western diet and lifestyle underlies many so-called diseases of civilization, including coronary heart disease, obesity, hypertension, type 2 diabetes, epithelial cell cancers, autoimmune disease, and osteoporosis, which are rare or virtually absent in hunter–gatherers and other non-westernized populations. It is therefore proposed that the adoption of diet and lifestyle that mimic the beneficial characteristics of the preagricultural environment is an effective strategy to reduce the risk of chronic degenerative diseases.
At 343 references, the paper is an excellent resource for anyone with an academic interest in ancestral health, and in that sense it reminds me of Staffan Lindeberg's book Food and Western Disease. One of the things I like most about the paper is that it acknowledges the significant genetic adaptation to agriculture and pastoralism that has occurred in populations that have been practicing it for thousands of years. It hypothesizes that the main detrimental change was not the adoption of agriculture, but the more recent industrialization of the food system. I agree.

I gave Pedro my comments on the manuscript as he was editing it, and he was kind enough to include me in the acknowledgments.

How lean should one be?

Loss of muscle mass is associated with aging. It is also associated with the metabolic syndrome, together with excessive body fat gain. It is safe to assume that having low muscle and high fat mass, at the same time, is undesirable.

The extreme opposite of that, achievable through natural means, would be to have as much muscle as possible and as little body fat as possible. People who achieve that extreme often look a bit like “buff skeletons”.

This post assumes that increasing muscle mass through strength training and proper nutrition is healthy. It looks into body fat levels, specifically how low body fat would have to be for health to be maximized.

I am happy to acknowledge that quite often I am working on other things and then become interested in a topic that is brought up by Richard Nikoley, and discussed by his readers (I am one of them). This post is a good example of that.

Obesity and the diseases of civilization

Obesity is strongly associated with the diseases of civilization, of which the prototypical example is perhaps type 2 diabetes. So much so that sometimes the impression one gets is that without first becoming obese, one cannot develop any of the diseases of civilization.

But this is not really true. For example, diabetes type 1 is also one of the diseases of civilization, and it often strikes thin people. Diabetes type 1 results from the destruction of the beta cells in the pancreas by a person’s own immune system. The beta cells in the pancreas produce insulin, which regulates blood glucose levels.

Still, obesity is undeniably a major risk factor for the diseases of civilization. It seems reasonable to want to move away from it. But how much? How lean should one be to be as healthy as possible? Given the ubiquity of U-curve relationships among health variables, there should be a limit below which health starts deteriorating.

Is the level of body fat of the gentleman on the photo below (from: ufcbettingtoday.com) low enough? His name is Fedor; more on him below. I tend to admire people who excel in narrow fields, be they intellectual or sport-related, even if I do not do anything remotely similar in my spare time. I admire Fedor.


Let us look at some research and anecdotal evidence to see if we can answer the question above.

The buff skeleton look is often perceived as somewhat unattractive

Being in the minority is not being wrong, but should make one think. Like Richard Nikoley’s, my own perception of the physique of men and women is that, the leaner they are, the better; as long as they also have a reasonable amount of muscle. That is, in my mind, the look of a stage-ready competitive natural bodybuilder is close to the healthiest look possible.

The majority’s opinion, however, seems different, at least anecdotally. The majority of women that I hear or read voicing their opinions on this matter seem to find the “buff skeleton” look somewhat unattractive, compared with a more average fit or athletic look. The same seems to be true for perceptions of males about females.

A little side note. From an evolutionary perspective, perceptions of ancestral women about men must have been much more important than perceptions of ancestral men about women. The reason is that the ancestral women were the ones applying sexual selection pressures in our ancestral past.

For the sake of discussion, let us define the buff skeleton look as one of a reasonably muscular person with a very low body fat percentage; pretty much only essential fat. That would be 10-13 percent for women, and 5-8 percent for men.

The average fit look would be 21-24 percent for women, and 14-17 percent for men. Somewhere in between, would be what we could call the athletic look, namely 14-20 percent for women, and 6-13 percent for men. These levels are exactly the ones posted on this Wikipedia article on body fat percentages, at the time of writing.
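As a rough illustration, the ranges quoted above can be turned into a small classifier. This is a hypothetical sketch using only the percentages listed in the text (note that, as quoted, the male "buff skeleton" and "athletic" ranges overlap slightly; the sketch resolves ties in favor of the leaner category):

```python
def look_category(body_fat_pct: float, sex: str) -> str:
    """Classify a body fat percentage into the 'looks' described in the text.

    Ranges are those quoted from Wikipedia at the time of writing;
    boundaries are approximate, and the quoted male ranges overlap,
    so categories are checked leanest-first.
    """
    ranges = {
        "female": [("buff skeleton", 10, 13), ("athletic", 14, 20), ("average fit", 21, 24)],
        "male": [("buff skeleton", 5, 8), ("athletic", 6, 13), ("average fit", 14, 17)],
    }
    for name, low, high in ranges[sex]:
        if low <= body_fat_pct <= high:
            return name
    return "outside quoted ranges"
```

For example, a man at 7 percent body fat falls in the "buff skeleton" range, while a woman at 16 percent falls in the "athletic" range.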

From an evolutionary perspective, attractiveness to members of the opposite sex should be correlated with health. Unless we are talking about a costly trait used in sexual selection by our ancestors; something analogous to the male peacock’s train.

But costly traits are usually ornamental, and are often perceived as attractive even in exaggerated forms. What prevents male peacock trains from becoming the size of a mountain is that they also impair survival. Otherwise they would keep growing. The peahens find them sexy.

Being ripped is not always associated with better athletic performance

Then there is the argument that if you carried some extra fat around the waist, then you would not be able to fight, hunt etc. as effectively as you could if you were living 500,000 years ago. Evolution does not “like” that, so it is an unnatural and maladaptive state achieved by modern humans.

Well, certainly the sport of mixed martial arts (MMA) is not the best point of comparison for Paleolithic life, but it is not such a bad model either. Look at this photo of Fedor Emelianenko (on the left, clearly not so lean) next to Andrei Arlovski (fairly lean). Fedor is also the one on the photo at the beginning of this post.

Fedor weighed about 220 lbs at 6’; Arlovski 250 lbs at 6’4’’. In fact, Arlovski is one of the leanest and most muscular MMA heavyweights, and also one of the most highly ranked. Now look at Fedor in action (see this YouTube video), including what happened when Fedor fought Arlovski, at around the 4:28 mark. Fedor won by knockout.

Both Fedor and Arlovski are heavyweights; which means that they do not have to “make weight”. That is, they do not have to lose weight to abide by the regulations of their weight category. Since both are professional MMA fighters, among the very best in the world, the weight at which they compete is generally the weight that is associated with their best performance.

Fedor was practically unbeaten until recently, even though he faced a very high level of competition. Before Fedor there was another professional fighter that many thought was from Russia, and who ruled the MMA heavyweight scene for a while. His name is Igor Vovchanchyn, and he is from Ukraine. At 5’8’’ and 230 lbs in his prime, he was a bit chubby. This YouTube video shows him in action; and it is brutal.

A BMI of about 25 seems to be the healthiest for long-term survival

Then we have this post by Stargazey, a blogger who likes science. Toward the end of the post, she discusses a study suggesting that a body mass index (BMI) of about 25 seems to be the healthiest for long-term survival. That BMI is at the boundary between normal weight and overweight. The study suggests that being either underweight or obese is unhealthy, in terms of long-term survival.

The BMI is calculated as an individual’s body weight in kilograms divided by the square of the individual’s height in meters. A limitation of its use here is that the BMI is a more reliable proxy for body fat percentage for women than for men, and can be particularly misleading when applied to muscular men.
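For concreteness, here is a minimal sketch of the BMI calculation. The unit-conversion factors are standard, and the illustrative numbers are Fedor's listed stats quoted earlier in the post (220 lbs at 6'):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index: weight in kilograms divided by height in meters, squared."""
    return weight_kg / height_m ** 2

# Standard conversion factors.
LB_TO_KG = 0.45359237
FT_TO_M = 0.3048

# Fedor's listed stats: 220 lbs at 6 feet.
fedor_bmi = bmi(220 * LB_TO_KG, 6 * FT_TO_M)  # ≈ 29.8
```

By BMI alone Fedor lands near the top of the "overweight" band, which illustrates the limitation above: the number cannot distinguish muscle from fat.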

The traditional Okinawans are not super lean

The traditional Okinawans (here is a good YouTube video) are the longest living people in the world. Yet, they are not super lean, not even close. They are not obese either. The traditional Okinawans are those who kept to their traditional diet and lifestyle, which seems to be less and less common these days.

There are better videos on the web that could be used to illustrate this point. Some even show shirtless traditional karate instructors and students from Okinawa; I had seen them before but could not find them again. Nearly all of those karate instructors and students were a bit chubby, but not obese. By the way, karate was invented in Okinawa.

The fact that the traditional Okinawans are not ripped does not mean that the level of fat that is healthy for them is also healthy for someone with a different genetic makeup. It is important to remember that the traditional Okinawans share a common ancestry.

What does this all mean?

Some speculation below, but before that, let me say this: as counterintuitive as it may sound, excessive abdominal fat may be associated with higher insulin sensitivity in some cases. This post discusses a study in which the members of a treatment group were more insulin sensitive than the members of a control group, even though the former were much fatter; particularly in terms of abdominal fat.

It is possible that the buff skeleton look is often perceived as somewhat unattractive because of cultural reasons, and that it is associated with the healthiest state for humans. However, it seems a bit unlikely that this applies as a general rule to everybody.

Another possibility, which appears to be more reasonable, is that the buff skeleton look is healthy for some, and not for others. After all, body fat percentage, like fat distribution, seems to be strongly influenced by our genes. We can adapt in ways that go against genetic pressures, but that may be costly in some cases.

There is a great deal of genetic variation in the human species, and much of it may be due to relatively recent evolutionary pressures.

Life is not that simple!

References

Buss, D.M. (1995). The evolution of desire: Strategies of human mating. New York, NY: Basic Books.

Cartwright, J. (2000). Evolution and human behavior: Darwinian perspectives on human nature. Cambridge, MA: The MIT Press.

Miller, G.F. (2000). The mating mind: How sexual choice shaped the evolution of human nature. New York, NY: Doubleday.

Zahavi, A. & Zahavi, A. (1997). The Handicap Principle: A missing piece of Darwin’s puzzle. Oxford, England: Oxford University Press.

China Study Problems of Interpretation

The China Study was an observational study that collected a massive amount of information about diet and health in 65 different rural regions of China. It's been popularized by Dr. T. Colin Campbell, who has argued that the study shows that plant foods are generally superior to animal foods for health, and that even a small amount of animal food is harmful. Campbell's book has been at the center of the strict vegetarian (vegan) movement since its publication.

Richard from Free the Animal just passed on some information that many of you may find interesting. A woman named Denise Minger recently published a series of posts on the China Study. She looked up the raw data and applied statistics to it. It's the most thorough review of the data I've seen so far. She raises some points about Campbell's interpretation of the data that are frankly disturbing. As I like to say, the problem is usually not in the data-- it's in the interpretation.

One of the things Minger points out is that wheat intake had a massive correlation with coronary heart disease-- one of the strongest correlations the investigators found. Is that because wheat causes CHD, or is it because wheat-eating regions tend to be farther north and thus have worse vitamin D status? I don't know, but it's an interesting observation nevertheless. Check out Denise Minger's posts... if you have the stamina:

The China Study: Fact or Fallacy

Also, see posts on the China Study by Richard Nikoley, Chris Masterjohn and Anthony Colpo:

T. Colin Campbell's the China Study
The Truth About the China Study
The China Study: More Vegan Nonsense

And my previous post on the association between wheat intake and obesity in China:

Wheat in China

In Search of Traditional Asian Diets

It's been difficult for me to find good information on Asian diets prior to modernization. Traditional Chinese, Taiwanese and Japanese diets are sometimes portrayed as consisting mostly of white rice, with vegetables and a bit of meat and soy, but I find that implausible. Rice doesn't grow everywhere, and removing all the bran was prohibitively labor-intensive before the introduction of modern machine milling. One hundred years ago, bran was partially removed by beating or grinding in a mortar and pestle, as it still is in parts of rural Asia today. Only the wealthy could afford true white rice.

Given the difficulty of growing rice in most places, and of milling it by hand, the modern widespread consumption of white rice in Asia must be a 20th century phenomenon, originating in the last 20-100 years depending on location. Therefore, white rice consumption does not predate the emergence of the "diseases of civilization" in Asia.

In the book Western Diseases: Their Emergence and Prevention, there are several accounts of traditional Asian diets I find interesting.

Taiwan in 1980

The staple constituent of the diet is polished white rice. Formerly in the poorer areas along the sea coast the staple diet was sweet potato, with small amounts of white rice added. Formerly in the mountains sweet potato, millet and taro were the staple foods. During the last 15 years, with the general economic development of the whole island, white polished rice has largely replaced other foods. There is almost universal disinclination to eat brown (unpolished) rice, because white rice is more palatable, it bears kudos, cooking is easier and quicker, and it can be stored for a much longer period.

Traditionally, coronary heart disease and high blood pressure were rare, but the prevalence is now increasing rapidly. Stroke is common. Diabetes was rare but is increasing gradually.

Mainland China

China is a diverse country, and the food culture varies by region.

Snapper (1965)… quoted an analysis by Guy and Yeh of Peiping (Peking) diets in 1938. There was a whole cereal/legume/vegetable diet for poorer people and a milled-cereal/meat/vegetable diet for the richer people.

Symptoms of vitamin A, C and D deficiency were common in the poor, although coronary heart disease and high blood pressure were rare. Diabetes occurred at a higher rate than in most traditionally-living populations.

Japan

On the Japanese island of Okinawa, the traditional staple is the sweet potato, with a smaller amount of rice eaten as well. Seafood, vegetables, pork and soy are also on the menu. In Akira Kurosawa’s movie Seven Samurai, set in 16th century mainland Japan, peasants ate home-processed millet and barley, while the wealthy ate white rice. Although a movie may not be the best source of information, I suspect it has some historical basis.


White Rice: a Traditional Asian Staple?

It depends on your perspective. How far back do you have to go before you can call a food traditional? Many people's grandparents ate white rice, but I doubt their great-great-grandparents ate it frequently. White rice may have been a staple for the wealthy for hundreds of years in some places. But for most of Asia, in the last few thousand years, it was probably a rare treat. The diet most likely resembled that of many non-industrial African cultures: an assortment of traditionally prepared grains, root vegetables, legumes, vegetables and a little meat.

Please add any additional information you may have about traditional Asian diets to the comments section.

Intermittent fasting as a form of liberation

I have been doing a lot of reading over the years on isolated hunter-gatherer populations; see three references at the end of this post, all superb sources (Chagnon’s book on the Yanomamo, in particular, is an absolute page turner). I also take every opportunity I have to talk with anthropologists and other researchers who have had field experience with hunter-gatherer groups. Even yesterday I was talking to a researcher who spent many years living among isolated native Brazilian groups in the Amazon.

Maybe I have been reading too much into those descriptions, but it seems to me that one distinctive feature of many adults in hunter-gatherer populations, when compared with adults in urban populations, is that the hunter-gatherers are a lot less obsessed with food.

Interestingly, this seems to be a common characteristic of physically active children. They want to play, and eating is often an afterthought, an interruption of play. Sedentary children, who play indoors, can eat while they play, and often want to.

Perhaps adult hunter-gatherers are more like physically active children than adults in modern urban societies. Maybe this is one of the reasons why adult hunter-gatherers have much less body fat. Take a look at the photo below, from Wikipedia. It was reportedly taken in 1939, and shows three Australian Aboriginals.


Hunter-gatherers do not have supermarkets, and active children need food to grow healthy. Adult urbanites have easy access to an abundance of food in supermarkets, and they do not need food to grow, at least not vertically.

Still, adult hunter-gatherers and children who are physically active are generally much less concerned about food than adults in modern urban societies.

It seems illogical, a bit like a mental disorder of some sort that has been plaguing adults in modern urban societies. A mental disorder that contributes to making them obese.

Modern urbanites are constantly worried about food. And also about material possessions, bills, taxes etc. They want to accumulate as much wealth as their personal circumstances allow them, so that they can retire and pay for medical expenses. They must worry about paying for their children’s education. Food is one of their many worries; for many it is the biggest of them all. Too much food makes you fat, too little makes you lose muscle (not really true, but a widespread belief).

Generally speaking, intermittent fasting is very good for human health. Humans seem to have evolved to be episodic eaters, being in the fasted state most of the time. This is perhaps why intermittent fasting significantly reduces levels of inflammation markers, promotes the recycling of “messed up” proteins (e.g., glycated proteins), and increases leptin and insulin sensitivity. It is something natural. I am talking about fasting 24 h at a time (or a bit more, but not much more than that), with plenty of water but no calories. Even skipping a meal now and then, when you are busy with other things, is a form of intermittent fasting.

Now, the idea that our hominid ancestors were starving most of the time does not make a lot of sense, at least not when we think about Homo sapiens, as opposed to earlier ancestors (e.g., the Australopithecines). Even archaic Homo sapiens, dating back to 500 thousand years ago, were probably too smart to be constantly starving. Moreover, the African savannas, where Homo sapiens emerged, were not the type of environment where a smart and social species would be hungry for too long.

Yet, intermittent fasting probably happened frequently among our Homo sapiens ancestors, for the same reason that it happens among hunter-gatherers and active children today. My guess is that, by and large, our ancestors were simply not too worried about food. They ate it because they were hungry, probably at regular times – as most hunter-gatherers do. They skipped meals from time to time.

They certainly did not eat to increase their metabolism, raise their thyroid hormone levels, or have a balanced macronutrient intake.

There were no doubt special occasions when people gathered for a meal as a social activity, but probably the focus was on the social activity, and secondarily on the food.

Of course, they did not have doughnuts around, or foods engineered to make people addicted to them. That probably made things a little easier.

Successful body fat loss through intermittent fasting requires a change in mindset.

References:

Boaz, N.T., & Almquist, A.J. (2001). Biological anthropology: A synthetic approach to human evolution. Upper Saddle River, NJ: Prentice Hall.

Chagnon, N.A. (1977). Yanomamo: The fierce people. New York, NY: Holt, Rinehart and Winston.

Price, W.A. (2008). Nutrition and physical degeneration. La Mesa, CA: Price-Pottenger Nutrition Foundation.

Adiponectin supplementation: Body fat loss

Adiponectin is a hormone exclusively secreted by body fat. This hormone has been recently gaining attention from researchers because of some of its functions. Two important ones are the regulation of glucose and fat metabolism.

Elevated levels of adiponectin are associated with increased insulin sensitivity, and increased fat catabolism (i.e., fat burning). And these associations appear to be causal. That is, adiponectin levels do not seem to be only markers, but causes of increased insulin sensitivity and fat catabolism.

In other words, an increase in circulating adiponectin seems to lead to increased insulin sensitivity and increased fat catabolism. Insulin sensitivity is the opposite of insulin resistance. The latter is a precursor to diabetes type 2, and is associated with elevated fasting and postprandial (i.e., after a meal) glucose levels.

Adiponectin also seems to work closely with leptin, another hormone implicated in a number of diseases of civilization. It appears that adiponectin and leptin modulate each other’s secretion and effects in metabolic processes.

So what do we do to increase our levels of circulating adiponectin?

Well, apparently there is only one guaranteed way, and that is to lose body fat!

Adiponectin is unique among hormones secreted by body fat in that it increases as body fat decreases. Other important body fat hormones, such as leptin, decrease with body fat loss.

The figure below (from: Poppitt et al., 2008) shows a graph where adiponectin levels are plotted against body mass index (BMI). BMI is strongly correlated with body fat percentage.

As you can see from the figure above, adiponectin levels more than double when BMI goes from 26 to 20. One does not need to be obese to take advantage of this effect, and to benefit from having increased adiponectin levels.

The linear (Pearson) correlation between BMI and adiponectin levels is indicated as a high 0.551. The fluctuations around the line (the "line" looks more like a quasi-linear curve obtained through quadratic regression), which are why the correlation is not 1, are probably due chiefly to two factors:

    - BMI is not a very precise measure of body fat. A very muscular person will have a high BMI and low body fat. That person will consequently have much higher adiponectin levels than an obese person with equal BMI.

    - Adiponectin levels are naturally higher in women than in men. This is another point in favor of adiponectin, as women have always been the evolutionary bottleneck among our Paleolithic ancestors.
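For readers curious about the statistic itself, the linear (Pearson) correlation mentioned above is computed from paired samples as their covariance divided by the product of their standard deviations. A minimal sketch, with toy numbers rather than the study's data:

```python
import math

def pearson_r(xs, ys):
    """Pearson (linear) correlation coefficient between two equal-length samples."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)

# Toy data: a perfectly linear relationship yields r = 1.0;
# a perfectly inverse one yields r = -1.0.
pearson_r([1, 2, 3, 4], [2, 4, 6, 8])   # 1.0
pearson_r([1, 2, 3, 4], [8, 6, 4, 2])   # -1.0
```

A value of 0.551 therefore indicates a fairly strong, but far from perfect, linear association; the two factors listed above account for much of the remaining scatter.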

Now you know why doctors prescribe weight loss to patients with diabetes type 2.

And, when we look at various hunter-gatherer groups that were apparently free of diseases of civilization prior to westernization, there are only a few common denominators. Diet was not one of them, as Weston Price and others have shown us, at least not in the sense of what they included in their diet.

One of the few common denominators was arguably the fact that those hunter-gatherers typically had relatively low levels of body fat; an almost universal feature among non-westernized hunter-gatherers.

Reference:

Poppitt, S.D. et al. (2008). Postprandial response of adiponectin, interleukin-6, tumor necrosis factor-α, and C-reactive protein to a high-fat dietary load. Nutrition, 24(4), 322-329.

Lindeberg on Obesity

I'm currently reading Dr. Staffan Lindeberg's magnum opus Food and Western Disease, recently published in English for the first time. Dr. Lindeberg is one of the world's leading experts on the health and diet of non-industrial cultures, particularly in Papua New Guinea. The book contains 2,034 references. It's also full of quotable statements. Here's what he has to say about obesity:
Middle-age spread is a normal phenomenon - assuming you live in the West. Few people are able to maintain their [youthful] waistline after age 50. The usual explanation - too little exercise and too much food - does not fully take into account the situation among traditional populations. Such people are usually not as physically active as you may think, and they usually eat large quantities of food.

Overweight has been extremely rare among hunter-gatherers and other traditional cultures [18 references]. This simple fact has been quickly apparent to all foreign visitors...

The Kitava study measured height, weight, waist circumference, subcutaneous fat thickness at the back of the upper arm (triceps skinfold) and upper arm circumference on 272 persons ages 4-86 years. Overweight and obesity were absent and average [body mass index] was low across all age groups. ...no one was larger around their waist than around their hips.

...The circumference of the upper arm [mostly indicating muscle mass] was only negligibly smaller on Kitava [compared with Sweden], which indicates that there was no malnutrition. It is obvious from our investigations that lack of food is an unknown concept, and that the surplus of fruits and vegetables regularly rots or is eaten by dogs.

The Population of Kitava occupies a unique position in the world in terms of the negligible effect that the Western lifestyle has had on the island.
The only obese Kitavans Dr. Lindeberg observed were two people who had spent several years off the island living a modern, urban lifestyle, and were back on Kitava for a visit.

I'd recommend this book to anyone who has a scholarly interest in health and nutrition, and somewhat of a background in science and medicine. It's extremely well referenced, which makes it much more valuable.

Vitamin D levels: Sunlight, age, and toxicity

Calcidiol is a pre-hormone that is produced from vitamin D3 in the liver. The blood concentration of calcidiol is considered to be a reliable indicator of vitamin D status. In the research literature, calcidiol is usually referred to as 25-hydroxyvitamin D, or 25(OH)D. Calcidiol is converted in the kidneys into calcitriol, which is the active form of vitamin D.

The table below (from: Vieth, 1999; full reference at the end of this post; click on it to enlarge) shows the average blood vitamin D levels of people living or working in sun-rich environments. To convert from nmol/L to ng/mL, divide by 2.496. For example, 100 nmol/L = 100 / 2.496 ng/mL = 40.1 ng/mL. At the time of this writing, Vieth (1999) had 692 citations on Google Scholar, and probably more than that on Web of Science. This article has had, and continues to have, a high impact among researchers.
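For convenience, the conversion described above can be written as a pair of small helper functions. This is just a sketch in Python; the constant 2.496 is the conversion factor from the text, and the function names are my own:

```python
# Convert vitamin D (calcidiol) blood concentrations between the two
# common units. 1 ng/mL = 2.496 nmol/L, so dividing nmol/L by 2.496
# gives ng/mL, and multiplying ng/mL by 2.496 gives nmol/L.
NMOL_PER_NG_ML = 2.496

def nmol_to_ng(nmol_per_l):
    """Convert a concentration in nmol/L to ng/mL."""
    return nmol_per_l / NMOL_PER_NG_ML

def ng_to_nmol(ng_per_ml):
    """Convert a concentration in ng/mL to nmol/L."""
    return ng_per_ml * NMOL_PER_NG_ML

# The example from the text: 100 nmol/L is about 40.1 ng/mL.
print(round(nmol_to_ng(100), 1))  # 40.1
```

The same helpers reproduce the other figures in this post, e.g. 163 nmol/L is about 65 ng/mL and 225 nmol/L is about 90 ng/mL.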


The maximum average level of blood (or serum) vitamin D shown in the table is 163 nmol/L (65 ng/mL). Given that the human body produces vitamin D naturally from sunlight, it is reasonable to assume that those blood vitamin D levels are not in the toxic range. In fact, one of the individuals, a farmer in Puerto Rico, had a level of 225 nmol/L (90 ng/mL). That individual had no signs of toxicity.

Several studies show that pre-sunburn full-body exposure to sunlight is equivalent to an oral vitamin D intake of approximately 250 µg (10,000 IU).

In spite of claims to the contrary, vitamin D production based on sunlight does not cease after 40 years of age or so. Studies reviewed by Vieth suggest that among the elderly (i.e., those aged 65 or above) pre-sunburn full-body exposure to sunlight is equivalent to an oral vitamin D intake of 218 µg (8,700 IU).

Sunlight-induced vitamin D production does seem to decrease with age, but not dramatically.

Post-sunburn sunlight exposure does not increase vitamin D production. Since each person is different, a good rule of thumb to estimate the number of minutes of sunlight exposure needed to maximize vitamin D production is the number of minutes preceding sunburn. For a light-skinned person, this can be as little as 7 minutes.

Vitamin D accumulation in the body follows a battery-like pattern, increasing and decreasing gradually. The figure below, from Vieth’s article, shows the gradual increase in blood vitamin D concentrations following the start of daily supplementation. This suggests that levels start to plateau at around 1 month, with higher levels reaching a plateau after 2 months.


While sunlight exposure does not lead to toxic levels of vitamin D, oral intake may. Below is a figure, also from Vieth’s article, that plots blood levels of vitamin D against oral intake amounts. The X’s indicate points at which intoxication symptoms were observed. Intoxication typically starts at intakes of around 50,000 IU, but one individual displayed signs of intoxication at 10,000 IU. That individual had received a megadose that was supposed to provide vitamin D for an extended period of time.


Non-toxic levels of 10,000 IU are achieved naturally through sunlight exposure. This applies to modern humans, and probably applied to our Paleolithic ancestors. Yet, modern humans normally limit their sun exposure and vitamin D intake to levels (400 IU) that are only sufficient to prevent osteomalacia, the softening of the bones due to poor mineralization.

Very likely the natural production of 10,000 IU based on sunlight was adaptive in our evolutionary past, and also necessary for good health today. This is consistent with the many reports of diseases associated with chronic vitamin D deficiency, even at levels that avoid osteomalacia. Among those diseases are: hypertension, tuberculosis, various types of cancer, gingivitis, multiple sclerosis, chronic inflammation, seasonal affective disorder, and premature senescence.

Reference:

Reinhold Vieth (May 1999). Vitamin D supplementation, 25-hydroxyvitamin D concentrations, and safety. American Journal of Clinical Nutrition, Vol. 69, No. 5, 842-856.

Vitamin D deficiency, seasonal depression, and diseases of civilization

George Hamilton admits that he has been addicted to sunbathing for much of his life. The photo below (from: phoenix.fanster.com), shows him at the age of about 70. In spite of possibly too much sun exposure, he looks young for his age, in remarkably good health, and free from skin cancer. How come? Maybe his secret is vitamin D.


Vitamin D is a fat-soluble pro-hormone; technically speaking, not actually a vitamin. That is, it is a precursor to hormones, known as calciferol hormones (calcidiol and calcitriol). The hormones synthesized by the human body from vitamin D have a number of functions. One of these functions is the regulation of calcium in the bloodstream via the parathyroid glands.

The biological design of humans suggests that we are meant to obtain most of our vitamin D from sunlight exposure. Vitamin D is produced from cholesterol as the skin is exposed to sunlight. This is one of the many reasons (see here for more) why cholesterol is very important for human health.

Seasonal depression is a sign of vitamin D deficiency. This often occurs during the winter, when sun exposure is significantly decreased, a phenomenon known as seasonal affective disorder (SAD). This alone is a cause of many other health problems, as depression (even if it is seasonal) may lead to obesity, injury due to accidents, and even suicide.

For most individuals, as little as 10 minutes of sunlight exposure generates many times the recommended daily value of vitamin D (400 IU), whereas a typical westernized diet yields about 100 IU. The recommended 400 IU (1 IU = 25 ng) is believed by many researchers to be too low, and levels of 1,000 IU or more to be advisable. The upper limit for optimal health seems to be around 10,000 IU. It is unlikely that this upper limit can be exceeded due to sunlight exposure, as noted below.

Cod liver oil is a good source of vitamin D, with one tablespoon providing approximately 1,360 IU. Certain oily fish species are also good sources; examples are herring, salmon and sardines. For optimal vitamin and mineral intake and absorption, it is a good idea to eat these fish whole. (See here for a post on eating sardines whole.)

Periodic sun exposure (e.g., every few days) has a similar effect to daily exposure, because vitamin D has a half-life of about 25 days. That is, without any use by the body, it would take approximately 25 days for vitamin D levels to fall to half of their maximum levels.
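A half-life of about 25 days implies the standard first-order (exponential) decay model. The sketch below is a simplified illustration of why levels fall only modestly over a few days without exposure; it assumes no replenishment and no use by the body, and the function name is my own:

```python
HALF_LIFE_DAYS = 25  # approximate half-life of vitamin D, per the text

def remaining_fraction(days):
    """Fraction of vitamin D remaining after `days`, assuming
    first-order (exponential) decay with a 25-day half-life and
    no replenishment in the meantime."""
    return 0.5 ** (days / HALF_LIFE_DAYS)

# After one full half-life (25 days), half remains.
print(round(remaining_fraction(25), 2))  # 0.5

# After only 3 days without sun, most of the store is still there,
# which is why periodic exposure works about as well as daily exposure.
print(round(remaining_fraction(3), 2))   # 0.92
```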

The body responds to vitamin D intake in a "battery-like" manner, fully replenishing the battery over a certain amount of time. This could be achieved by moderate (pre-sunburn) and regular sunlight exposure over a period of 1 to 2 months for most people. Like most fat-soluble vitamins, vitamin D is stored in fat tissue, and slowly used by the body.

Whenever sun exposure is limited or sunlight is scarce for long periods of time, supplementation may be needed. Excessive supplementation of vitamin D (i.e., significantly more than 10,000 IU per day) can cause serious problems, as the relationship between vitamin D levels and health complications follows a U-curve pattern. These problems can be acute or chronic. In other words, too little vitamin D is bad for our health, and too much is also bad.

The figure below (click on it to enlarge), from Tuohimaa et al. (2009), shows two mice. The one on the left has a genetic mutation that leads to high levels of vitamin D-derived hormones in the blood. Both mice are about the same age (8 months), but the mutant mouse shows marked signs of premature aging.


It is important to note that the skin wrinkles of the mouse on the left have nothing to do with sun exposure; they are associated with excessive vitamin D-derived hormone levels in the body (hypervitaminosis D) and related effects. They are a sign of accelerated aging.

Production of vitamin D and related hormones based on sunlight exposure is tightly regulated by various physiological and biochemical mechanisms. Because of that, it seems to be impossible for someone to develop hypervitaminosis D due to sunlight exposure. This does NOT seem to be the case with vitamin D supplementation, which can cause hypervitaminosis D.

In addition to winter depression, chronic vitamin D deficiency is associated with an increased risk of the following chronic diseases: osteoporosis, cancer, diabetes, autoimmune disorders, hypertension, and atherosclerosis.

The fact that these diseases are also known as the diseases of civilization should not be surprising to anyone. Industrialization has led to a significant decrease in sunlight exposure. In cold weather, our Paleolithic ancestors would probably seek sunlight. That would be one of their main sources of warmth. In fact, one does not have to go back that far in time (100 years should be enough) to find much higher average levels of sunlight exposure than today.

Modern humans, particularly in urban environments, have artificial heating, artificial lighting, and warm clothes. There is little or no incentive for them to try to increase their skin's sunlight exposure in cold weather.

References:

W. Hoogendijk, A. Beekman, D. Deeg, P. Lips, B. Penninx. Depression is associated with decreased 25-hydroxyvitamin-D and increased parathyroid hormone levels in old age. European Psychiatry, Volume 24, Supplement 1, 2009, Page S317.

P. Tuohimaa, T. Keisala, A. Minasyan, J. Cachat, A. Kalueff. Vitamin D, nervous system and aging. Psychoneuroendocrinology, Volume 34, Supplement 1, December 2009, Pages S278-S286.

Cancer patterns in Inuit populations: 1950-1997

Some types of cancer have traditionally been higher among the Inuit than in other populations, at least according to data from the 1950s, when a certain degree of westernization had already occurred. The incidence of the following types of cancer among the Inuit has been particularly high: nasopharynx, salivary gland, and oesophageal.

The high incidence of these “traditional” types of cancer among the Inuit is hypothesized to have a strong genetic basis. Nevertheless, some also believe these cancers to be associated with practices that were arguably not common among the ancestral Inuit, such as preservation of fish and meat with salt.

Genetic markers in the present Inuit population show a shared Asian heritage, which is consistent with the higher incidence of similar types of cancer among Asians, particularly those consuming large amounts of salt-preserved foods. (The Inuit are believed to originate from East Asia, having crossed the Bering Strait about 5,000 years ago.)

The incidence of nasopharynx, salivary gland, and oesophageal cancer has been relatively stable among the Inuit from the 1950s on. More modern lifestyle-related cancers, on the other hand, have increased dramatically. Examples are cancers of the lung, colon, rectum, and female breast.

The figure below (click on it to enlarge), from Friborg & Melbye (2008), shows the incidence of more traditional and modern lifestyle-related cancers among Inuit males (top) and females (bottom).


Two main lifestyle changes are associated with this significant increase in modern lifestyle-related cancers. One is increased consumption of tobacco. The other, you guessed it, is a shift to refined carbohydrates, from animal protein and fat, as the main source of energy.

Reference:

Friborg, J.T., & Melbye, M. (2008). Cancer patterns in Inuit populations. The Lancet Oncology, 9(9), 892-900.

Malocclusion: Disease of Civilization, Part IX

A Summary

For those who didn't want to wade through the entire nerd safari, I offer a simple summary.

Our ancestors had straight teeth, and their wisdom teeth came in without any problem. The same continues to be true of a few non-industrial cultures today, but it's becoming rare. Wild animals also rarely suffer from orthodontic problems.

Today, the majority of people in the US and other affluent nations have some type of malocclusion, whether it's crooked teeth, overbite, open bite or a number of other possibilities.

There are three main factors that I believe contribute to malocclusion in modern societies:
  1. Maternal nutrition during the first trimester of pregnancy. Vitamin K2, found in organs, pastured dairy and eggs, is particularly important. We may also make small amounts from the K1 found in green vegetables.
  2. Sucking habits from birth to age four. Breast feeding protects against malocclusion. Bottle feeding, pacifiers and finger sucking probably increase the risk of malocclusion. Cup feeding and orthodontic pacifiers are probably acceptable alternatives.
  3. Food toughness. The jaws probably require stress from tough food to develop correctly. This can contribute to the widening of the dental arch until roughly age 17. Beef jerky, raw vegetables, raw fruit, tough cuts of meat and nuts are all good ways to exercise the jaws.
And now, an example from the dental literature to motivate you. In 1976, Dr. H. L. Eirew published an interesting paper in the British Dental Journal. He took a pair of 12-year-old identical twins with identical class I malocclusions (crowded incisors) and gave them two different orthodontic treatments. Here's a picture of both girls before the treatment:


In one, he made more space in her jaws by extracting teeth. In the other, he put in an apparatus that broadened her dental arch, which roughly mimics the natural process of arch growth during childhood and adolescence. This had profound effects on the girls' subsequent occlusion and facial structure:

The girl on the left had teeth extracted, while the girl on the right had her arch broadened. Under ideal circumstances, this is what should happen naturally during development. Notice any differences?

Thanks to the Weston A Price foundation's recent newsletter for the study reference.

Malocclusion: Disease of Civilization, Part VIII

Three Case Studies in Occlusion

In this post, I'll review three cultures with different degrees of malocclusion over time, and try to explain how the factors I've discussed may have played a role.

The Xavante of Simoes Lopes

In 1966, Dr. Jerry D. Niswander published a paper titled "The Oral Status of the Xavantes of Simoes Lopes", describing the dental health and occlusion of 166 Brazilian hunter-gatherers from the Xavante tribe (free full text). This tribe was living predominantly according to tradition, although they had begun trading with the post at Simoes Lopes for some foods. They made little effort to clean their teeth. They were mostly but not entirely free of dental cavities:
Approximately 33% of the Xavantes at Simoes Lopes were caries free. Neel et al. (1964) noted almost complete absence of dental caries in the Xavante village at Sao Domingos. The difference in the two villages may at least in part be accounted for by the fact that, for some five years, the Simoes Lopes Xavante have had access to sugar cane, whereas none was grown at Sao Domingos. It would appear that, although these Xavantes still enjoy relative freedom from dental caries, this advantage is disappearing after only six years of permanent contact with a post of the Indian Protective Service.
The most striking thing about these data is the occlusion of the Xavante. 95 percent had ideal occlusion. The remaining 5 percent had nothing more than a mild crowding of the incisors (front teeth). Niswander didn't observe a single case of underbite or overbite. This would have been truly exceptional in an industrial population. Niswander continues:
Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition. At 18-20 years of age, the teeth were so worn as to almost totally obliterate the cusp patterns, leaving flat chewing surfaces.
The Xavante were clearly hard on their teeth, and their predominantly hunter-gatherer lifestyle demanded it. They practiced a bit of "rudimentary agriculture" of corn, beans and squash, which would sustain them for a short period of the year devoted to ceremonies. Dr. James V. Neel describes their diet (free full text):
Despite a rudimentary agriculture, the Xavante depend very heavily on the wild products which they gather. They eat numerous varieties of roots in large quantities, which provide a nourishing, if starchy, diet. These roots are available all year but are particularly important in the Xavante diet from April to June in the first half of the dry season when there are no more fruits. The maize harvest does not last long and is usually saved for a period of ceremonies. Until the second harvest of beans and pumpkins, the Xavante subsist largely on roots and palmito (Chamacrops sp.), their year-round staples.

From late August until mid-February, there are also plenty of nuts and fruits available. The earliest and most important in their diet is the carob or ceretona (Ceretona sp.), sometimes known as St. John's bread. Later come the fruits of the buriti palm (Mauritia sp.) and the piqui (Caryocar sp.). These are the basis of the food supply throughout the rainy season. Other fruits, such as mangoes, genipapo (Genipa americana), and a number of still unidentified varieties are also available.

The casual observer could easily be misled into thinking that the Xavante "live on meat." Certainly they talk a great deal about meat, which is the most highly esteemed food among them, in some respects the only commodity which they really consider "food" at all... They do not eat meat every day and may go without meat for several days at a stretch, but the gathered products of the region are always available for consumption in the community.

Recently, the Xavante have begun to eat large quantities of fish.
The Xavante are an example of humans living an ancestral lifestyle, and their occlusion shows it. They have the best occlusion of any living population I've encountered so far. Here's why I think that's the case:
  • A nutrient-rich, whole foods diet, presumably including organs.
  • On-demand breast feeding for two or more years.
  • No bottle-feeding or modern pacifiers.
  • Tough foods on a regular basis.
I don't have any information on how the Xavante have changed over time, but Niswander did present data on another nearby (and genetically similar) tribe called the Bakairi that had been using a substantial amount of modern foods for some time. The Bakairi, living right next to the Xavante but eating modern foods from the trading post, had 9 times more malocclusion and nearly 10 times more cavities than the Xavante. Here's what Niswander had to say:
Severe abrasion was not apparent among the Bakairi, and the dental arches did not appear as broad and massive as in the Xavantes. Dental caries and malocclusion were strikingly more prevalent; and, although not recorded systematically, the Bakairi also showed considerably more periodontal disease. If it can be assumed that the Bakairi once enjoyed a freedom from dental disease and malocclusion equal to that now exhibited by the Xavantes, the available data suggest that the changes in occlusal patterns as well as caries and periodontal disease have been too rapid to be accounted for by an hypothesis involving relaxed [genetic] selection.
The Masai of Kenya

The Masai are traditionally a pastoral people who live almost exclusively from their cattle. In 1945, and again in 1952, Dr. J. Schwartz examined the teeth of 408 and 273 Masai, respectively (#1 free full text; #2 ref). In the first study, he found that 8 percent of Masai showed some form of malocclusion, while in the second study, only 0.4 percent of Masai were maloccluded. Although we don't know what his precise criteria were for diagnosing malocclusion, these are still very low numbers.

In both studies, 4 percent of Masai had cavities. Between the two studies, Schwartz found 67 cavities in 21,792 teeth, or 0.3 percent of teeth affected. This is almost exactly what Dr. Weston Price found when he visited them in 1935. From Nutrition and Physical Degeneration, page 138:
In the Masai tribe, a study of 2,516 teeth in eighty-eight individuals distributed through several widely separated manyatas showed only four individuals with caries. These had a total of ten carious teeth, or only 0.4 per cent of the teeth attacked by tooth decay.
Dr. Schwartz describes their diet:
The principal food of the Masai is milk, meat and blood, the latter obtained by bleeding their cattle... The Masai have ample means with which to get maize meal and fresh vegetables but these foodstuffs are known only to those who work in town. It is impossible to induce a Masai to plant their own maize or vegetables near their huts.
This is essentially the same description Price gave during his visit. The Masai were not hunter-gatherers, but their traditional lifestyle was close enough to allow good occlusion. Here's why I think the Masai had good occlusion:
  • A nutrient-dense diet rich in protein and fat-soluble vitamins from pastured dairy.
  • On-demand breast feeding for two or more years.
  • No bottle feeding or modern pacifiers.
The one factor they lack is tough food. Their diet, composed mainly of milk and blood, is predominantly liquid. Although I think food toughness is a factor, this shows that good occlusion is not entirely dependent on tough food.

Sadly, the lifestyle and occlusion of the Masai has changed in the intervening decades. A paper from 1992 described their modern diet:
The main articles of diet were white maize, [presumably heavily sweetened] tea, milk, [white] rice, and beans. Traditional items were rarely eaten... Milk... was not mentioned by 30% of mothers.
A paper from 1993 described the occlusion of 235 young Masai attending rural and peri-urban schools. Nearly all showed some degree of malocclusion, with open bite alone affecting 18 percent.

Rural Caucasians in Kentucky

It's always difficult to find examples of Caucasian populations living traditional lifestyles, because most Caucasian populations adopted the industrial lifestyle long ago. That's why I was grateful to find a study by Dr. Robert S. Corruccini, published in 1981, titled "Occlusal Variation in a Rural Kentucky Community" (ref).

This study examined a group of isolated Caucasians living in the Mammoth Cave region of Kentucky, USA. Corruccini arrived during a time of transition between traditional and modern foodways. He describes the traditional lifestyle as follows:
Much of the traditional way of life of these people (all white) has been maintained, but two major changes have been the movement of industry and mechanized farming into the area in the last 25 years. Traditionally, tobacco (the only cash crop), gardens, and orchards were grown by each family. Apples, pears, cherries, plums, peaches, potatoes, corn, green beans, peas, squash, peppers, cucumbers, and onions were grown for consumption, and fruits and nuts, grapes, and teas were gathered by individuals. In the diet of these people, dried pork and fried [presumably in lard] thick-crust cornbread (which were important winter staples) provided consistently stressful chewing. Hunting is still very common in the area.
Although it isn't mentioned in the paper, this group, like nearly all traditionally-living populations, probably did not waste the organs or bones of the animals it ate. Altogether, it appears to be an excellent and varied diet, based on whole foods, and containing all the elements necessary for good occlusion and overall health.

The older generation of this population has the best occlusion of any Caucasian population I've ever seen, rivaling some hunter-gatherer groups. This shows that Caucasians are not genetically doomed to malocclusion. The younger generation, living on more modern foods, shows very poor occlusion, among the worst I've seen. They also show narrowed arches, a characteristic feature of deteriorating occlusion. One generation is all it takes. Corruccini found that a higher malocclusion score was associated with softer, more industrial foods.

Here are the reasons I believe this group of Caucasians in Kentucky had good occlusion:
  • A nutrient-rich, whole foods diet, presumably including organs.
  • Prolonged breast feeding.
  • No bottle-feeding or modern pacifiers.
  • Tough foods on a regular basis.
Common Ground

I hope you can see that populations with excellent teeth do certain things in common, and that straying from those principles puts the next generation at a high risk of malocclusion. Malocclusion is a serious problem that has major implications for health, well-being and finances. In the next post, I'll give a simplified summary of everything I've covered in this series. Then it's back to our regularly scheduled programming.

Malocclusion: Disease of Civilization, Part VII

Jaw Development During Adolescence

Beginning at about age 11, the skull undergoes a growth spurt. This corresponds roughly with the growth spurt in the rest of the body, with the precise timing depending on gender and other factors. Growth continues until about age 17, when the last skull sutures cease growing and slowly fuse. One of these sutures runs along the center of the maxillary arch (the arch in the upper jaw), and contributes to the widening of the upper arch*:

This growth process involves MGP and osteocalcin, both vitamin K-dependent proteins. At the end of adolescence, the jaws have reached their final size and shape, and should be large enough to accommodate all teeth without crowding. This includes the third molars, or wisdom teeth, which will erupt shortly after this period.

Reduced Food Toughness Correlates with Malocclusion in Humans

When Dr. Robert Corruccini published his seminal paper in 1984 documenting rapid changes in occlusion in cultures around the world adopting modern foodways and lifestyles (see this post), he presented the theory that occlusion is influenced by chewing stress. In other words, the jaws require good exercise on a regular basis during growth to develop normal-sized bones and muscles. Although Dr. Corruccini wasn't the first to come up with the idea, he has probably done more than anyone else to advance it over the years.

Dr. Corruccini's paper is based on years of research in transitioning cultures, much of which he conducted personally. In 1981, he published a study of a rural Kentucky community in the process of adopting the modern diet and lifestyle. Their traditional diet was predominantly dried pork, cornbread fried in lard, game meat and home-grown fruit, vegetables and nuts. The older generation, raised on traditional foods, had much better occlusion than the younger generation, which had transitioned to softer and less nutritious modern foods. Dr. Corruccini found that food toughness correlated with proper occlusion in this population.

In another study published in 1985, Dr. Corruccini studied rural and urban Bengali youths. After collecting a variety of diet and socioeconomic information, he found that food toughness was the single best predictor of occlusion. Individuals who ate the toughest food had the best teeth. The second strongest association was a history of thumb sucking, which was associated with a higher prevalence of malocclusion**. Interestingly, twice as many urban youths had a history of thumb sucking as rural youths.

Not only do hunter-gatherers eat tough foods on a regular basis, they also often use their jaws as tools. For example, the anthropologist and arctic explorer Vilhjalmur Stefansson described how the Inuit chewed their leather boots and jackets nearly every day to soften them or prepare them for sewing. This is reflected in the extreme tooth wear of traditional Inuit and other hunter-gatherers.

Soft Food Causes Malocclusion in Animals

Now we have a bunch of associations that may or may not represent a cause-effect relationship. However, Dr. Corruccini and others have shown in a variety of animal models that soft food can produce malocclusion, independent of nutrition.

The first study was conducted in 1951. Investigators fed rats typical dry chow pellets, or the same pellets that had been crushed and softened in water. Rats fed the softened food during growth developed narrow arches and small mandibles (lower jaws) relative to rats fed dry pellets.

Other research groups have since repeated the findings in rodents, pigs and several species of primates (squirrel monkeys, baboons, and macaques). Animals typically developed narrow arches, a central aspect of malocclusion in modern humans. Some of the primates fed soft foods showed other malocclusions highly reminiscent of modern humans as well, such as crowded incisors and impacted third molars. These traits are exceptionally rare in wild primates.

One criticism of these studies is that they used extremely soft foods that are softer than the typical modern diet. This is how science works: you go for the extreme effects first. Then, if you see something, you refine your experiments. One of the most refined experiments I've seen so far was published by Dr. Daniel E. Lieberman of Harvard's anthropology department. His group used the rock hyrax, an animal with a skull that bears some similarities to the human skull***.

Instead of feeding the animals hard food vs. mush, they fed them raw and dried food vs. cooked. This is closer to the situation in humans, where food is soft but still has some consistency. Hyraxes fed cooked food showed mild jaw underdevelopment reminiscent of modern humans. The underdeveloped areas were precisely those that received less strain during chewing.

Implications and Practical Considerations

Besides the direct implications for the developing jaws and face, I think this also suggests that physical stress may influence the development of other parts of the skeleton. Hunter-gatherers generally have thicker bones, larger joints, and more consistently well-developed shoulders and hips than modern humans. Physical stress is part of the human evolutionary template, and is probably critical for the normal development of the skeleton.

I think it's likely that food consistency influences occlusion in humans. In my opinion, it's a good idea to regularly include tough foods in a child's diet as soon as she is able to chew them properly and safely. This probably means waiting at least until the deciduous (baby) molars have erupted fully. Jerky, raw vegetables and fruit, tough cuts of meat, nuts, dry sausages, dried fruit, chicken bones and roasted corn are a few things that should stress the muscles and bones of the jaws and face enough to encourage normal development.


* These data represent many years of measurements collected by Dr. Arne Bjork, who used metallic implants in the maxilla to make precise measurements of arch growth over time in Danish youths. The graph is reproduced from the book A Synopsis of Craniofacial Growth, by Dr. Don M. Ranly. Data come from Dr. Bjork's findings published in the book Postnatal Growth and Development of the Maxillary Complex. You can see some of Dr. Bjork's data in the paper "Sutural Growth of the Upper Face Studied by the Implant Method" (free full text).


** I don't know if this was statistically significant at p less than 0.05. Dr. Corruccini uses a cutoff point of p less than 0.01 throughout the paper. He's a tough guy when it comes to statistics!

*** Retrognathic.

Malocclusion: Disease of Civilization, Part VI

Early Postnatal Face and Jaw Development

The face and jaws change more from birth to age four than at any other period of development after birth. At birth, infants have no teeth and their skull bones have not yet fused, allowing rapid growth. This period has a strong influence on the development of the jaws and face. The majority of malocclusions are established by the end of this stage of development. Birth is the point at which the infant begins using its jaws and facial musculature in earnest.

The development of the jaws and face is very plastic, particularly during this period. Genes do not determine the absolute size or shape of any body structure. Genes carry the blueprint for all structures, and influence their size and shape, but structures develop relative to one another and in response to the forces applied to them during growth. This is how orthodontists can change tooth alignment and occlusion by applying force to the teeth and jaws.

Influences on Early Postnatal Face and Jaw Development

In 1987, Miriam H. Labbok and colleagues published a subset of the results of the National Health Interview Survey in the American Journal of Preventive Medicine. Their article was provocatively titled "Does Breast-feeding Protect Against Malocclusion?" The study examined the occlusion of nearly 10,000 children and interviewed the parents to determine the duration of breastfeeding. Here's what they found:

The longer the infants were breastfed, the lower their likelihood of major malocclusion. The longest category was "greater than 12 months", in which the prevalence of malocclusion was less than half that of infants who were breastfed for three months or less. Hunter-gatherers and other non-industrial populations typically breastfeed for 2-4 years, but this is rare in affluent nations. Only two percent of the mothers in this study breastfed for longer than one year.

The prevalence and duration of breastfeeding have increased dramatically in the US since the 1970s, with the prevalence doubling between 1970 and 1980 (NHANES). The prevalence of malocclusion in the US has decreased somewhat in the last half-century, but is still very common (NHANES).

Several, but not all, studies have found that infants who were breastfed have a lower risk of malocclusion later in life (1, 2, 3). However, the association between non-nutritive sucking and malocclusion has been more consistent. Non-nutritive sucking (NNS) is when a child sucks on an object without getting calories from it. This includes pacifier sucking, which is strongly associated with malocclusion*, and finger sucking, which is associated to a lesser degree.

The longer a child engages in NNS, the higher his or her risk of malocclusion. The following graph is based on data from a study of nearly 700 children in Iowa (free full text). It charts the prevalence of three types of malocclusion (anterior open bite, posterior crossbite and excessive overjet) broken down by the duration of the NNS habit:

As you can see, there's a massive association. Children who sucked pacifiers or their fingers for more than four years had a 71 percent chance of having one of these three types of malocclusion, compared with 14 percent of children who sucked for less than a year. The association emerged after two years of NNS. Other studies have come to similar conclusions, including a 2006 literature review (1, 2, 3).

Bottle feeding, as opposed to direct breast feeding, is also associated with a higher risk of malocclusion (1, 2). One of the most important functions of breastfeeding may be to displace NNS and bottle feeding. Hunter-gatherers and other non-industrial cultures breastfeed their children on demand, typically for 2-4 years, in addition to giving them solid food.

In my opinion, it's likely that NNS beyond two years of age, and bottle feeding to a lesser extent, cause a large proportion of the malocclusions in modern societies. Pacifier use seems to be particularly problematic, and finger sucking to a lesser degree.

How Do Breastfeeding, Bottle Feeding and NNS Affect Occlusion?

Since the development of the jaws is influenced by the forces applied to them, it makes sense that the type of feeding during this period could have a major impact on occlusion. Children who have a prolonged pacifier habit are at high risk of open bite, a type of malocclusion in which the incisors don't come together when the jaws are closed. You can see a picture here. The teeth and jaws mold to the shape of the pacifier over time, because the growth patterns of bones respond to the forces applied to them. I suspect this is true of other parts of the skeleton as well.

Any force applied to the jaws that does not approximate the natural forces of breastfeeding, or of chewing and swallowing food, will put a child at risk of malocclusion during this period of his or her life. This includes NNS and bottle feeding. Pacifier sucking, finger sucking and bottle feeding promote patterns of muscular activity that result in weak jaw muscles and abnormal development of bony structures, whereas breastfeeding, chewing and swallowing strengthen jaw muscles and promote normal development (review article). This makes sense, because our species evolved in an environment where the breast and solid foods were the predominant objects that entered a child's mouth.

What Can We do About it?

In an ideal world (ideal for occlusion), mothers would breast feed on demand for 2-4 years, and introduce solid food about halfway through the first year, as our species has done since the beginning of time. For better or worse, we live in a different world than our ancestors, so this strategy will be difficult or impossible for many people. Are there any alternatives?

Parents like bottle feeding because it's convenient. Milk can be prepared in advance, the mother doesn't have to be present, feeding takes less time, and the parents can see exactly how much milk the child has consumed. One alternative to bottle feeding that's just as convenient is cup feeding. Cup feeding, as opposed to bottle feeding, promotes natural swallowing motions, which are important for correct development. The only study I found that examined the effect of cup feeding on occlusion found that cup-fed children developed fewer malocclusions and breathing problems than bottle-fed children.

Cup feeding has a long history of use, and several studies have found it to be safe and effective. It appears to be a good alternative to bottle feeding that should not require any more time or effort.

What about pacifiers? Parents know that pacifiers make babies easier to manage, so they will be reluctant to give them up. Certain pacifier designs may be more detrimental than others. I came across the abstract of a study evaluating an "orthodontic pacifier" called the Dentistar, made by Novatex. The pacifier has a spoon-like shape that allows normal tongue movement and exerts minimal pressure on the incisors. The frequency of malocclusion was much lower in children who used the Dentistar, or no pacifier at all, than in those who used a more conventional pacifier. The study was funded by Novatex, but was conducted at Heinrich Heine University in Düsseldorf, Germany**. There may be other brands with a similar design.

The ideal is to avoid bottle feeding and pacifiers entirely. However, cup feeding and orthodontic pacifiers appear to be acceptable alternatives that minimize the risk of malocclusion during this critical developmental window.


* Particularly anterior open bite and posterior crossbite.

** I have no connection whatsoever to this company. I think the results of the trial are probably valid, but should be replicated.

Malocclusion: Disease of Civilization, Part V

Prenatal Development of the Face and Jaws

The structures of the face and jaws take shape during the first trimester of pregnancy. The 5th to 11th weeks of pregnancy are particularly crucial for occlusion, because this is when the jaws, nasal septum and other cranial structures form. The nasal septum is the piece of cartilage that forms the structure of the nose and separates the two air passages as they enter the nostrils.


Maternal Nutritional Status Affects Fetal Development


Abnormal nutrient status can lead to several types of birth defects. Vitamin A is an essential signaling molecule during development; both deficiency and excess can cause birth defects, including defects of the cranium and nervous system. Folic acid deficiency causes birth defects of the brain and spine. Other nutrients, such as vitamin B12, may influence the risk of birth defects as well*.


The Role of Vitamin K


As early as the 1970s, physicians began noting characteristic developmental abnormalities in infants whose mothers took the blood-thinning drug warfarin (Coumadin) during the first trimester of pregnancy. These infants showed underdevelopment of the nasal septum and maxilla (upper jaw), small or absent sinuses, and a characteristic "dished" face. This eventually resulted in narrow dental arches, severe malocclusion and tooth crowding**. The whole spectrum was called Binder's syndrome, or warfarin embryopathy.

Warfarin works by inhibiting vitamin K recycling, thus depleting a nutrient necessary for normal blood clotting. It's now clear that Binder's syndrome can result from anything that interferes with vitamin K status during the first trimester of pregnancy. This includes warfarin, certain anti-epilepsy drugs, certain antibiotics, genetic mutations that interfere with vitamin K status, and celiac disease (intestinal damage due to gluten).

Why is vitamin K important for the development of the jaws and face of the fetus? Vitamin K is required to activate a protein called matrix gla protein (MGP), which prevents unwanted calcification of the nasal septum in the developing fetus (among other things). If this protein isn't activated by vitamin K during the critical developmental window, calcium deposits form in the nasal septum, stunting its growth and with it the growth of the maxilla and sinuses. Low MGP activity appears to be largely responsible for Binder's syndrome, since the syndrome can be caused by mutations in the MGP gene in humans. Small or absent sinuses are also common in the general population.

One of the interesting things about MGP is its apparent preference for vitamin K2 over vitamin K1. Vitamin K1 is found predominantly in green vegetables, and is sufficient to activate blood clotting factors and probably some other vitamin K-dependent proteins. "Vitamin K2" refers to a collection of molecules known as menaquinones, denoted "MK" followed by a number indicating the length of the side chain attached to the quinone ring.

Biologically important menaquinones are MK-4 through MK-12 or so. MK-4 is the form that animals synthesize from vitamin K1 for their own use. Certain organs (brain, pancreas, salivary gland, arteries) preferentially accumulate K2 MK-4, and certain cellular processes are also selective for it (MGP activation, PKA-dependent transcriptional effects). Vitamin K2 MK-4 is found almost exclusively in animal foods, particularly pastured butter, organ meats and eggs, and it is always present in foods designed to nourish growing animals, such as eggs and milk.

Humans have the ability to convert K1 to K2 when K1 is ingested in artificially large amounts. However, due to the limited absorption of normal dietary sources of K1 and the unknown conversion efficiency, it's unclear how much green vegetables contribute to K2 status. Serum vitamin K1 reaches a plateau at about 200 micrograms per day of dietary K1 intake, the equivalent of 1/4 cup of cooked spinach (see figure 1 of this paper). Still, I think eating green vegetables regularly is a good idea, and it may contribute to K2 status. Other menaquinones such as MK-7 (found in natto) may contribute to K2 status as well, but this question has not been resolved.

Severe vitamin K deficiency clearly impacts occlusion. Could more subtle deficiency lead to a less pronounced form of the same developmental syndrome? Here are a few facts about vitamin K relevant to this question:
  • In industrial societies, newborns are typically vitamin K deficient. This is reflected by the fact that in the US, nearly all newborns are given vitamin K1 at birth to prevent potentially fatal hemorrhage. In Japan, infants are given vitamin K2 MK-4, which is equally effective at preventing hemorrhage.
  • Fetuses generally have low vitamin K status, as measured by the activity of their clotting factors.
  • The human placenta transports vitamin K across the placental barrier and accumulates it. This transport mechanism is highly selective for vitamin K2 MK-4 over K1.
  • The concentration of K1 in maternal blood is much higher than its concentration in umbilical cord blood, whereas the concentration of K2 in maternal blood is similar to the concentration in cord blood. Vitamin K2 MK-7 is undetectable in cord blood, even when supplemented, suggesting that MK-7 is not an adequate substitute for MK-4 during pregnancy.
  • In rat experiments, arterial calcification due to warfarin was inhibited by vitamin K2 MK-4, but not vitamin K1. This is probably due to K2's ability to activate MGP, the same protein required for the normal development of the human face and jaws.
  • The human mammary gland appears to be the most capable organ at converting vitamin K1 to K2 MK-4.
Together, these observations suggest that in industrial societies, fetuses and infants tend to be vitamin K deficient, to the point of being susceptible to fatal hemorrhage. They also suggest that vitamin K2 MK-4 plays a critical role in fetal and early postnatal development. Could subclinical vitamin K2 deficiency be contributing to the high prevalence of malocclusion in modern societies?

An Ounce of Prevention


Vitamin A, folic acid, vitamin D and vitamin K2 are all nutrients with a long turnover time. Body stores of these nutrients depend on long-term intake. Thus, the nutritional status of the fetus during the first trimester reflects what the mother has been eating for several months before conception.

Dr. Weston Price noted that a number of the traditional societies he visited prepared women of childbearing age for healthy pregnancies by giving them special foods rich in fat-soluble vitamins. This allowed them to gestate and rear healthy, well-formed children. Nutrient-dense animal foods and green vegetables are a good idea before, during and after pregnancy.


* Liver is the richest source of vitamin A, folic acid and B12.


** Affected individuals may show class I, II, or III malocclusion.