Friday, December 25, 2009

Rabbits on a High-Saturated Fat Diet Without Added Cholesterol

I just saw another study that supports my previous post Animal Models of Atherosclerosis: LDL. The hypothesis is that in the absence of excessive added dietary cholesterol, saturated fat does not influence LDL or atherosclerosis in animal models, relative to other fats (although omega-6 polyunsaturated oils do lower LDL in some animal models). This appears to be consistent with what we see in humans.

In this study, they fed four groups of rabbits different diets:
  1. Regular low-fat rabbit chow
  2. Regular low-fat rabbit chow plus 0.5 g cholesterol per day
  3. High-fat diet with 30% calories as coconut oil (saturated) and no added cholesterol
  4. High-fat diet with 30% calories as sunflower oil (polyunsaturated) and no added cholesterol
LDL at 6 months was the same in groups 1, 3 and 4, but was increased more than 20-fold in group 2. It's not the fat, it's the fact that they're overloading herbivores with dietary cholesterol!

Total cholesterol was also the same between all groups except the cholesterol-fed group. TBARS, a measure of lipid oxidation in the blood, was elevated in the cholesterol and sunflower oil groups but not in the chow or coconut groups. Oxidation of blood lipids is one of the major factors in atherosclerosis, the vascular disease that narrows arteries and increases the risk of having a heart attack. Serum vitamin C was lower in the cholesterol-fed group but not the others.

This supports the idea that saturated fat does not inherently increase LDL, and in fact in most animals it does not. This appears to be the case in humans as well, where long-term trials have shown no difference in LDL between people eating more saturated fat and people eating less, on timescales of one year or more (some short trials show a modest LDL-raising effect, but even this appears to be due to an increase in particle size rather than particle number). Since these trials represent the average of many people, they may hide some individual variability: it may actually increase LDL in some people and decrease it in others.

Merry Christmas!

Tuesday, December 22, 2009

What's the Ideal Fasting Insulin Level?

Insulin is an important hormone. Its canonical function is to signal cells to absorb glucose from the bloodstream, but it has many other effects. Chronically elevated insulin is a marker of metabolic dysfunction, and typically accompanies high fat mass, poor glucose tolerance (prediabetes) and blood lipid abnormalities. Insulin measured first thing in the morning, before the first meal of the day, is called fasting insulin. High fasting insulin is a marker of metabolic problems and may contribute to some of them as well.

Elevated fasting insulin is a hallmark of the metabolic syndrome, the quintessential modern metabolic disorder that affects 24% of Americans (NHANES III). Dr. Lamarche and colleagues found that Canadians with a fasting insulin level of 13 uIU/mL had an 8-fold higher heart attack risk than those with a level of 9.3 uIU/mL (1; thanks to NephroPal for the reference). So right away, we can put our upper limit at 9.3 uIU/mL. The average insulin level in the U.S., according to the NHANES III survey, is 8.8 uIU/mL for men and 8.4 for women (2). Given the degree of metabolic dysfunction in this country, I think it's safe to say that the ideal level of fasting insulin is probably below 8.4 uIU/mL as well.

Let's dig deeper. What we really need is a healthy, non-industrial "negative control" group. Fortunately, Dr. Staffan Lindeberg and his team made detailed measurements of fasting insulin while they were visiting the isolated Melanesian island of Kitava (3). He compared his measurements to age-matched Swedish volunteers. In male and female Swedes, the average fasting insulin ranges from 4-11 uIU/mL, and increases with age. From age 60-74, the average insulin level is 7.3 uIU/mL.

In contrast, the range on Kitava is 3-6 uIU/mL, which does not increase with age. In the 60-74 age group, in both men and women, the average fasting insulin on Kitava is 3.5 uIU/mL. That's less than half the average level in Sweden and the U.S. Keep in mind that the Kitavans are lean and have an undetectable rate of heart attack and stroke.

Another example from the literature is the Shuar hunter-gatherers of the Amazon rainforest. Women in this group have an average fasting insulin concentration of 5.1 uIU/mL (4; no data were given for men).

I found a couple of studies from the early 1970s as well, indicating that African pygmies and San bushmen have rather high fasting insulin. Glucose tolerance was excellent in the pygmies and poor in the bushmen (5, 6, free full text). This may reflect differences in carbohydrate intake. San bushmen consume very little carbohydrate during certain seasons, and thus would likely have glucose intolerance during that period. There are three facts that make me doubt the insulin measurements in these older studies:
  1. It's hard to be sure that they didn't eat anything prior to the blood draw.
  2. From what I understand, insulin assays were variable and not standardized back then.
  3. In the San study, their fasting insulin was 1/3 lower than the Caucasian control group (10 vs. 15 uIU/mL). I doubt these active Caucasian researchers really had an average fasting insulin level of 15 uIU/mL. Both sets of measurements are probably too high.
Now you know the conflicting evidence, so you're free to be skeptical if you'd like.

We also have data from a controlled trial in healthy urban people eating a "paleolithic"-type diet. On a paleolithic diet designed to maintain body weight (calorie intake had to be increased substantially to prevent fat loss during the diet), fasting insulin dropped from an average of 7.2 to 2.9 uIU/mL in just 10 days. The variation in insulin level between individuals decreased 9-fold, and by the end, all participants were close to the average value of 2.9 uIU/mL. This shows that high fasting insulin is correctable in people who haven't yet been permanently damaged by the industrial diet and lifestyle. The study included men and women of European, African and Asian descent (7).
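
To put that drop in relative terms: (7.2 − 2.9) / 7.2 ≈ 0.60, meaning average fasting insulin fell by about 60 percent in only ten days.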

One final data point. My own fasting insulin, earlier this year, was 2.3 uIU/mL. I believe it reflects a good diet, regular exercise, sufficient sleep, a relatively healthy diet growing up, and the fact that I managed to come across the right information relatively young. It does not reflect: carbohydrate restriction, fat restriction, or saturated fat restriction. Neither does the low fasting insulin of healthy non-industrial cultures.

So what's the ideal fasting insulin level? My current feeling is that we can consider anything between 2 and 6 uIU/mL within our evolutionary template, although the lower half of that range may be preferable.
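
For readers who like things explicit, here's a minimal sketch in Python that buckets a fasting insulin reading according to the ranges discussed in this post. The cutoffs are my own reading of the numbers above (the non-industrial ranges, the US averages, and the Lamarche risk data), not any clinical standard.

    # Toy classifier for a fasting insulin reading (uIU/mL), using the
    # ranges discussed in this post. Cutoffs are illustrative only.
    def classify_fasting_insulin(insulin):
        if insulin < 2.0:
            return "below the non-industrial ranges cited here"
        elif insulin <= 6.0:
            return "within the 2-6 uIU/mL 'evolutionary template' range"
        elif insulin <= 9.3:
            return "around the US average; above non-industrial levels"
        else:
            return "elevated; the range Lamarche associated with higher heart attack risk"

    print(classify_fasting_insulin(3.5))  # Kitavan average, ages 60-74
    print(classify_fasting_insulin(8.8))  # US male average (NHANES III)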

Monday, December 14, 2009

The Dirty Little Secret of the Diet-Heart Hypothesis

The diet-heart hypothesis is the idea that saturated fat, and in some versions cholesterol, raises blood cholesterol and contributes to the risk of having a heart attack. To test this hypothesis, scientists have been studying the relationship between saturated fat consumption and heart attack risk for more than half a century. To judge by the grave pronouncements of our most visible experts, you would think these studies had found an association between the two. It turns out, they haven't.

The fact is, the vast majority of high-quality observational studies have found no connection whatsoever between saturated fat consumption and heart attack risk. The scientific literature contains dozens of these studies, so let's narrow the field to prospective studies only, because they are considered the most reliable. In this study design, investigators find a group of initially healthy people, record information about them (in this case what they eat), and watch who gets sick over the years.

A Sampling of Unsupportive Studies

Here are references to ten high-impact prospective studies, spanning half a century, showing no association between saturated fat consumption and heart attack risk. Ignore the saturated-to-polyunsaturated ratios, Keys/Hegsted scores, etc. What we're concerned with is the straightforward question: do people who eat more saturated fat have more heart attacks? Many of these papers allow free access to the full text, so have a look for yourselves if you want:

A Longitudinal Study of Coronary Heart Disease. Circulation. 1963.

Diet and Heart: a Postscript. British Medical Journal. 1977. Saturated fat was unrelated to heart attack risk, but fiber was protective.

Dietary Intake and the Risk of Coronary Heart Disease in Japanese Men Living in Hawaii. American Journal of Clinical Nutrition. 1978.

Relationship of Dietary Intake to Subsequent Coronary Heart Disease Incidence: the Puerto Rico Heart Health Program. American Journal of Clinical Nutrition. 1980.

Diet, Serum Cholesterol, and Death From Coronary Heart Disease: The Western Electric Study. New England Journal of Medicine. 1981.

Diet and 20-year Mortality in Two Rural Population Groups of Middle-Aged Men in Italy. American Journal of Clinical Nutrition. 1989. Men who died of CHD ate significantly less saturated fat than men who didn't.

Diet and Incident Ischaemic Heart Disease: the Caerphilly Study. British Journal of Nutrition. 1993. They measured animal fat intake rather than saturated fat in this study.

Dietary Fat and Risk of Coronary Heart Disease in Men: Cohort Follow-up Study in the United States. British Medical Journal. 1996. This is the massive Health Professionals Follow-up Study. Don't let the abstract fool you! Scroll down to table 2 and see for yourself that the association between saturated fat intake and heart attack risk disappears after adjustment for several factors including family history of heart attack, smoking and fiber intake. That's because, as in most modern studies, people who eat steak are also more likely to smoke, avoid vegetables, eat fast food, etc.

Dietary Fat Intake and the Risk of Coronary Heart Disease in Women. New England Journal of Medicine. 1997. From the massive Nurses' Health Study. This one fooled me for a long time because the abstract is misleading. It claims that saturated fat was associated with heart attack risk. However, the association disappeared without a trace when they adjusted for monounsaturated and polyunsaturated fat intake. Have a look at table 3.

Dietary Fat Intake and Early Mortality Patterns-- Data from the Malmo Diet and Cancer Study. Journal of Internal Medicine. 2005.

I just listed 10 prospective studies published in top peer-reviewed journals that found no association between saturated fat and heart disease risk. These ten are less than half of the prospective studies that have come to the same conclusion, and such null results represent by far the majority of studies to date. If saturated fat is anywhere near as harmful as we're told, why are its effects essentially undetectable in the best studies we can muster?

Studies that Support the Diet-Heart Hypothesis

To be fair, there have been a few that have found an association between saturated fat consumption and heart attack risk. Here's a list of all four that I'm aware of, with comments:

Ten-year Incidence of Coronary Heart Disease in the Honolulu Heart Program: relationship to nutrient intake. American Journal of Epidemiology. 1984. "Men who developed coronary heart disease also had a higher mean intake of percentage of calories from protein, fat, saturated fatty acids, and polyunsaturated fatty acids than men who remained free of coronary heart disease." The difference in saturated fat intake between people who had heart attacks and those who didn't, although statistically significant, was minuscule.

Diet and 20-Year Mortality From Coronary Heart Disease: the Ireland-Boston Diet-Heart Study. New England Journal of Medicine. 1985. "Overall, these results tend to support the hypothesis that diet is related, albeit weakly, to the development of coronary heart disease."

Relationship Between Dietary Intake and Coronary Heart Disease Mortality: Lipid Research Clinics Prevalence Follow-up Study. Journal of Clinical Epidemiology. 1996. "...increasing percentages of energy intake as total fat (RR 1.04, 95% CI = 1.01 – 1.08), saturated fat (RR 1.11, CI = 1.04 – 1.18), and monounsaturated fat (RR 1.08, CI = 1.01 – 1.16) were significant risk factors for CHD mortality among 30 to 59 year olds... None of the dietary components were significantly associated with CHD mortality among those aged 60–79 years." Note that the associations were very small, also included monounsaturated fat (like in olive oil), and only applied to the age group with the lower risk of heart attack.

The Combination of High Fruit and Vegetable and Low Saturated Fat Intakes is More Protective Against Mortality in Aging Men than is Either Alone. Journal of Nutrition. 2005. Higher saturated fat intake was associated with a higher risk of heart attack; fiber was strongly protective.

The Review Papers

More than 25 high-quality studies have been conducted, and only four support the diet-heart hypothesis. If this substance is truly so fearsome, why don't people who eat more of it have more heart attacks? In case you're concerned that I'm cherry-picking studies that conform to my beliefs, here are links to review papers on the same data that have reached the same conclusion:

The Questionable Role of Saturated and Polyunsaturated Fatty Acids in Cardiovascular Disease. Journal of Clinical Epidemiology. 1998. Dr. Uffe Ravnskov challenges the diet-heart hypothesis simply by collecting all the relevant studies and summarizing their findings.

A Systematic Review of the Evidence Supporting a Causal Link Between Dietary Factors and Coronary Heart Disease. Archives of Internal Medicine. 2009. "Insufficient evidence (less than or equal to 2 criteria) of association is present for intake of supplementary vitamin E and ascorbic acid (vitamin C); saturated and polyunsaturated fatty acids; total fat; alpha-linolenic acid; meat; eggs; and milk" They analyzed prospective studies representing over 160,000 patients from 11 studies meeting their rigorous inclusion criteria, and found no association between saturated fat consumption and heart attack risk.

Where's the Disconnect?

The first part of the diet-heart hypothesis states that dietary saturated fat raises the cholesterol/LDL concentration of the blood. The second part states that increased blood cholesterol/LDL increases the risk of having a heart attack. What part of this is incorrect?

There's definitely an association between blood cholesterol/LDL level and heart attack risk in certain populations, including Americans. MRFIT, among other studies, showed this definitively, although the lowest risk of all-cause mortality was at an average level of cholesterol.

So we're left with the first premise: that saturated fat increases blood cholesterol/LDL. This may be a short-term effect, and it isn't necessarily true in animal models of heart disease if you exclude those that use large doses of dietary cholesterol. In the 1950s, Dr. Ancel Keys created a formula designed to predict changes in blood cholesterol based on the consumption of dietary saturated and polyunsaturated fats. However, it has shown limited predictive value in long-term diet modification trials such as MRFIT and the Women's Health Initiative.
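
For reference, the Keys equation is commonly cited in roughly this form (the exact coefficients vary a bit between his papers, so treat the constants as approximate):

    ΔChol ≈ 1.35 (2ΔS − ΔP) + 1.5 ΔZ

where ΔChol is the predicted change in serum cholesterol (mg/dL), ΔS and ΔP are the changes in percent of calories from saturated and polyunsaturated fat, and Z is the square root of dietary cholesterol in mg per 1000 kcal. Note the structure: saturated fat is weighted twice as heavily as polyunsaturated fat, in the opposite direction. That structure fits short-term feeding trials reasonably well, which is exactly why its poor performance in long-term trials like MRFIT and the Women's Health Initiative is telling.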


Wednesday, December 2, 2009

Malocclusion: Disease of Civilization, Part IX

A Summary

For those who didn't want to wade through the entire nerd safari, I offer a simple summary.

Our ancestors had straight teeth, and their wisdom teeth came in without any problem. The same continues to be true of a few non-industrial cultures today, but it's becoming rare. Wild animals also rarely suffer from orthodontic problems.

Today, the majority of people in the US and other affluent nations have some type of malocclusion, whether it's crooked teeth, overbite, open bite or a number of other possibilities.

There are three main factors that I believe contribute to malocclusion in modern societies:
  1. Maternal nutrition during the first trimester of pregnancy. Vitamin K2, found in organs, pastured dairy and eggs, is particularly important. We may also make small amounts from the K1 found in green vegetables.
  2. Sucking habits from birth to age four. Breast feeding protects against malocclusion. Bottle feeding, pacifiers and finger sucking probably increase the risk of malocclusion. Cup feeding and orthodontic pacifiers are probably acceptable alternatives.
  3. Food toughness. The jaws probably require stress from tough food to develop correctly. This can contribute to the widening of the dental arch until roughly age 17. Beef jerky, raw vegetables, raw fruit, tough cuts of meat and nuts are all good ways to exercise the jaws.
And now, an example from the dental literature to motivate you. In 1976, Dr. H. L. Eirew published an interesting paper in the British Dental Journal. He took two 12-year-old identical twins, with identical class I malocclusions (crowded incisors), and gave them two different orthodontic treatments. Here's a picture of both girls before the treatment:


In one, he made more space in her jaws by extracting teeth. In the other, he put in an apparatus that broadened her dental arch, which roughly mimics the natural process of arch growth during childhood and adolescence. This had profound effects on the girls' subsequent occlusion and facial structure:

The girl on the left had teeth extracted, while the girl on the right had her arch broadened. Under ideal circumstances, this is what should happen naturally during development. Notice any differences?

Thanks to the Weston A. Price Foundation's recent newsletter for the study reference.

Tuesday, November 17, 2009

Malocclusion: Disease of Civilization, Part VI

Early Postnatal Face and Jaw Development

The face and jaws change more from birth to age four than during any later period of development. At birth, infants have no teeth and their skull bones have not yet fused, allowing rapid growth. This period has a strong influence on the development of the jaws and face. The majority of malocclusions are established by the end of this stage of development. Birth is the point at which the infant begins using its jaws and facial musculature in earnest.

The development of the jaws and face is very plastic, particularly during this period. Genes do not determine the absolute size or shape of any body structure. Genes carry the blueprint for all structures, and influence their size and shape, but structures develop relative to one another and in response to the forces applied to them during growth. This is how orthodontists can change tooth alignment and occlusion by applying force to the teeth and jaws.

Influences on Early Postnatal Face and Jaw Development

In 1987, Miriam H. Labbok and colleagues published a subset of the results of the National Health Interview Survey in the American Journal of Preventive Medicine. Their article was provocatively titled "Does Breast-feeding Protect Against Malocclusion?" The study examined the occlusion of nearly 10,000 children, and interviewed the parents to determine the duration of breast feeding. Here's what they found:

The longer the infants were breastfed, the lower their likelihood of major malocclusion. The longest category was "greater than 12 months", in which the prevalence of malocclusion was less than half that of infants who were breastfed for three months or less. Hunter-gatherers and other non-industrial populations typically breastfeed for 2-4 years, but this is rare in affluent nations. Only two percent of the mothers in this study breastfed for longer than one year.

The prevalence and duration of breastfeeding have increased dramatically in the US since the 1970s, with the prevalence doubling between 1970 and 1980 (NHANES). The prevalence of malocclusion in the US has decreased somewhat in the last half-century, but is still very common (NHANES).

Several, but not all, studies have found that infants who were breastfed have a smaller risk of malocclusion later in life (1, 2, 3). However, what has been more consistent is the association between non-nutritive sucking and malocclusion. Non-nutritive sucking (NNS) is when a child sucks on an object without getting calories from it. This includes pacifier sucking, which is strongly associated with malocclusion*, and finger sucking, which is also associated to a lesser degree.

The longer a child engages in NNS, the higher his or her risk of malocclusion. The following graph is based on data from a study of nearly 700 children in Iowa (free full text). It charts the prevalence of three types of malocclusion (anterior open bite, posterior crossbite and excessive overjet) broken down by the duration of the NNS habit:

As you can see, there's a massive association. Children who sucked pacifiers or their fingers for more than four years had a 71 percent chance of having one of these three specific types of malocclusion, compared with 14 percent of children who sucked for less than a year. The association between NNS and malocclusion appeared after two years of NNS. Other studies have come to similar conclusions, including a 2006 literature review (1, 2, 3).
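
In ratio terms, those 71 percent and 14 percent figures correspond to a prevalence ratio of roughly 0.71 / 0.14 ≈ 5: children with a four-plus-year sucking habit were about five times as likely to have one of these three malocclusions as children who sucked for less than a year.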

Bottle feeding, as opposed to direct breast feeding, is also associated with a higher risk of malocclusion (1, 2). One of the most important functions of breast feeding may be to displace NNS and bottle feeding. Hunter-gatherers and non-industrial cultures breast fed their children on demand, typically for 2-4 years, in addition to giving them solid food.

In my opinion, it's likely that NNS beyond two years of age, and bottle feeding to a lesser extent, cause a large proportion of the malocclusions in modern societies. Pacifier use seems to be particularly problematic, and finger sucking to a lesser degree.

How Do Breastfeeding, Bottle Feeding and NNS Affect Occlusion?

Since the development of the jaws is influenced by the forces applied to them, it makes sense that the type of feeding during this period could have a major impact on occlusion. Children who have a prolonged pacifier habit are at high risk for open bite, a type of malocclusion in which the incisors don't come together when the jaws are closed. You can see a picture here. The teeth and jaws mold to the shape of the pacifier over time. This is because the growth patterns of bones respond to the forces that are applied to them. I suspect this is true for other parts of the skeleton as well.

Any force applied to the jaws that does not approximate the natural forces of breastfeeding, or of chewing and swallowing food, will put a child at risk of malocclusion during this period of his or her life. This includes NNS and bottle feeding. Pacifier sucking, finger sucking and bottle feeding promote patterns of muscular activity that result in weak jaw muscles and abnormal development of bony structures, whereas breastfeeding, chewing and swallowing strengthen jaw muscles and promote normal development (review article). This makes sense, because our species evolved in an environment where the breast and solid foods were the predominant objects that entered a child's mouth.

What Can We do About it?

In an ideal world (ideal for occlusion), mothers would breast feed on demand for 2-4 years, and introduce solid food about halfway through the first year, as our species has done since the beginning of time. For better or worse, we live in a different world than our ancestors, so this strategy will be difficult or impossible for many people. Are there any alternatives?

Parents like bottle feeding because it's convenient. Milk can be prepared in advance, the mother doesn't have to be present, feeding takes less time, and the parents can see exactly how much milk the child has consumed. One alternative to bottle feeding that's just as convenient is cup feeding. Cup feeding, as opposed to bottle feeding, promotes natural swallowing motions, which are important for correct development. The only study I found that examined the effect of cup feeding on occlusion found that cup-fed children developed fewer malocclusions and breathing problems than bottle-fed children.

Cup feeding has a long history of use. Several studies have found it to be safe and effective. It appears to be a good alternative to bottle feeding that should not require any more time or effort.

What about pacifiers? Parents know that pacifiers make babies easier to manage, so they will be reluctant to give them up. Certain pacifier designs may be more detrimental than others. I came across the abstract of a study evaluating an "orthodontic pacifier" called the Dentistar, made by Novatex. The frequency of malocclusion was much lower in children who did not use a pacifier or used the Dentistar than in those who used a more conventional pacifier. This study was funded by Novatex, but was conducted at Heinrich Heine University in Düsseldorf, Germany**. The pacifier has a spoon-like shape that allows normal tongue movement and exerts minimal pressure on the incisors. There may be other brands with a similar design.

The ideal is to avoid bottle feeding and pacifiers entirely. However, cup feeding and orthodontic pacifiers appear to be acceptable alternatives that minimize the risk of malocclusion during this critical developmental window.


* Particularly anterior open bite and posterior crossbite.

** I have no connection whatsoever to this company. I think the results of the trial are probably valid, but should be replicated.

Tuesday, November 10, 2009

Malocclusion: Disease of Civilization, Part V

Prenatal Development of the Face and Jaws

The structures of the face and jaws take shape during the first trimester of pregnancy. The 5th to 11th weeks of pregnancy are particularly crucial for occlusion, because this is when the jaws, nasal septum and other cranial structures form. The nasal septum is the piece of cartilage that forms the structure of the nose and separates the two air passages as they enter the nostrils.


Maternal Nutritional Status Affects Fetal Development


Abnormal nutrient status can lead to several types of birth defects. Vitamin A is an essential signaling molecule during development; both deficiency and excess can cause birth defects, predominantly affecting the cranium and the nervous system, respectively. Folic acid deficiency causes birth defects of the brain and spine. Other nutrients such as vitamin B12 may influence the risk of birth defects as well*.


The Role of Vitamin K


As early as the 1970s, physicians began noting characteristic developmental abnormalities in infants whose mothers took the blood-thinning drug warfarin (coumadin) during the first trimester of pregnancy. These infants showed underdevelopment of the nasal septum and maxilla (upper jaw), small or absent sinuses, and a characteristic "dished" face. This eventually resulted in narrow dental arches, severe malocclusion and tooth crowding**. The whole spectrum was called Binder's syndrome, or warfarin embryopathy.

Warfarin works by inhibiting vitamin K recycling, thus depleting a nutrient necessary for normal blood clotting.
It's now clear that Binder's syndrome can result from anything that interferes with vitamin K status during the first trimester of pregnancy. This includes warfarin, certain anti-epilepsy drugs, certain antibiotics, genetic mutations that interfere with vitamin K status, and celiac disease (intestinal damage due to gluten).

Why is vitamin K important for the development of the jaws and face of the fetus? Vitamin K is required to activate a protein called matrix gla protein (MGP), which prevents unwanted calcification of the nasal septum in the developing fetus (among other things). If this protein isn't activated by vitamin K during the critical developmental window, calcium deposits form in the nasal septum, stunting its growth and also stunting the growth of the maxilla and sinuses. Low activity of MGP appears to be largely responsible for Binder's syndrome, since the syndrome can be caused by genetic mutations in MGP in humans. Interestingly, small or absent sinuses are common in the general population, which hints that milder versions of this process may be widespread.

One of the interesting things about MGP is its apparent preference for vitamin K2 over vitamin K1.
Vitamin K1 is found predominantly in green vegetables, and is sufficient to activate blood clotting factors and probably some other vitamin K-dependent proteins. "Vitamin K2" refers to a collection of molecules known as menaquinones. These are denoted as "MK", followed by a number indicating the length of the side chain attached to the quinone ring.

Biologically important menaquinones are MK-4 through MK-12 or so. MK-4 is the form that animals synthesize from vitamin K1 for their own use. Certain organs (brain, pancreas, salivary gland, arteries) preferentially accumulate K2 MK-4, and certain cellular processes are also selective for K2 MK-4 (MGP activation, PKA-dependent transcriptional effects). Vitamin K2 MK-4 is found almost exclusively in animal foods, particularly pastured butter, organs and eggs. It is always found in foods designed to nourish growing animals, such as eggs and milk.

Humans have the ability to convert K1 to K2 when K1 is ingested in artificially large amounts. However, due to the limited absorption of normal dietary sources of K1 and the unknown conversion efficiency, it's unclear how much green vegetables contribute to K2 status. Serum vitamin K1 reaches a plateau at about 200 micrograms per day of dietary K1 intake, the equivalent of 1/4 cup of cooked spinach (see figure 1 of this paper). Still, I think eating green vegetables regularly is a good idea, and may contribute to K2 status.
Other menaquinones such as MK-7 (found in natto) may contribute to K2 status as well, but this question has not been resolved.

Severe vitamin K deficiency clearly impacts occlusion. Could more subtle deficiency lead to a less pronounced form of the same developmental syndrome? Here are a few facts about vitamin K relevant to this question:
  • In industrial societies, newborns are typically vitamin K deficient. This is reflected by the fact that in the US, nearly all newborns are given vitamin K1 at birth to prevent potentially fatal hemorrhage. In Japan, infants are given vitamin K2 MK-4, which is equally effective at preventing hemorrhage.
  • Fetuses generally have low vitamin K status, as measured by the activity of their clotting factors.
  • The human placenta transports vitamin K across the placental barrier and accumulates it. This transport mechanism is highly selective for vitamin K2 MK-4 over K1.
  • The concentration of K1 in maternal blood is much higher than its concentration in umbilical cord blood, whereas the concentration of K2 in maternal blood is similar to the concentration in cord blood. Vitamin K2 MK-7 is undetectable in cord blood, even when supplemented, suggesting that MK-7 is not an adequate substitute for MK-4 during pregnancy.
  • In rat experiments, arterial calcification due to warfarin was inhibited by vitamin K2 MK-4, but not vitamin K1. This is probably due to K2's ability to activate MGP, the same protein required for the normal development of the human face and jaws.
  • The human mammary gland appears to be the most capable organ at converting vitamin K1 to K2 MK-4.
Together, this suggests that in industrial societies, fetuses and infants are vitamin K deficient, to the point of being susceptible to fatal hemorrhage. It also suggests that vitamin K2 MK-4 plays a critical role in fetal and early postnatal development. Could subclinical vitamin K2 deficiency be contributing to the high prevalence of malocclusion in modern societies?

An Ounce of Prevention


Vitamin A, folic acid, vitamin D and vitamin K2 are all nutrients with a long turnover time. Body stores of these nutrients depend on long-term intake. Thus, the nutritional status of the fetus during the first trimester reflects what the mother has been eating for several months before conception.

Dr. Weston Price noted that a number of the traditional societies he visited prepared women of childbearing age for healthy pregnancies by giving them special foods rich in fat-soluble vitamins. This allowed them to gestate and rear healthy, well-formed children.
Nutrient-dense animal foods and green vegetables are a good idea before, during and after pregnancy.


* Liver is the richest source of vitamin A, folic acid and B12.


** Affected individuals may show class I, II, or III malocclusion.

Friday, November 6, 2009

Omega-3 Eggs

Eggs are an exceptionally nutritious food, as are all foods destined to nourish a growing animal. However, one concern lies in eggs' high concentration of arachidonic acid (AA), a long-chain omega-6 fat that is the precursor to many eicosanoids. Omega-6 derived eicosanoids are essential molecules that are involved in healing, development and defense. Some of them are inflammatory mediators that can contribute to disease when present in excess. Eggs are one of the main sources of AA in the modern diet.

The percentage of long-chain omega-6 fats (including AA) in red blood cell membranes correlates quite well with heart attack risk. You can see the relationship in this graph compiled by Dr. Bill Lands. However, egg consumption has never been convincingly linked to heart attack risk or any other disorder I'm aware of, despite dire warnings about eggs' cholesterol content. Nevertheless, conventionally raised eggs are unnaturally rich in AA, and unnaturally low in omega-3, due to the hens' diet of grains and soybeans.

The ideal egg is one that comes from a hen raised outdoors (often on pasture), in a place where she can eat a variety of green plants and insects. Hens raised this way typically still eat grain-based feed, but supplemented with a significant amount of foraged food. This dramatically increases the nutritional value of the eggs, as I've noted before. Modern hens lay nearly one egg a day, a rate of production that cannot be sustained without a large amount of calorie-dense food. They can't eat enough to lay at this rate by foraging.

Not everyone has access to pastured eggs. "Omega-3 eggs" come from hens fed an omega-3 enriched diet*. Not only do they have a much higher omega-3 content than conventional eggs, they also contain less AA.
One study found that omega-3 eggs contain 39% less AA than conventional and organic eggs. Omega-3 eggs were also rich in short- and long-chain omega-3 fats. Omega-3 eggs are certainly not nutritionally equivalent to pastured eggs, but they're a step in the right direction.

I don't really know if the AA content of eggs is a concern. Eicosanoid biology is complex and it doesn't like to fit into simple models. I'll look forward to seeing more research on the matter. In the meantime, I'll be eating pastured eggs, and when they're not available I'll eat omega-3 eggs.


*Typically from flax seeds, but some operations also use seaweed. The hens in the paper I cited were fed flax, and they managed to convert a substantial portion of the alpha-linolenic acid into the important animal fat DHA, and presumably EPA, although it was not measured.

Tuesday, November 3, 2009

Impressions of Hawai'i

I recently went to Hawai'i for the American Society of Human Genetics meeting in Waikiki, followed by a one-week vacation on Kaua'i with friends. It was my first time in Hawai'i and I really enjoyed it. The Hawai'ians I encountered were kind and generous people.

Early European explorers remarked on the beauty, strength, good nature and excellent physical development of the native Hawai'ians. The traditional Hawai'ian diet consisted mostly of taro root, sweet potatoes, yams, breadfruit, coconut, fish, occasional pork, fowl including chicken, taro leaves, seaweed and a few sweet fruits. It would have been very low (but adequate) in omega-6, because there simply isn't much of it available in this environment. Root crops and most fruit are virtually devoid of fat; seafood and coconut contain very little omega-6; and even the pork and chicken would have been low in omega-6 due to their diets. Omega-3 would have been plentiful from marine foods, and saturated fats would have come from coconut. All foods were fresh and unrefined. Abundant exercise and sunlight would have completed their salubrious lifestyle.

The traditional Hawai'ian diet was rich in easily digested starch, mainly in the form of poi, which is fermented mashed taro. I ate poi a number of times while I was on Kaua'i, and really liked it. It's mild, similar to mashed potatoes, but with a slightly sticky consistency and a purple color (due to the particular variety of taro that's traditionally used to make it).

I had the opportunity to try a number of traditional Polynesian foods while I was on Kaua'i. One plant that particularly impressed me is breadfruit. It's a big tree that makes cantaloupe-sized starchy green fruit. Breadfruit is incredibly versatile, because it can be used at different stages of ripeness for different purposes. Very young, it's like a vegetable, at full size, it's a bland starch, and fully ripe it's starchy and sweet like a sweet potato. It can be baked, boiled, fried and even dried for later use. It has a mild flavor and a texture similar to soft white bread. It's satisfying and fairly rich in micronutrients. On the right are breadfruit, coconut and sugarcane, three traditional Hawai'ian foods.

I find perennial staple crops such as breadfruit very interesting, because they're much less destructive to soil quality than annual crops, and they're a breeze to maintain. I could walk into the backyard of the apartment I was renting and pick a breadfruit, soak it, throw it in the oven and I had something nutritious to eat in just over an hour. It's like picking a bag of potatoes right off a tree. Insects and birds didn't seem to like it at all, possibly because the raw fruit exudes a bitter, rubbery sap when damaged. Unfortunately, breadfruit is a tropical plant. Temperate starchy staples that were exploited by native North Americans include the majestic American chestnut in the Appalachians, and acorns in the West. These are both more work than breadfruit to prepare, particularly acorns which must be extensively soaked to remove bitter tannins.

One of the foods Polynesian settlers brought to Hawai'i was sugar cane. I had the opportunity to try fresh sugar cane for the first time while I was on Kaua'i. You cut off the outer skin, then cut it into strips and chew to get the sweet juice. It was mild but tasty. I don't know if it was a coincidence or not, but I ended up feeling unwell after eating several pieces. It may simply have been too much sugar for me.

Modern Hawai'i is a hunter-gatherer's dream. There are fruit trees everywhere, including papayas, wild and cultivated guavas, mangoes, avocados, passion fruit, breadfruit, bananas, citrus fruits and many others. Many of those fruits did not predate European contact however. Even pineapples were introduced to Hawai'i after European contact. Coconuts are everywhere, and we could pick one up for a drink and snack on almost any beach. The forests are full of wild chickens (such as the one at left) and pigs, both having resulted from the escape and subsequent mixing of Polynesian and European breeds. Kaua'ians frequently hunt the pigs, which are environmentally damaging due to their habit of rooting through topsoil for food. Large areas of forest on Kaua'i look like they've been ploughed due to the pigs' rooting. Humans are their only predators and their food is abundant.

While I was on Kaua'i, I ate mostly seafood (including delicious raw tuna poke), poi, breadfruit, coconut and sweet fruits-- a real Polynesian style hunter-gatherer diet! I swam every day, hiked in the lovely interior, and kayaked. It was a great trip, and I hope to return someday.

Tuesday, October 27, 2009

Heart Attack Risk Reduction: The Low-Hanging Fruit

Dr. Yongsoon Park and colleagues recently published a great article in the British Journal of Nutrition titled "Erythrocyte fatty acid profiles can predict acute non-fatal myocardial infarction". Stated simply, the title says that the fat in your red blood cell membranes, which reflects dietary fat composition, can predict your likelihood of having a heart attack*-- more accurately, in fact, than standard measures of heart attack risk such as blood cholesterol.

Let's cut to the data. The investigators examined the fat composition of red blood cells in people who had suffered a heart attack, versus an equal number who had not. Participants who had heart attacks had less omega-3, more long-chain omega-6, and particularly higher trans fat in their red blood cells. In fact, 96% of the heart attack patients had elevated trans fat levels, compared to 34% of those without heart attacks. This is consistent with a number of other studies showing a strong association between blood levels of trans fat and heart attack risk (ref).

92% of heart attack patients were in the lowest category of EPA in their red blood cells, as opposed to 32% of those without heart attacks. EPA is an omega-3 fat that comes from fish, and is also made by the body if there's enough omega-3 alpha-linolenic acid (think flax and greens) around and not too much linoleic acid (industrial vegetable oil) to inhibit its production. 96% of heart attack patients were in the lowest category for alpha-linolenic acid, compared to 34% of the comparison group. 0% of the heart attack patients were in the highest category for alpha-linolenic acid.

62% of heart attack patients were in the highest category of arachidonic acid (AA), compared to 34% of the comparison group. AA is made from linoleic acid, and is also found in animal foods such as eggs and liver. Animal foods from pasture-raised animals are lower in AA than their conventionally-raised counterparts, and also contain more omega-3 fats to balance it.

The investigators found that low omega-3, high AA and high trans fats in red blood cells associate with heart attack risk far better than the Framingham risk score, a traditional and widely-used measure that incorporates age, sex, smoking status, total cholesterol, HDL, hypertension and diabetes.
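
To make the idea of a membrane-based marker concrete, here's a minimal sketch computing the "omega-3 index" (EPA + DHA as a percentage of total red blood cell fatty acids), a published marker in the same spirit. This is an illustration only, not the scoring model from Dr. Park's paper, and the profiles below are hypothetical numbers.

    # Minimal sketch: the "omega-3 index" is EPA + DHA as a percent of
    # total RBC membrane fatty acids. Profiles below are hypothetical.
    def omega3_index(profile):
        return profile.get("EPA", 0.0) + profile.get("DHA", 0.0)

    typical_western = {"EPA": 0.5, "DHA": 2.5, "AA": 17.0, "trans": 1.5}
    regular_fish_eater = {"EPA": 2.0, "DHA": 6.0, "AA": 12.0, "trans": 0.3}

    for name, profile in [("typical western", typical_western),
                          ("regular fish eater", regular_fish_eater)]:
        print(name, omega3_index(profile))  # percent of total fatty acids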

If the associations in this study represent cause-and-effect, which I believe they do based on their consistency with other observational studies and controlled trials, they imply that we can have a very powerful effect on heart attack risk by taking a few simple steps:
  1. Avoid trans fat. It's found in margarine, shortening, refined soy and canola oils, many deep fried foods and processed foods in general.
  2. Avoid industrial vegetable oils and other sources of excess omega-6. Eating pastured or omega-3 eggs, rather than conventional eggs, can help reduce dietary AA as well.
  3. Ensure a regular intake of omega-3 fats from seafood, or small doses of high-vitamin cod liver oil or fish oil. Flax oil is also helpful, but it's an inferior substitute for fish oil.
This study was conducted in Korea. It's a striking confirmation that basic nutritional principles span races and cultures, likely affecting disease risk in all humans.

In the future, I hope that most doctors will measure blood fatty acids to predict heart attack risk, with more success than current approaches. Instead of measuring cholesterol and prescribing a statin drug, doctors will prescribe fish oil and easy-to-follow diet advice**. Fortunately, some doctors are beginning to measure red blood cell fatty acid levels in their patients. The forward-thinking cardiologist Dr. William Davis has discussed this on his blog here. Take a good look at the graphs he posted if you get the chance.


*The title of the study is misleading because it implies a prospective design, in which blood fatty acids would be measured and volunteers followed to see who develops heart disease at a later time point. This study was a case-control design, meaning they found people who had just had a heart attack and measured their blood fatty acids retrospectively. The other study I referenced above was prospective, which is a nice confirmation of the principle.

**"Eat butter on your toast. Ditch the margarine."

Wednesday, October 21, 2009

Butter vs. Margarine Showdown

I came across a gem of a study the other day, courtesy of Dr. John Briffa's blog. It's titled "Margarine Intake and Subsequent Coronary Heart Disease in Men", by Dr. William P. Castelli's group. It followed participants of the Framingham Heart study for 20 years, and recorded heart attack incidence*. Keep in mind that 20 years is an unusually long follow-up period.

The really cool thing about this study is they also tracked butter consumption. So it's really a no-holds-barred showdown between the two fats. Here's a graph of the overall results, by teaspoons of butter or margarine eaten per day:

Heart attack incidence increased with increasing margarine consumption (statistically significant) and decreased slightly with increasing butter consumption (not statistically significant). That must have been a bitter pill for Castelli to swallow!

It gets better. Let's have a look at some of the participant characteristics, broken down by margarine consumption:

People who ate the least margarine had the highest prevalence of glucose intolerance (pre-diabetes), smoked the most cigarettes, drank the most alcohol, and ate the most saturated fat and butter. These were the people who cared the least about their health. Yet they had the fewest heart attacks. Imagine that. The investigators corrected for the factors listed above in their assessment of the contribution of margarine to disease risk; however, the fact remains that the group eating the least margarine was the least health conscious. This affects disease risk in many ways, measurable or not. I've written about that before, here and here.

Can this study get any better? Yes it can. The investigators broke down the data into two halves: the first ten years, and the second ten. In the first ten years, there was no significant association between margarine intake and heart attack incidence. In the second ten, the group eating the most margarine had 77% more heart attacks than the group eating none:

So it appears that margarine takes a while to work its magic.

They didn't publish a breakdown of heart attack incidence with butter consumption over the two periods. Perhaps they didn't like what they saw when they crunched the numbers. I find it really incredible that we're told to avoid dairy fat with data like these floating around. The Framingham study is first-rate epidemiology. It fits in perfectly with most other observational studies showing that full-fat dairy intake is not associated with heart attack and stroke risk. In fact, several studies have indicated that people who eat the most full-fat dairy have the lowest risk of heart attack and stroke.


It's worth mentioning that this study was conducted from the late 1960s until the late 1980s. Artificial trans fat labeling laws were still decades away in the U.S., and margarine contained more trans fat than it does today. Currently, margarine can contain up to 0.5 grams of trans fat per serving and still be labeled "0 g trans fat" in the U.S. The high trans fat content of the older margarines probably had something to do with the result of this study.
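
A quick worked example of why that labeling rule matters: four servings a day of foods containing 0.49 g of trans fat each add up to about 2 g of trans fat per day, every bit of it labeled "0 g".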

That does not make today's margarine healthy, however. Margarine remains an industrially processed pseudo-food. I'm just waiting for the next study showing that some ingredient in the new margarines (plant sterols? dihydro vitamin K1?) is the new trans fat.

Butter, Margarine and Heart Disease
The Coronary Heart Disease Epidemic


* More precisely, "coronary heart disease events", which includes infarction, sudden cardiac death, angina, and coronary insufficiency.

Sunday, October 18, 2009

A Little Hiatus

I'm going to a conference next week, followed by a little vacation. I've written two posts that will publish automatically while I'm gone. I may or may not respond to comments for the next two weeks. I probably won't respond to e-mails. I'll resume the malocclusion series when I get back.

Wednesday, October 14, 2009

Malocclusion: Disease of Civilization, Part IV

There are three periods during the development of the face and jaws that are uniquely sensitive to environmental influences such as nutrition and muscle activity patterns.

1: Prenatal Period

The major structures of the human face and jaws develop during the first trimester of pregnancy. The maxilla (upper jaw) takes form between the 7th and 10th week after conception. The mandible (lower jaw) begins two weeks earlier. The nasal septum, which is the piece of cartilage that forms the structure of the nose and divides the nostrils, appears at week seven and grows most rapidly from weeks 8 to 11. Any disturbance of this developmental window can have major consequences for later occlusion.

2: Early Postnatal Period

The largest postnatal increment in face and jaw growth occurs from birth until age 4. During this period, the deciduous (baby) teeth erupt, and the activity patterns of the jaw and tongue influence the size and shape of the maxilla and the mandible as they grow. The relationship of the jaws to one another is mostly determined during this period, although it can still change later in development.

During this period, the dental arch widens from its center, called the midpalatal suture. This ensures that the jaws are the correct size and shape to eventually accept the permanent teeth without crowding them.

3: Adolescence

The third major developmental period occurs between ages 11 and 16, depending on the gender and individual, and happens roughly at the same time as the growth spurt in height. The dental arch continues to widen, reaching its final size and shape. Under ideal circumstances, at the end of this period the arch should be large enough to accommodate all teeth, including the third molars (wisdom teeth), without crowding. Narrow dental arches cause malocclusion and third molar crowding.

Growth of the Dental Arch Over Time

The following graph shows the widening of the dental arch over time*. The dotted line represents arch growth while the solid line represents growth in body height. You can see that arch development slows down after 6 years old, resumes around 11, and finally ends at about 18 years. This graph represents the average of many children, so not all children will see these changes at the age indicated. The numbers are in millimeters per year, but keep in mind that the difference between a narrow arch and a broad one is only a few millimeters.

In the next few posts, I'll describe the factors that I believe influence jaw and face structure during the three critical periods of development.


* These data represent many years of measurements collected by Dr. Arne Bjork, who used metallic implants in the maxilla to make precise measurements of arch growth over time in Danish youths. The graph is reproduced from the book A Synopsis of Craniofacial Growth, by Dr. Don M. Ranly. Data come from Dr. Bjork's findings published in the book Postnatal Growth and Development of the Maxillary Complex. You can see some of Dr. Bjork's data in the paper "Sutural Growth of the Upper Face Studied by the Implant Method" (free full text).

Saturday, October 10, 2009

Malocclusion: Disease of Civilization, Part III

Normal Human Occlusion

In 1967, a team of geneticists and anthropologists published an extensive study of a population of Brazilian hunter-gatherers called the Xavante (1). They made a large number of physical measurements, including of the skull and jaws. Of 146 Xavante examined, 95% had "ideal" occlusion, while the 5% with malocclusion had nothing more than mild crowding of the incisors (front teeth). The authors wrote:
Characteristically, the Xavante adults exhibited broad dental arches, almost perfectly aligned teeth, end-to-end bite, and extensive dental attrition [tooth wear].
In the same paper, the authors present occlusion statistics for three other cultures. According to the papers they cite, in Japan, the prevalence of malocclusion was 59%, and in the US (Utah), it was 64%. They also mention another native group living near the Xavante, part of the Bakairi tribe, living at a government post and presumably eating processed food. The prevalence of malocclusion was 45% in this group.

In 1998, Dr. Brian Palmer (DDS) published a paper describing some of the collections of historical skulls he had examined over the years (2):
...I reviewed an additional twenty prehistoric skulls, some dated at 70,000 years old and stored in the Anthropology Department at the University of Kansas. Those skulls also exhibited positive [good] occlusions, minimal decay, broad hard palates, and "U-shaped" arches.

The final evaluations were of 370 skulls preserved at the Smithsonian Institution in Washington, D.C. The skulls were those of prehistoric North American plains Indians and more contemporary American skulls dating from the 1920s to 1940s. The prehistoric skulls exhibited the same features as mentioned above, whereas a significant destruction and collapse of the oral cavity were evident in the collection of the more recent skulls. Many of these more recent skulls revealed severe periodontal disease, malocclusions, missing teeth, and some dentures. This was not the case in the skulls from the prehistoric periods...
The arch is the part of the upper jaw inside the "U" formed by the teeth. Narrow dental arches are a characteristic feature of malocclusion-prone societies. The importance of arch development is something that I'll be coming back to repeatedly. Dr. Palmer's paper includes the following example of prehistoric (L) and modern (R) arches:


Dr. Palmer used an extreme example of a modern arch to illustrate his point; however, arches of this width are not uncommon today. Milder forms of this narrowing affect the majority of the population in industrial nations.

In 1962, Dr. D.H. Goose published a study of 403 British skulls from four historical periods: Romano-British, Saxon, medieval and modern (3). He found that the arches of modern skulls were less broad than at any previous time in history. This followed an earlier study showing that modern British skulls had more frequent malocclusion than historical skulls (4). Goose stated that:
Although irregularities of the teeth can occur in earlier populations, for example in the Saxon skulls studied by Smyth (1934), the narrowing of the palate seems to have occurred in too short a period to be an evolutionary change. Hooton (1946) thinks it is a speeding up of an already long standing change under conditions of city life.
Dr. Robert Corruccini published several papers documenting narrowed arches in one generation of dietary change, or in genetically similar populations living rural or urban lifestyles (reviewed in reference #5). One was a study of Caucasians in Kentucky, in which a change from a traditional subsistence diet to modern industrial food habits accompanied a marked narrowing of arches and an increase in malocclusion in one generation. Another study examined older and younger generations of Pima Native Americans, which again showed a reduction in arch width in one generation. A third compared rural and urban Indians living in the vicinity of Chandigarh, showing marked differences in arch breadth and the prevalence of malocclusion between the two genetically similar populations. Corruccini states:
In Chandigarh, processed food predominates, while in the country coarse millet and locally grown vegetables are staples. Raw sugar cane is widely chewed for enjoyment rurally [interestingly, the rural group had the lowest incidence of tooth decay], and in the country dental care is lacking, being replaced by chewing on acacia boughs which clean the teeth and are considered medicinal.
Dr. Weston Price came to the same conclusion examining prehistoric skulls from South America, Australia and New Zealand, as well as their living counterparts throughout the world that had adhered to traditional cultures and foodways. From Nutrition and Physical Degeneration:
In a study of several hundred skulls taken from the burial mounds of southern Florida, the incidence of tooth decay was so low as to constitute an immunity of apparently one hundred per cent, since in several hundred skulls not a single tooth was found to have been attacked by tooth decay. Dental arch deformity and the typical change in facial form due to an inadequate nutrition were also completely absent, all dental arches having a form and interdental relationship [occlusion] such as to bring them into the classification of normal.
Price found that the modern descendants of this culture, eating processed food, suffered from malocclusion and narrow arches, while another group from the same culture living traditionally did not. Here's one of Dr. Price's images from Nutrition and Physical Degeneration (p. 212). This skull is from a prehistoric New Zealand Maori hunter-gatherer:
[Image: skull of a prehistoric New Zealand Maori hunter-gatherer, from Nutrition and Physical Degeneration, p. 212]
Note the well-formed third molars (wisdom teeth) in both of the prehistoric skulls I've posted. These people had ample room for them in their broad arches. Third molar crowding is a mild form of modern face/jaw deformity and affects the majority of modern populations; it's the reason people have their wisdom teeth removed. Urban Nigerians in Lagos have nearly 10 times more third molar crowding than rural Nigerians in the same state (10.7% of molars vs. 1.1%, a 9.7-fold difference; reference #6).

Straight teeth and good occlusion are the human evolutionary norm. They're also accompanied by a wide dental arch and ample room for third molars in many traditionally-living cultures. Narrow arches, malocclusion, third molar crowding, small or absent sinuses, and a characteristic underdevelopment of the middle third of the face together form a developmental syndrome that predominantly afflicts industrially-living cultures.


(1) Am. J. Hum. Genet. 19(4):543. 1967. (free full text)
(2) J. Hum. Lact. 14(2):93. 1998
(3) Arch. Oral Biol. 7:343. 1962
(4) Brash, J.C.: The Aetiology of Irregularity and Malocclusion of the Teeth. Dental Board of the United Kingdom, London, 1929.
(5) Am. J. Orthod. 86(5):419. 1984
(6) Odonto-Stomatologie Tropicale. 90:25. (free full text)

Saturday, October 3, 2009

Malocclusion: Disease of Civilization, Part II

The Nature of the Problem

In 1973, the US Centers for Disease Control and Prevention (CDC) published the results of a National Health Survey in which it examined the dental health of American youths nationwide. The following description was published in a special issue of the journal Pediatric Dentistry (1):
The 1973 National Health Survey reported 75% of children, ages 6 to 11 years, and 89% of youths, ages 12 to 17 years, have some degree of occlusal disharmony [malocclusion]; 8.7% of children and 13% of youth had what was considered a severe handicapping malocclusion for which treatment was highly desirable and 5.5% of children and 16% of youth had a severe handicapping malocclusion that required mandatory treatment.
89% of youths had some degree of malocclusion, and 29% (13% plus 16%) had a severe handicapping malocclusion for which treatment was either highly desirable or mandatory. Fortunately, many of them received orthodontic treatment, so the malocclusion didn't persist into adulthood.

This is consistent with another survey conducted in 1977, in which 38% of American youths showed definite or severe malocclusion, while only 46% had occlusion the authors deemed "ideal or acceptable" (2).

The trend continues. The CDC National Health and Nutrition Examination Survey III (NHANES III) found in 1988-1991 that approximately three fourths of Americans age 12 to 50 years had some degree of malocclusion (3).

The same holds true for Caucasian-Americans, African-Americans and Native Americans in the US, as well as for populations in other industrial nations around the world. Typically, only one third to one half of the population shows good (but not necessarily perfect) occlusion (4-8).

In the next post, I'll review some of the data from non-industrial and transitioning populations.




1. Pediatr. Dent. 17(6):1-6. 1995-1996
2. USPHS Vital and Health Statistics Ser. 11, no 162. 1977
3. J. Dent. Res. Special Issue 75:706. 1996.
4. The Evaluation of Canadian Dental Health. 1959. Describes Canadian occlusion.
5. The Effects of Inbreeding on Japanese Children. 1965. Contains data on Japanese occlusion.
6. J. Dent. Res. 35:115. 1956. Contains data on both industrial and non-industrial cultures (Pukapuka, Fiji, New Guinea, U.S.A. and New Zealand).
7. J. Dent. Res. 44:947. 1965 (free full text). Contains data on Caucasian-Americans and African-Americans living in several U.S. regions, as well as data from two regions of Germany. Only includes data on Angle classifications, not other types of malocclusion such as crossbite and open bite (i.e., the data underestimate the total prevalence of malocclusion).
8. J. Dent. Res. 47:302. 1968 (free full text). Contains data on Chippewa Native Americans in the U.S., whose occlusion was particularly bad, especially when compared to previous generations.

Tuesday, September 29, 2009

Malocclusion: Disease of Civilization

In his epic work Nutrition and Physical Degeneration, Dr. Weston Price documented the abnormal dental development and susceptibility to tooth decay that accompanied the adoption of modern foods in a number of different cultures throughout the world. Although he quantified changes in cavity prevalence (sometimes finding increases as large as 1,000-fold), when it comes to dental development all we have are Price's anecdotes describing the crooked teeth, narrow arches and "dished" faces these cultures developed as they modernized.

Price published the first edition of his book in 1939. Fortunately, Nutrition and Physical Degeneration wasn't the last word on the matter. Anthropologists and archaeologists have been extending Price's findings throughout the 20th century. My favorite is Dr. Robert S. Corruccini, currently a professor of anthropology at Southern Illinois University. He published a landmark paper in 1984 titled "An Epidemiologic Transition in Dental Occlusion in World Populations" that will be our starting point for a discussion of how diet and lifestyle factors affect the development of the teeth, skull and jaw (Am. J. Orthod. 86(5):419)*.

First, some background. The word occlusion refers to the manner in which the top and bottom sets of teeth come together, determined in part by the alignment between the upper jaw (maxilla) and the lower jaw (mandible). There are three general categories:
  • Class I occlusion: considered "ideal". The bottom incisors (front teeth) fit just behind the top incisors.
  • Class II occlusion: "overbite." The bottom incisors are too far behind the top incisors. The mandible may appear small.
  • Class III occlusion: "underbite." The bottom incisors are beyond the top incisors. The mandible protrudes.
Malocclusion means the teeth do not come together in a way that's considered ideal. The term "class I malocclusion" is sometimes used to describe crowded incisors even when the jaws align properly.
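For readers who like things explicit, here's a minimal sketch in Python of the three categories as I've described them, using "overjet" (the horizontal distance from the bottom incisors to the top incisors) as a stand-in measurement. The 0-4 mm "ideal" window is a hypothetical round-number assumption for illustration only; real orthodontic diagnosis relies on molar relationships and clinical judgment, not a single number.

    # Illustrative sketch of the three occlusion classes described above.
    # overjet_mm: horizontal distance (mm) from bottom to top incisors;
    # positive means the bottom incisors sit behind the top incisors.
    # The 0-4 mm window is a hypothetical assumption, not a clinical standard.

    def classify_occlusion(overjet_mm: float) -> str:
        if overjet_mm > 4.0:
            return "Class II (overbite): bottom incisors too far behind top"
        if overjet_mm < 0.0:
            return "Class III (underbite): bottom incisors beyond top"
        return "Class I: bottom incisors fit just behind top incisors"

    for overjet in (2.0, 7.5, -1.5):
        print(f"{overjet:+.1f} mm -> {classify_occlusion(overjet)}")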

Over the course of the next several posts, I'll give an overview of the extensive literature showing that hunter-gatherers past and present have excellent occlusion, subsistence agriculturalists generally have good occlusion, and the adoption of modern foodways directly causes the crooked teeth, narrow arches and/or crowded third molars (wisdom teeth) that affect the majority of people in industrialized nations. I believe this process also affects the development of the rest of the skull, including the face and sinuses.


In his 1984 paper, Dr. Corruccini reviewed data from a number of cultures whose occlusion has been studied in detail, most of which he observed personally. He compared two sets of cultures: those adhering to a traditional style of life and those that had adopted industrial foodways. For several of the cultures he studied, he compared each to another that was genetically similar: for example, the older generation of Pima Indians vs. the younger generation, and rural vs. urban Punjabis. He also included data from archaeological sites and from nonhuman primates. Wild animals, including nonhuman primates, almost invariably show perfect occlusion.

The last graph in the paper is the most telling. He compiled all the occlusion data into a single number called the "treatment priority index" (TPI), which represents the overall need for orthodontic treatment. A TPI of 4 or greater indicates malocclusion (the cutoff point is subjective and depends somewhat on aesthetic considerations). Here's the graph:

[Figure: average TPI by population, from Corruccini's 1984 paper]

Every single urban/industrial culture has an average TPI greater than 4, while all the non-industrial or less industrial cultures have an average TPI below 4. This means that in industrial cultures, the average person requires orthodontic treatment to achieve good occlusion, whereas most people in more traditionally-living cultures naturally have good occlusion.

The best occlusion was in the New Britain sample, a precontact Melanesian hunter-gatherer group studied from archaeological remains. The next best occlusion was in the Libben and Dickson groups, early Native American agriculturalists. The Pima samples represent an older generation of Native Americans raised on a somewhat traditional agricultural diet vs. a younger generation raised on processed reservation foods. The Chinese samples are immigrants and their descendants in Liverpool. The Punjabi samples represent urban vs. rural youths in Northern India. The Kentucky samples represent a traditionally-living Appalachian community, older generation vs. their processed food-eating offspring. The "early black" and "black youths" samples represent older and younger generations of African-Americans in the Cleveland and St. Louis areas. The "white parents/youths" sample represents different generations of American Caucasians.
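To make the TPI cutoff concrete, here's a short Python sketch of how the threshold sorts populations. Only the cutoff of 4 comes from the paper; the average TPI values below are made-up placeholders, not Corruccini's actual numbers.

    # Toy illustration of the TPI cutoff described above. A population whose
    # average treatment priority index (TPI) is 4 or greater is flagged as
    # malocclusion-prone. These TPI values are hypothetical placeholders,
    # NOT the values from Corruccini's 1984 graph.

    TPI_CUTOFF = 4.0

    hypothetical_averages = {
        "precontact hunter-gatherers": 1.5,   # placeholder value
        "traditional agriculturalists": 3.0,  # placeholder value
        "urban/industrial youths": 6.5,       # placeholder value
    }

    for group, tpi in hypothetical_averages.items():
        label = ("malocclusion (treatment indicated)" if tpi >= TPI_CUTOFF
                 else "good occlusion")
        print(f"{group}: average TPI {tpi:.1f} -> {label}")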


The point is clear: there's something about industrialization that causes malocclusion. It's not genetic; it's a result of changes in diet and/or lifestyle. A "disease of civilization". I use that phrase loosely, because malocclusion isn't really a disease, and some cultures that qualify as civilizations retain traditional foodways and relatively good teeth. Nevertheless, it's a time-honored phrase that encompasses the wide array of health problems that occur when humans stray too far from their ecological niche.
I'm going to let Dr. Corruccini wrap this post up for me:
I assert that these results serve to modify two widespread generalizations: that imperfect occlusion is not necessarily abnormal, and that prevalence of malocclusion is genetically controlled so that preventive therapy in the strict sense is not possible. Cross-cultural data dispel the notion that considerable occlusal variation [malocclusion] is inevitable or normal. Rather, it is an aberrancy of modern urbanized populations. Furthermore, the transition from predominantly good to predominantly bad occlusion repeatedly occurs within one or two generations' time in these (and other) populations, weakening arguments that explain high malocclusion prevalence genetically.

* This paper is worth reading if you get the chance. It should have been a seminal paper in the field of preventive orthodontics, which could have largely replaced conventional orthodontics by now. Dr. Corruccini is the clearest thinker on this subject I've encountered so far.