Exposome Perspectives Blog

The “Fetal Origins” of Adult Disease

Is it possible that your risk for disease was set when you were a baby? Evidence shows that our earliest years of life are perhaps the most important for understanding the origins of many health outcomes.

Exposome Perspectives Blog by Robert O. Wright, MD, MPH

Did you ever wonder how much your childhood influenced your health today? Is it possible that your risk for, say, heart disease was set when you were a baby? The idea that something that happened to you as an infant could give you a disease as an adult once seemed far-fetched—after all, we are talking decades of time. Surely that hamburger I ate yesterday is more likely to give me a heart attack than anything I did as a baby? Well, it turns out what happened to you and what you ate back then might mean more than you think. 

There is a lot of evidence that the reasons we get sick today have a lot to do with what happened to us as an infant, or even as a fetus in our mother’s womb. In the early 1990s, British physician and epidemiologist David Barker, PhD, was the first to make this observation, proposing that the “fetal origins” of adult disease are a major contributor to the risk of many common illnesses. While his journey was difficult, with few scientists accepting his early work, he ultimately revolutionized how we think about the environment and health.

We now have abundant evidence to show that our earliest years of life are perhaps the most important years for understanding the origins of many, many diseases. Dr. Barker didn’t study what we traditionally think of as environmental health, but was instead interested in nutrition (although I would argue that our diet is part of our environment). But his concept has been incorporated into how we think about nearly all early-life exposures and health outcomes.

“We now have abundant evidence to show that our earliest years of life are perhaps the most important years for understanding the origins of many, many diseases.”

Dr. Robert Wright

The “fetal origins” of adult disease are logistically difficult to study. To prove Dr. Barker’s theory, we need to be able to measure something that happened 50-60 years ago. If I have a heart attack today, and the environmental exposures that helped cause that heart attack happened in 1963 when I was a fetus, how can anyone figure that out, short of a time machine? Do any of us remember what we ate in 1971? Or even more importantly, what our mothers ate when they were pregnant with us? Dr. Barker had to come up with a proxy measure for nutrition. Often in research, the easiest way to test a theory is to study extremes, even when you can’t address the details of what caused the extreme. Dr. Barker’s solution to identifying a proxy for measuring early-life nutrition turned out to be birth certificates and famine. 

In many countries, birth weight is recorded at birth, and what parent doesn’t save their child’s birth certificate? By collecting birth certificates, Dr. Barker showed that people born with lower birth weights have a greater risk of developing coronary heart disease and diabetes 50-60 years later. He tested this finding in multiple populations in which mothers were pregnant during times of famine (for example, during World War II) to see if severe calorie restriction played a role. While the children born to these mothers showed catch-up growth after birth, they would eventually grow into adults who were much more likely to have a heart attack when they reached the sixth decade of life. It didn’t matter where he did the study—Europe, Africa, South America, Australia—the results were consistent. This was the first evidence showing that a mother’s nutrition during pregnancy affects the health of her children—but only once those children become adults. The many years in between didn’t seem to matter.

People born with lower birth weights have a greater risk of developing coronary heart disease and diabetes 50-60 years later.

This groundbreaking finding was originally known as the “Barker hypothesis,” but had other names too, such as “gestational conflict.” This latter name arose because Dr. Barker hypothesized that fetuses learn to adapt to the environment they expect to enter once outside of the womb based on what they experience inside the womb. If maternal nutrition is poor during pregnancy, the fetus prepares itself for a scarcity of resources outside the womb. Thus, the fetal brain and metabolism are programmed by maternal nutritional status to eat more food, eat that food more impulsively, and not resist food temptations. Essentially, the quantity and quality of nutrients that cross the placenta give the fetus clues as to what the outside world is like. Its metabolism begins to adapt to prepare for the outside world. If the outside world turns out to be truly “hostile” with scarce resources, the child is better prepared to survive these conditions.

Obviously, if food resources are scarce, then eating right away without hesitation gives you a leg up on your food competitors and increases the probability you will survive over time. The problems occur when the outside world doesn’t match the fetal environment. If the outside world offers plenty when the child has been physiologically programmed to live in a world in which resources are scarce, he/she will eat and metabolize food in a way that is badly suited to this plentiful reality. This is what happened in Europe after World War II—fetuses were programmed to expect famine, but as children they instead grew up with plentiful food. Their fetal environment programmed them to eat that food impulsively, which was a mismatch with the conditions of the world outside the womb. Obesity, high cholesterol, and heart disease ultimately ensued in adult life as calories from this impulsive eating took their toll. Those who were not programmed to overeat or under-metabolize because their fetal environments were normal were at lower risk of these outcomes.

Today this concept has led to a whole new field of science, the “Developmental Origins of Health and Disease,” or DOHaD. There are even research societies based on this concept. The field of epigenetics is based on similar work and helps to explain how this programming occurs biologically. 

I met Dr. Barker in 2007 when he spoke at Harvard. I had the honor of hosting him for dinner and was surprised at how gracious, funny, and humble he was. I asked him what he thought of epigenetics. He replied that he believed simpler scientific ideas had the most value because they were easy to understand. He told me that if he were doing a study now, he would just weigh the placenta—no fancy genomic markers. He thought the placenta’s weight meant more than any biomarker we could measure. While I like his idea of placental weight as a predictor, I don’t think it has to exclude measuring the molecules and chemicals that drive the programming that leads to DOHaD health effects. Nonetheless, I have taken to heart that simple messages are more effective, and I try my best to remember that advice when writing about science.

Dr. Barker was extremely influential, to a level few of us will ever reach. He was elected a Fellow of the Royal Society of London and was president of the Association of Physicians of Great Britain and Ireland. He lectured and wrote extensively on maternal-fetal nutrition, and published the book Nutrition in the Womb. He passed away in 2013. 

Our Institute for Exposomic Research and the fields of exposomics and epigenetics owe a great debt to his work. Indeed, while it may not seem obvious yet, even the field of genetics will benefit, as environmental conditions control whether genes are turned on or off through epigenetic modifications. It turns out that early life is when the ease of flipping the switch is set. Dr. David Barker figured that out.