Wednesday, May 25, 2022

In 1991, I published a paper in JAMA on the epidemiology of warfare. What we showed was that, going back centuries, the bulk of deaths in war were not battlefield deaths but derived from infectious disease and accidents. The biblical tales of the battles between the Israelites and their enemies, and how God dealt with either side, illustrate this point quite well, as does the way Cortés and his 500 conquistadores in 1519 brought the Aztec population (numbering 2 million) to its knees with the hitherto unknown smallpox.

The turning point came with World War I (depending on how one counts the influenza pandemic of 1918), after which combat became more lethal and infectious diseases were better controlled. But what of the long-term consequences of war, such as cancer, post-traumatic stress disorder, etc.? It does not stretch the imagination to assume that the survivors of World War II (including the civilian populations) suffered long-term medical repercussions. Cancer may or may not be an obvious further insult added to what they already endured, since cancer-causing agents are not as commonplace as one might expect. Not only is it of historic interest to appreciate their impact, but perhaps we can learn lessons important to future generations. We shall explore some of these historical vignettes in our next few articles.

Let us consider the following Nazi-inspired tragedy during World War II, which may also shed light on what happened to survivors of the Shoah. Very early in the war, the Germans occupied the Netherlands. While a reasonable amount of food was initially available, rations became more limited as the war progressed, until the daily ration for each adult declined to approximately 1,500 calories. The Dutch government in exile was working in concert with the Allies and, after D-Day in June of 1944, assisted them in liberating small parts of Holland. In addition, a railway strike was called in the Netherlands in September of 1944 to obstruct the movement of the German army to the Western Front.

The Nazis were incensed by this activity, and in November of 1944 they imposed a blockade on the western part of the Netherlands, approximately one-third of the country and home to four to five million people; this area included Amsterdam, Rotterdam and The Hague. The entry of foodstuffs was restricted, and the daily ration from then until the liberation in May of 1945 averaged 600 calories per capita. People were reduced to eating grass and tulip bulbs to survive. An estimated 22,000 people died from the direct effects of the famine, known as the Dutch Famine, or Hongerwinter; its effects were compounded by an unusually harsh winter.

This tragedy offered an unusual opportunity to study the effects of severe starvation. It occurred in a modern, literate society in which excellent records existed for all citizens regarding residence during the Hongerwinter, caloric exposure and subsequent medical outcomes. Furthermore, the other two-thirds of Holland, where lifestyle and other factors were comparable, provided an excellent control group, what epidemiologists refer to as a quasi-experimental design. Many studies have been published over the decades since then comparing these two populations to explore the impact of starvation on various outcomes: diabetes, pregnancy and miscarriage, birthweight, etc. But we, of course, focus on cancer.

Between 1983 and 1986, follow-up was conducted on 15,000 women who were between the ages of 2 and 33 during the Famine. By 2000, 585 breast cancers had occurred among these women. Those with severe exposure had a statistically significant relative risk of breast cancer of 1.48, i.e., almost a 50% higher risk, compared with those with no exposure to the Famine. The risk was worse for those exposed between the ages of 2 and 9, for whom it was doubled. Interestingly, a subsequent study showed that exposure to the Famine did not increase the risk of other cancers, only breast cancer.
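For readers unfamiliar with the epidemiologic shorthand, a relative risk is just the ratio of the incidence proportion in the exposed group to that in the unexposed group. A minimal sketch, using hypothetical counts chosen only to illustrate the arithmetic (not the actual data from the Dutch Famine studies):

```python
# Minimal sketch of a relative-risk calculation.
# The cohort counts below are hypothetical, chosen to illustrate the
# arithmetic behind a figure like 1.48; they are NOT the study's data.

def relative_risk(cases_exposed, total_exposed, cases_unexposed, total_unexposed):
    """Ratio of incidence proportions: risk in exposed / risk in unexposed."""
    risk_exposed = cases_exposed / total_exposed
    risk_unexposed = cases_unexposed / total_unexposed
    return risk_exposed / risk_unexposed

# Hypothetical example: 74 cases among 1,000 severely exposed women
# versus 50 cases among 1,000 unexposed women.
rr = relative_risk(74, 1000, 50, 1000)
print(round(rr, 2))  # 1.48, i.e., a 48% higher risk in the exposed group
```

A relative risk of 2.0 would correspond to a doubled risk, which is how the figure for women exposed between ages 2 and 9 should be read.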

In another study, 475 women who had been exposed to the Famine in utero were followed through 2005 (to age 61 years). Compared with women who were not exposed, those exposed to severe famine had an overall relative risk of breast cancer of 2.6 (the records were outstanding, describing every pregnant woman's exact caloric intake as well as the trimester(s) of her pregnancy during which intake was reduced). For those exposed to severe malnutrition during the first trimester, the relative risk of breast cancer was 4.8. Later studies suggested that this reflected epigenetic changes in the fetus.

What is interesting is that increased calories in nonpregnant women can increase the risk of breast cancer as well. So calories can represent a two-way street—both over- and under-exposure may be deleterious with regard to breast cancer risk.

Next week, more on World War II and cancer, specifically the Shoah, in Thoughts on Cancer.

Alfred I. Neugut, MD, PhD, is a medical oncologist and cancer epidemiologist at Columbia University Irving Medical Center/New York Presbyterian and Mailman School of Public Health in New York.


This article is for educational purposes only and is not intended to be a substitute for professional medical advice, diagnosis, or treatment, and does not constitute medical or other professional advice. Always seek the advice of your qualified health provider with any questions you may have regarding a medical condition or treatment.
