Famine Myths: Five Misunderstandings Related to the 2011 Hunger Crisis in the Horn of Africa
by William G. Moseley1
The 2011 famine in the Horn of Africa was one of the worst in recent decades in terms of loss of life and human suffering. While the UN has yet to release an official death toll, the British government estimates that between 50,000 and 100,000 people died, most of them children, between April and September of 2011. Although Kenya, Ethiopia, and Djibouti were all badly affected, the famine hit hardest in certain (mainly southern) areas of Somalia. This was the worst humanitarian disaster to strike the country since 1991-1992, with roughly a third of the Somali population displaced for some period of time.
Despite the scholarly and policy community’s tremendous advances in understanding famine over the past 40 years (Sen 1981; Watts and Bohle 1993; de Waal 2005), and increasingly sophisticated famine early warning systems (Moseley and Logan 2005), much of this knowledge and information was seemingly ignored or forgotten in 2011. While the famine had been forecast nearly nine months in advance, the global community failed to prepare for, and react in a timely manner to, this event. The famine was officially declared in early July of 2011 by the United Nations and stated to be over on February 3, 2012. Despite the official end of the famine, 31% of the population (or 2.3 million people) in southern Somalia remains in crisis. Across the region, 9.5 million people continue to need assistance. Millions of Somalis remain in refugee camps in Ethiopia and Kenya.
The famine reached its height in the period from July to September 2011, with approximately 13 million people at risk of starvation. While this was a regional problem, it was most acute in southern Somalia because aid to this region was much delayed. Figure 1 provides a picture of food insecurity in the region in the November-December 2011 period (a few months after the peak of the crisis).
Figure 1: Food Insecurity in the Horn of Africa Region, November-December 2011. Based on data and assessment by FEWS-Net (a USAID-sponsored program). Cartography by Ashley Nepp, Macalester College.
The 2011 famine received relatively little attention in the U.S. media, and much of the coverage that did occur was biased or ahistorical, or perpetuated long-standing misunderstandings about the nature and causes of famine. This article addresses these “famine myths” and is organized around five key misunderstandings related to the famine in the Horn of Africa. These myths are: 1) drought was the cause of the famine; 2) overpopulation was the cause of the famine; 3) increasing food production through advanced techniques will resolve food insecurity over the long run; 4) U.S. foreign policy in the Horn of Africa was unrelated to the crisis; and 5) an austere response may be best in the long run. It is important to interrogate these misunderstandings, especially as another famine in the West African Sahel is unfolding as this article goes to press. The links between food security and food sovereignty are also significant. While famine is sometimes viewed as a phenomenon driven by endogenous factors (e.g., drought and population growth), this article draws strong connections between such crises and broader political economic factors – an approach which jibes well with a number of food sovereignty concerns.
Myth #1: Drought was the cause of the famine
While drought certainly contributed to the crisis in the Horn of Africa, there were more fundamental causes at play. Drought is not a new environmental condition for much of Africa, but a recurring one. The Horn of Africa has long experienced erratic rainfall. While climate change may be exacerbating rainfall variability, traditional livelihoods in the region are adapted to deal with situations where rainfall is not dependable (Moseley 2011a).
The dominant livelihood in the Horn of Africa has long been herding, which is well adapted to the semi-arid conditions of the region (see Figure 2). Herders traditionally ranged widely across the landscape in search of better pasture, focusing on different areas depending on meteorological conditions. The approach worked because, unlike fenced-in pastures in America, it was highly flexible and well adapted to variable rainfall conditions. As farming expanded, including large-scale commercial farms in some instances, the routes of herders became more concentrated, more vulnerable to drought, and more detrimental to the landscape.
Figure 2: Two young Somali women herd their goats to the Juba River in Dollow, Somalia (February 2, 2012). Credit: ©FAO/SIMON MAINA
Agricultural livelihoods also evolved in problematic ways. In anticipation of poor rainfall years, farming households and communities historically stored surplus crop production in granaries. Sadly, this traditional strategy for mitigating the risk of drought was undermined from the colonial period onward, as households were encouraged (if not coerced by taxation) to grow cash crops for the market and to store less excess grain for bad years. This increasing market orientation was also encouraged by international financial institutions such as the World Bank, the International Monetary Fund, and the African Development Bank.
The moral of the story is that famine is not a natural consequence of drought (just as death from exposure is not the inherent result of a cold winter); rather, it is the structure of human society that often determines who is affected and to what degree.
Myth #2: Overpopulation was the cause of the famine
With nearly 13 million people at risk of starvation last fall in a region whose population doubled in the previous 24 years, one might assume that these two factors were causally related in the Horn of Africa. Ever since the British political economist Thomas Malthus wrote “An Essay on the Principle of Population” in 1798, we have been concerned that human population growth will outstrip available food supply. While the crisis in Somalia, Ethiopia and Kenya appeared to be perfect proof of the Malthusian scenario, we must be careful not to make overly simplistic assumptions (Moseley 2011b).
For starters, the semi-arid zones in the Horn of Africa are relatively lightly populated compared to other regions of the world. For example, the population density of Somalia is about 13 persons per sq. kilometer, whereas that of the U.S. state of Oklahoma is 21.1. The western half of Oklahoma is also semi-arid, suffered from a serious drought in 2011, and was the poster child for the 1930s Dust Bowl. Furthermore, if we take into account differing levels of consumption, with the average American consuming at least 28 times as much as the average Somali in a normal year, then Oklahoma’s population density of 21.1 persons per sq. kilometer is equivalent to a density of roughly 591 Somalis per sq. kilometer.
Despite the fact that Oklahoma’s impact on the landscape, per unit of area, is over 45 times that of Somalia (when accounting for both population density and consumption levels), we don’t talk about overpopulation in Oklahoma. This is because, in spite of the drought and the collapse of agriculture, there was no famine in Oklahoma. In contrast, the presence of famine in the Horn of Africa led many to assume that too many people were a key part of the problem.
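As a rough check of these back-of-the-envelope figures (using the approximate density and consumption numbers cited above): 21.1 persons per sq. kilometer × 28 (the consumption ratio) ≈ 591 Somali-consumption equivalents per sq. kilometer, and 591 ÷ 13 ≈ 45.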
Why is it that many assume that population growth is the driver of famine? For starters, perhaps we assume that reducing the birthrate, and thereby reducing the number of mouths to feed, is one of the easiest ways to prevent hunger. This is actually a difficult calculation for most families in rural Africa. It’s true that many families desire access to modern contraceptives, and filling this unmet need is important. However, for many others, children are crucial sources of farm labor or important wage earners who help sustain the family. Children also act as the old-age social security system for their parents. For these families, having fewer children is not an easy decision. Families in this region will have fewer children when it makes economic sense to do so. As we have seen over time and throughout the world, the average family size shrinks when economies develop and expectations for offspring change.
Second, many tend to focus on the additional resources required to nourish each new person, and often forget the productive capacity of these individuals. Throughout Africa, some of the most productive farmland is in those regions with the highest population densities. In Machakos, Kenya, for example, agricultural production and environmental conservation improved as population densities increased (Mortimore and Tiffen 1995). Furthermore, we have seen agricultural production collapse in some areas where population declined (often due to outmigration) because there was insufficient labor to maintain intensive agricultural production.
Third, we must not forget that much of the region’s agricultural production is not consumed locally. From the colonial era moving forward, farmers and herders have been encouraged to become more commercially oriented, producing crops and livestock for the market rather than home consumption. This might have been a reasonable strategy if the prices for exports from the Horn of Africa were high (which they rarely have been) and the cost of food imports low. Also, large land leases (or “land grabs”) to foreign governments and corporations in Ethiopia (and to a lesser extent in Kenya and Somalia) have further exacerbated this problem. These farms, designed solely for export production, effectively subsidize the food security of other regions of the world (most notably the Middle East and Asia) at the expense of populations in the Horn of Africa.
Land Grabs
Long-term leases of African land for export-oriented food production, or “land grabs,” have been on the rise in the past decade. Rather than simply buying food and commodity crops from African farmers, foreign entities increasingly take control of the ownership and management of farms on African soil. This trend stems from at least two factors. First, increasingly high global food prices are a problem for many Asian and Middle Eastern countries which depend on food imports. As such, foreign governments and sovereign wealth funds may engage in long-term leases of African land in order to supply their own populations with affordable food. Second, high global food prices are also seen as an opportunity for some Western investors who lease African land to produce crops and commodities for profitable global markets.
In the Horn of Africa, Ethiopia (which has historically been one of the world’s largest recipients of humanitarian food aid) has made a series of long-term land leases to foreign entities. The World Bank estimates that at least 35 million hectares of land have been leased to 36 different countries, including China, Pakistan, India and Saudi Arabia. Supporters of these leases argue that they provide employment to local people and disseminate modern agricultural approaches. Critics counter that these leases undermine food sovereignty, or people’s ability to feed themselves via environmentally sustainable technologies which they control.
Myth #3: Increasing food production through advanced techniques will resolve food insecurity over the long run
As Sub-Saharan Africa has grappled with high food prices in some regions and famine in others, many experts argue that increasing food production through a program of hybrid seeds and chemical inputs (a so-called “New Green Revolution”) is the way to go.
While outsiders benefit from this New Green Revolution strategy (by selling inputs or purchasing surplus crops), it is not clear if the same is true for small farmers and poor households in Sub-Saharan Africa. For most food-insecure households on the continent, there are at least two problems with this strategy. First, such an approach to farming is energy intensive because most fertilizers and pesticides are petroleum-based. Inducing poor farmers to adopt energy-intensive farming methods is short-sighted, if not unethical, when experts know that global energy prices are likely to rise. Second, irrespective of energy prices, the New Green Revolution approach requires farmers to purchase seeds and inputs, which means that it will be inaccessible to the poorest of the poor, i.e., those who are the most likely to suffer from periods of hunger.
If not the New Green Revolution approach, then what? Many forms of bio-intensive agriculture are, in fact, highly productive and much more efficient than industrial agriculture. For example, crops grown in intelligent combinations allow one plant to fix nitrogen for another, rather than relying solely on increasingly expensive, fossil fuel-based inorganic fertilizers for these plant nutrients. Mixed cropping strategies are also less vulnerable to insect damage and require little to no pesticide use for a reasonable harvest. These techniques have existed for centuries in the African context and could be greatly enhanced by supporting collaboration among local people, African research institutes, and foreign scientists. Developing food production strategies that local people control has also been referred to as food sovereignty.
Myth #4: U.S. foreign policy in the Horn of Africa is unrelated to the crisis
Many Americans assume that U.S. foreign policy bears no blame for the food crisis in the Horn and, more specifically, Somalia. This is simply untrue. The weakness of the Somali state was, and is, related to U.S. policy: the United States repeatedly interfered in Somali affairs, driven by Cold War politics in the 1970s and 1980s and by the War on Terror in the 2000s.
During the Cold War, Somalia was a pawn in a U.S.-Soviet chess match in the geopolitically significant Horn of Africa region. In 1974, the U.S. ally Emperor Haile Selassie of Ethiopia was deposed in a revolution. He was eventually replaced by Mengistu Haile Mariam, a socialist. In response, the leader of Ethiopia’s bitter rival Somalia, Siad Barre, switched from being pro-Soviet to pro-Western. Somalia was the only country in Africa to switch Cold War allegiances under the same government. The U.S. supported Siad Barre until 1989 (shortly before his demise in 1991). By doing this, the United States played a key role in supporting a long-running dictator and undermining democratic governance.
More recently, the Union of Islamic Courts (UIC) came to power in 2006. The UIC defeated the warlords, restored peace to Mogadishu for the first time in 15 years, and brought most of southern Somalia into its orbit. The United States and its Ethiopian ally claimed that these Islamists were terrorists and a threat to the region. In contrast, the vast majority of Somalis supported the UIC and pleaded with the international community to engage them peacefully. Unfortunately, this peace did not last. The U.S.-supported Ethiopian invasion of Somalia, begun in December 2006, displaced more than a million people and killed close to 15,000 civilians. Those displaced then became a part of last summer and fall’s famine victims (Samatar 2011).
The power vacuum created by the displacement of the more moderate UIC also led to the rise of its more radical military wing, al-Shabab. Al-Shabab emerged to fight the Transitional Federal Government (TFG), which was put in place by the international community and composed of the most moderate elements of the UIC (those more favorable to the United States). The TFG was weak, corrupt, and ineffective, controlling little more than the capital Mogadishu, if that. A low-grade civil war emerged between these two groups in southern Somalia. Indeed, as we repeatedly heard in the media last year, it was al-Shabab that restricted access to southern Somalia for several months leading up to the crisis and greatly exacerbated the situation in this sub-region. Unfortunately, the history of the factors that gave rise to al-Shabab was never adequately explained to the U.S. public. Until July 2011, the U.S. government forbade American charities from operating in areas controlled by al-Shabab—which delayed relief efforts in these areas.
Myth #5: An austere response may be best in the long run
Efforts to raise funds to address the famine in the Horn of Africa were well below those for previous (and recent) humanitarian crises. Why was this? Part of it likely had to do with the economic malaise in the U.S. and Europe. Many Americans suggested that we could not afford to help in this crisis because we had to pay off our own debt. This stinginess may, in part, be related to a general misunderstanding about how much of the U.S. budget goes to foreign assistance. A lot of Americans assume we spend over 25% of our budget on such assistance when it is actually less than one percent.
Furthermore, contemporary public discourse in America has become more inward-looking and isolationist than in the past. As a result, many Americans have difficulty relating to people beyond their borders. Sadly, it is now much easier to separate ourselves from them, to discount our common humanity, and to essentially suppose that it’s okay if they starve. This last point brings us back to Thomas Malthus, who was writing against the poor laws in England in the late 18th century. The poor laws were somewhat analogous to contemporary welfare programs and Malthus argued (rather problematically) that they encouraged the poor to have more children. His essential argument was that starvation is acceptable because it is a natural check to over-population. In other words, support for the poor will only exacerbate the situation. We see this in the way that some conservative commentators reacted to last year’s famine.
The reality was that a delayed response to the famine only made the situation worse. Of course, the worst-case scenario is death, but short of death, many households were forced to sell off all of their assets (cattle, farming implements, etc.) in order to survive. This sets up a very difficult recovery scenario because livelihoods are so severely compromised. We know from best practices among famine researchers and relief agencies that you not only need to detect a potential famine early, but also to intervene before livelihoods are devastated. Acting early means that households recover more quickly and are more resilient in the face of future perturbations.
Conclusion
While the official famine in the Horn of Africa region is over, 9.5 million people continue to need assistance and millions of Somalis remain in refugee camps in Ethiopia and Kenya. Although this region of the world will always be drought prone, it needn’t be famine prone. The solution lies in rebuilding the Somali state and fostering more robust rural livelihoods in Somalia, eastern Ethiopia, and northern Kenya. The former will likely mean giving the Somali people the space they need to rebuild their own democratic institutions (and not making them needless pawns in the War on Terror). The latter will entail a new approach to agriculture that emphasizes food sovereignty, or locally appropriate food production technologies that are accessible to the poorest of the poor, as well as systems of grain storage at the local level that anticipate bad rainfall years. Finally, the international community should discourage wealthy, yet food-insufficient, countries from preying on poorer countries in Sub-Saharan Africa through the practice of land grabs.
_________________________________________
William G. Moseley is professor and chair of geography at Macalester College in Saint Paul, Minn.
References
de Waal, A. 2005. Famine That Kills: Darfur, Sudan. New York: Oxford University Press.
Mortimore, M. and M. Tiffen. 1995. “Population and Environment in Time Perspective: The Machakos Story.” In T. Binns, ed., People and Environment in Africa. New York: Wiley.
Moseley, W. G. 2011a. “Why They’re Starving: The Man-Made Roots of Famine in the Horn of Africa,” The Washington Post. July 29.
Moseley, W. G. 2011b. “Famine in the Horn of Africa: Malthus Beware.” Al Jazeera English. August 23.
Moseley, W. G. and B. I. Logan. 2005. “Food Security.” In B. Wisner, C. Toulmin, and R. Chitiga, eds., Toward a New Map of Africa. London: Earthscan Publications.
Samatar, A. I. 2011. “Genocidal Politics and the Somali Famine.” Al Jazeera English. July 30.
Sen, A. 1981. Poverty and Famines. Oxford: Clarendon.
Watts, M. and H. Bohle. 1993. “The Space of Vulnerability: The Causal Structure of Hunger and Famine.” Progress in Human Geography 17(1): 43-67.
Endnotes
- This article is a lightly edited version of the same article first published in Dollars & Sense, March/April 2012, reprinted here with the kind permission of the publisher.