Wednesday, January 17, 2018

Halfdan Mahler, primary health care, and the warm decade for social justice

This is an expanded version of an article in the 50th issue of BODHI Times, the occasional newsletter of the NGOs BODHI and BODHI Australia, which I co-founded in 1989. It is based on several sources, including an excellent open access article by Theodore Brown, Elizabeth Fee and Victoria Stepanova called "Halfdan Mahler: Architect and Defender of the World Health Organization “Health for All by 2000” Declaration of 1978".

A pastor's son, "disease palaces" and the origins of primary health care

Dr Halfdan Mahler, who died in December 2016, was the third Director-General of the World Health Organization (WHO). Brown et al summarise his development: "The youngest child of a Baptist preacher, Mahler chose medicine over preaching." After training in tuberculosis care and prevention "he led the Red Cross campaign against the disease in Ecuador, before joining WHO in 1951. Experience in India convinced him that public health resources in developing countries were biased toward hospital-based medical care and that these priorities had to change."

Socrates Litsios, in a long personal reflection on primary health care and Health for All, provides two relevant anecdotes about Mahler: one about the unfair distribution of health resources, the second about the difficulties of reform (see green).

Mahler related the example of a developing country in which health was declared as “a universal human right” but in which one found, in one province, “80 per cent of the health budget being used to support one teaching hospital, whereas in outlying parts complete coverage is supposed to be achieved by one general purpose dispensary for half a million people.”
"During his work in India, Mahler had become convinced that tuberculosis could be successfully treated with ambulatory (non-hospital based) care: “[we went] with tears in our eyes, to the Minister and we said ‘Madame Minister, now that we have shown this you will have to close down all your tuberculosis hospitals because we need the money in order to do ambulatory kind of treatment’ and she looked at me and said ‘you must be a crazy man, even an elephant would cry over your naivete. How do you think I as a politician can close down the hospitals, you must be mad’.”

Mahler before a Mercator projection; an unfortunate choice, given his dedication to social justice!
Primary Health Care, Disease Palaces and David Morley

A term reflecting the problem Mahler had perceived is "disease palaces". Popularised (and probably coined) by another Christian, David Morley, this refers to the disproportionate concentration of health resources in one, or perhaps a handful, of fairly elaborate hospitals, in countries which are overwhelmingly poor. The problem is not that a poor nation should not have a halfway decent (or even a good) hospital; it is that the amount of human and financial capital that such a hospital demands leaves little (from the health budget) for basic sanitation, nutrition, health education and other aspects of primary health care. More background about disease palaces is in Morley's book "Paediatric Priorities in Developing Countries". Published in 1973, the same year that Mahler was elected as WHO Director General, it too is a product of what Mahler called “a warm decade for social justice".

I never had the fortune to meet Morley, but I acquired and read his book in 1985, the year I focused on the study of health in what was then still largely called developing countries, including by participating for several months in health care in hospitals and remote rural settings (sometimes mobile) as a medical student in northern Nigeria and eastern Nepal. Another influential book was Maurice King's "Medical Care in Developing Countries". I have met Maurice several times, and in 1992 he became BODHI's public health advisor.

Primary health care (PHC), at its simplest, is "the place of first contact with a national health care system". In some countries, especially those that are poor, this means, for the overwhelming majority, health care that is not "tertiary", that is, not hospital-based. Using this definition, PHC need not be sophisticated, and can be delivered not only by people without university training, but by people who have not finished high school. The most famous example of this was China's "barefoot doctors", whose actual benefit is hard to assess, due to the fog of propaganda.

Primary health care horror stories

Care delivered by barefoot doctors and their equivalent, at no or minimal cost to the patient, is obviously cheaper to provide than care from medical graduates, pharmacists, or properly trained nurses. In many poor settings, it may be better than fee-for-service alternatives, even those provided by supposed professionals. When I was more involved with the practical aspects of this topic (between 1985 and 1996) I heard horror stories of the financial abuse of patients (mainly in India and Nepal) by health professionals.


An example I recall is of people with suspected tuberculosis (TB) travelling to a town for a chest X-ray when a sputum examination (at least in skilled hands) is more accurate and useful, and far cheaper. In low income settings (at least then) a chest X-ray, a test whose cost is modest to most readers, could lead to debt (not to mention the travel and foregone income from not working that day).

Also, as is still fairly common, powerful antibiotics were easily available over the counter (i.e. without a doctor's prescription). The risk of such self-medication - especially for TB, which requires months of treatment, even when the regimen includes rifampicin - is the development of drug-resistant TB, an even more expensive and dangerous public health problem. This problem may be smaller now, thanks to DOTS (Directly Observed Treatment, Short-course), which was introduced to reduce it.

In 1994 I spent one harrowing day seeing about 40 Tibetan refugees, as patients, in a camp in South India. Several were on "third line" anti-TB drugs. These third line drugs are expensive, have a high risk of severe side effects, and are not very effective. Patients are generally put on these drugs because the TB bacteria they harbour have evolved resistance to more effective drugs, usually because the patient did not take those drugs for long enough (many months). The risk is that such resistant bacteria will cause a primary infection in another person, such as a child, or even a friend or relative who has previously been cured. If so, that person, whether child or adult, could be virtually untreatable.

The possibility that the successes of PHC are over-stated

Ethical and appropriate care by barely trained community and village health workers cannot be guaranteed, and may well be uncommon, despite the claims of its supporters, who may "cherry pick", overlooking failures and exaggerating successes. Such care may be adequate for some health problems, many of which get better without treatment or proper diagnosis. Barefoot or village health care workers can also, in some settings, make simple diagnoses, give advice on hygiene, monitor infant and child growth, and treat and prevent conditions such as diarrhoea. They might also apply first aid, administer vaccines, and treat a variety of other conditions.

On the other hand, especially in very low income settings, many people do have significant underlying health conditions, not just TB, but depression, hypertension (very common in some low income settings, perhaps related to undernutrition in childhood) and chronic parasitic infections such as hookworm or schistosomiasis. (Note: hookworm can be easily treated but re-infection is common, where footwear and hygiene are poor, especially from a lack of toilets). 

I met some community health workers, in India, also in 1994. I felt very sorry for them; they had low status, were young, and appeared completely overwhelmed. There are many limitations to the effectiveness of primary health care, but its promise to provide an affordable basic underpinning of health for the poor, especially in rural and remote settings, in ways that hospital-based care cannot, was alluring, for a while. My personal experience also made me sceptical about some of the claims in David Morley's book. While the village (in Nigeria) where he worked seemed a model of success, I was concerned that this depended too much on his nurse, a European with devotion and skills not easily reproduced and sustained in such settings, though, sometimes, religious faith can provide deep and enduring motivation.

But some primary care groups, such as Jamkhed, have an excellent reputation. With charismatic and dedicated leadership and support, fairly high quality health care can be provided by people without tertiary education. Such people acquire so many skills over time that they are, in effect, trained to a tertiary level, at least to the equivalent of a good nurse.

Leading to the "sacred moment" at Alma Ata: back to Mahler

Following the immense global trauma of World War II, so soon after the Depression, WHO and the other UN agencies were bathed in hope.

The struggle between social medicine and technical assistance

However, the inception of WHO (1948) was almost immediately followed by the withdrawal of the Soviets and several other socialist countries, from 1949 to 1957. Fee et al, in a useful recent paper (Nov 2016) in the American Journal of Public Health, trace some of the factors leading to this, especially the sacrifice made by the Soviets in World War II (at least 20 million dead; by comparison, the US lost about 400,000 troops), compounded by limited assistance to the Soviet Union and its allies, for example via the Marshall Plan. Of course, as the Cold War progressed, the US and its allies were both horrified and terrified by Communist beliefs, by communism's contempt for genuine democracy, by the ruthless purges of dissenters by Stalin and others, and by the fact that the USSR was also developing nuclear weapons.

Some time after the USSR re-joined WHO, Dmitry Venediktov, the Soviet delegate to WHO, and a few others, supported by Mahler, called for a shift in focus by WHO, to better reflect the idealistic post WWII framing of health enshrined in the WHO constitution (see in green).
"Health is a state of complete physical, mental and social wellbeing and not merely the absence of disease or infirmity".

"the enjoyment of the highest attainable standard of health is one of the fundamental rights of every human being without distinction of race, religion, political belief, economic or social condition".

"governments have a responsibility for the health of their peoples which can be fulfilled only by the provision of adequate health and social measures ".

Many others in WHO, including Brock Chisholm, the first Director-General (a Canadian), hoped that it would still stand by the social medicine principles embodied in its constitution. A key founding member, Andrija Stampar (born in modern Croatia, influenced in medical school in Vienna by Ludwig Teleky’s lectures on social medicine), argued that the WHO should concentrate on four principles:

A. social and economic security
B. education
C. nutrition
D. housing.

Taken seriously, a social medicine perspective required questioning the inequality of land ownership in rural areas and the striking inequities, poor housing, misery, and illness in urban areas. However, the United States, WHO's largest funder (at least then) was not interested in this approach and instead promoted the concept of “technical assistance” to the problems of health in what we now call the Global South.

“Technical assistance” conveyed the idea that developing countries were best helped through the transfer of knowledge of science and technology. As many critics have pointed out, this approach bypasses concern with the economic interests and social realities that led to and maintain underdevelopment.
In the 1950s, in its first full decade, WHO led a massive attempt to eradicate malaria, a clear case of technical assistance. This was inspired by the then recent discovery of DDT's almost miraculous insect-killing properties. It made an enormous difference in India, although most progress there occurred before World War I; that earlier progress (at least in the Punjab, which then extended from Delhi right to the Afghan border) was the topic of my Master's thesis.

Fee et al argue that also contributing to the Soviet boycott was a growing conviction on the part of the Soviets and their allies that there were "two dramatically opposing views of public health: that of capitalism and that of communism". This dichotomy persists today.

The election of Mahler: a brief shift towards a social medicine approach by WHO

In 1973, elected WHO director-general, Mahler was able to help channel and shape the political and moral currents of the 1970s. In 1976, early in his directorship, Mahler and WHO accepted a proposal (made by the Soviet Union in 1974) to hold a conference, which was to be called the International Conference on Primary Health Care. In 1977, still led by Mahler, WHO adopted a resolution (WHA30.43) called “Health for All by the Year 2000”.

The conference, which received substantial Soviet support, was held the next year in Alma-Ata (today Almaty), then the capital of the Kazakh Soviet Socialist Republic. It was envisaged by Dr. Dmitry Venediktov, then Soviet deputy minister of health, to be on the same scale as the World Population Conference, held in Bucharest in 1974 and attended by more than 1,400 people. According to Litsios, Mahler opposed this site, but various alternatives proved unfeasible.

This meeting included the statement that:

“an acceptable level of health for all the people of the world by the year 2000 can be attained.” 
The undermining of Primary Health Care

Soon after the "sacred moment" of Alma-Ata, efforts to weaken Health for All (and the broad view of primary health care) began, as neoliberalism increased in power. Sharon Friel and I published on this in 2006, in an open access article in PLoS Medicine. We wrote:

"All social movements and scientific disciplines are subject to powerful institutional and natural forces that shape their social, economic, political, and environmental milieu .. like the health promotion movement, WHO is also subject to larger forces. Since Alma Ata, the rhetoric, aspiration, influence, and—arguably—the achievement of WHO has diminished, coincident with a decline in many public goods."

Marcos Cueto, in a 2005 editorial in the Bulletin of WHO, also sheds light: "In its more radical version, the complete reform of public health structures and the promotion of major social changes were envisaged, with primary care as the new centre of health systems. In contrast, according to an instrumental interpretation, it was merely an entry point, a temporary relief or an extension of services to underserved areas. The latter interpretation could not avoid being perceived as second-class care, “poor” medicine for poor people."

Cuts to the WHO budget: the counter revolution, led by the US under President Reagan

Brown et al note: "Major donor nations, such as the UK and the US, froze contributions to the WHO budget. Then, the Reagan administration decided to pay only 20% of its assessed contributions to all United Nations agencies. In 1979, the Rockefeller Foundation sponsored a small conference that included representatives of the World Bank, the US Agency for International Development, the Rockefeller Foundation, and the Ford Foundation, which formulated an alternative “selective primary health care” agenda, differing sharply from the agenda and spirit of Alma-Ata. Mahler’s principal lieutenant in the battle for Alma-Ata was Ken Newell, a New Zealander, whose edited book Health by the People was influential.

Newell called these efforts a “counter-revolution.” The World Bank, the main international donor for health development, adopted a neoliberal policy of privatizing health services in the 1980s, further undercutting Alma Ata."

Also excellent and relevant to understanding the decline of Dr Mahler's vision is the book by David Sanders and David Werner, "The Politics of Primary Health Care and Child Survival". See this book review by Claudio Schuftan.

Mahler making enemies

Mahler alienated many people. Another anecdote from Litsios is that, when Director-General elect, Mahler supposedly advised a “very powerful president of a developing country,” who had asked what he could do to develop a health care system (which Mahler had told him he did not have), as follows: “I think the first step is to close the medical schools for two years. Then we can discuss what the medical schools were supposed to do, because they really constitute the main focus of resistance to change”.

Back to the future: world health dominance by corporate-derived foundations 

Technical assistance such as the attempt to eradicate malaria was revived in the 2000s, but not led by WHO. It is again likely to fail, this time partly because mosquitoes are developing resistance to the insecticides used to impregnate bednets, rather than to DDT, the breakthrough seen as miraculous in the 1950s.

Mahler and Family Planning

After retiring from WHO in 1988, Mahler was Director of the International Planned Parenthood Federation until 1995. Throughout his career, Mahler emphasised the role of women in promoting health. “Women are the raw material for development”, Worning remembers him saying. The economic and health benefits of spacing children have long been one of my own main pleas. I have written dozens of papers about this, but with little obvious result! (e.g. Reflections on human carrying capacity)

Mahler's final plea

Thirty years after the Alma-Ata Declaration, Mahler told the World Health Assembly in 2008: “To make real progress, we must, therefore, stop seeing the world through our medically tainted glasses. Discoveries on the multifactorial causation of disease have, for a long time, called attention to the association between health problems of great importance to man and social, economic, and other environmental factors.”

Looking forward

Although the Sustainable Development Goals may seem as hopeful and aspirational as Health for All, they have virtually no chance of succeeding. In fact, it is all too plausible that we are headed for a new dark age. The presidency of Donald Trump is but one manifestation of deepening inequality. Ugo Bardi, a member of the Club of Rome, has noted that as cheap energy declines, slavery is returning.  Famines are returning and "regional overload" is obvious in a growing number of locations.

A great awakening is needed. We need new leaders such as Jeremy Corbyn, Bernie Sanders (or maybe Elizabeth Warren) - but not neoliberal Oprah Winfrey. We need more vegetarianism and technological miracles. We need more courage in academia and in the development movement, both of which are either blind or self-censoring about high rates of population growth in poor countries. We will need a lot of luck, because leadership such as Mahler's is now very hard to find.

Sunday, January 14, 2018

“Regional overload” as an indicator of profound risk: a plea for the public health community to awaken

Comments welcome. First draft of an abstract for a chapter in a forthcoming book to which I have been asked to contribute called: "Medicines for the Anthropocene: Health on a Finite Planet"

Public health is the field of medicine charged with protecting human well-being, via the identification of health risks and the development and implementation of effective strategies to lower such dangers. Public health workers have a long history of promoting new paradigms and tackling vested interests, often with great difficulty and against immense opposition. Their campaigns often take decades to be successful, even at a regional scale.
Today, the greatest threat to global public health is barely visible within public health circles, although, like blind men palpating an elephant, some of its many manifestations are recognized. These identified protuberances have several names. Ecological public health, climate change and health, and even planetary health have arisen, in recent decades, as sub-disciplines of public health. The risk of nuclear war to health has seen the awarding of two Nobel Peace prizes to health-related lobby groups. Tropical medicine has evolved to international health and global health.
Yet none of these emerging sub-disciplines or fields, as yet, fully integrates the health risks arising from what McMichael called “planetary overload”. Central to this (Malthusian) conceptualization is the risk of large-scale conflict, famine and infectious diseases, acting, increasingly, in combination, in ways that reduce human well-being, as a consequence of linked “eco-social” phenomena.
Crucially, ecological and environmental contributors to these crises must be recognized not just as living elements of the biosphere, such as crops, coral reefs and forests, but also as inert resources, particularly fossil fuels and the rare elements needed to drive the accelerating energy transition. Although the health community is belatedly awakening to the public health harm that climate change constitutes, there is scarcely any recognition, within public health, that the growing scarcity of these inert resources risks triggering social consequences with profound adverse health effects, many of which are already apparent, in examples of what Butler has termed “regional overload”.
There is a growing number of recent examples of this, from Syria to Yemen, South Sudan, north-east Nigeria and Myanmar. However, very little analysis exists, within public health, of the shared causal pathways which underpin these localized public health catastrophes.
There are many reasons for this scarcity of analysis. One is the bias, within most public health fields, toward problems in high-income settings. Publications on (for example) public health in South Sudan are rare, and data-rich papers are almost non-existent, due to the difficulties of safely obtaining such data. This creates a reinforcing cycle, a positive feedback loop, in which there is little market or appetite for analysis, and in which such papers, if submitted, are unlikely to be accepted due to their likely deficiencies.
Another important impediment is of disciplinary boundaries and suspicion of the “academic other”. These factors contribute to the failure of academic institutions and health funders, who are overwhelmingly embedded in a milieu of neoliberal signals and incentives, to recognize the need for a new paradigm to emerge, if humanity is to thrive in this and coming centuries.

Thursday, December 28, 2017

2017: in the rear vision mirror, and the world in 2046

In late 2017 I listened to an interview of former U.S. President Barack Obama, by Britain's Prince Harry. Most of it was quite reasonable, but when Obama claimed that we live in the best time ever I could not agree.

Famines, Myanmar and Chinese surveillance

As I write, the world is experiencing five famines, one of which (Yemen) is shaping as the worst in decades. (The others are in South Sudan, north-east Nigeria, Somalia, and two regions of the Democratic Republic of the Congo.) Inequality continues to worsen, as do global warming forecasts and climate change reality.

2017 closed with UNICEF reporting that children have increasingly been used as weapons of war. UNICEF also said that parties to conflicts were blatantly disregarding international humanitarian law, that children were routinely coming under attack, and that rape, forced marriage, abduction and enslavement had become standard tactics in conflicts across Iraq, Syria and Yemen, as well as in Nigeria, South Sudan and Myanmar.

This year also saw the ethnic cleansing of hundreds of thousands of Rohingya from Myanmar, more flouting of the refugee convention by Australia and countless atrocities in many countries in the Middle East and Africa. We have become so conditioned to horror that phenomena such as children or the disabled being forced to act as suicide bombers or entire coral reefs sickening fail to shock us.

Surveillance in China has intensified (a reported 480,000 cameras in Beijing, in 2015) with face recognition technology, other forms of artificial intelligence, and a points system to detect resistance, making existence in Xinjiang harrowing. The self-immolations of Tibetans, which I have long argued are a waste of life (given the indifference and false stereotypes of Tibetans most Chinese hold), continue. Chinese state media has allegedly compared the Dalai Lama to Hitler (n.b. the link is in Chinese; I cannot tell if it is valid). According to a recently released report sourced to the British ambassador to China in 1989, the death toll in the Tiananmen square massacre may have exceeded 10,000 people (see red text). To this day, the mothers of the murdered students are blocked from visiting graves or holding memorials.

Planetary health and human rights

Professionally, 2017 has been challenging, as I continue to seek academic employment, after resigning from a part-time position as a professor at the University of Canberra in July 2016. In late 2017 I resigned from a pro bono writing role with the United Nations Environment Programme. However I continue with many other writing and academic commitments, also pro bono. I have been invited to a workshop for a book on limits to growth and health, in Canada, in April, 2018. To qualify for permission to enter Canada I need a police check, a legacy of my arrest for civil disobedience in 2014, over climate change. I have so far been waiting for a decision for over a month. In December 2017, probably the largest ever civil disobedience protest, in the world, by health professionals, against coal mining occurred, with five arrests. I published an article on this in Croakey.

There have been successes: three invited talks and four publications, two in edited books, three chapters in press (all related to limits to growth and health), and one paper under review at the BMJ, related to novel entities and the so-called nocebo effect. I have posted 29 essays on this blog, which has had over 15,000 views this year. I may be able to enrol for a (second) PhD in 2018, at the University of Sydney, related to "planetary health" and, perhaps, human rights.

As we head to 2018, my main frustration is this. Since 1991, including in most of the previous 50 issues of BODHI Times, I have warned of climate change, neoliberalism, root causes of terrorism and poverty traps, especially in Africa and Asia. A great deal that I (and a few others, including Maurice King and the late Tony McMichael) warned of is now occurring. For example, more than 3,000 would-be migrants drowned in 2017, trying to cross the Mediterranean, many of whom were from the Sahel. The arrival of more than 1.5 million desperate people in Europe, in 2017, has contributed to a right-wing backlash, with a neo-Nazi party now sharing power in Austria, and fears of a return to extremism in Germany, including in its military.

The Gaza Strip has been described by Gideon Levy, a dissident Israeli journalist, as the biggest cage in the world, part of a "never-ending mass experiment on human beings". Levy also describes how the December killing, by Israeli military personnel, of a legless man in a wheelchair "passed almost without mention in Israel. He was one of three demonstrators killed Friday, just another humdrum day."

The world in 2046

BODHI was started by Susan (see photos of her memorial, below) and me in 1989, almost 28 years ago. If we are lucky, there will still be a semblance of civilisation in 2046, in another 28 years. It is likely to be a world of increased inequality, more famines, more surveillance and less freedom. Artificial intelligence and robots will not lead to utopia, but mass social unrest may be reduced by a universal income, at least in more enlightened locations. Weapons systems may be controlled by algorithms; we will be very lucky to avoid nuclear war.

In 1989 this trajectory was broadly foreseeable (see my paper in the 1991 Med J Australia, as an example), but there then seemed time to change it. However, the voices of those trying to head this off have been very faint, and are growing fainter. Neither the dominant media nor dominant academia seems to care; perhaps it is too overwhelming.

Despite all this, it is better to light a candle than curse the darkness. BODHI's partners in India and Bangladesh are lighting such candles. Thank you for reading this far and thank you for your support for BODHI's work.

Saturday, December 23, 2017

Reflections on human carrying capacity

Adapted from a chapter called "Population Trends and the Environment" published in 2012 in the
Praeger Handbook of Environmental Health. (Editor R. H. Friis), Westport, CT, USA. 


It is obvious that humans depend on the environment that exists here on Earth, in the habitable "Goldilocks" zone around our sun. Indeed, the region of the galaxy thought suitable to form terrestrial planets and old enough to allow biological evolution of complex multicellular life is comparatively limited.1 This nurturing milieu of Earth, atmosphere, and solar energy provides space, breathable air, liquid water, food, energy, and other resources necessary, useful, and desirable for humanity. Human life also depends on nonhuman life-forms. Not only did humans evolve from earlier forms of life, but all of the food we eat was once alive. Many other species contribute to 'ecosystem services' that benefit humans.2 Less obviously, many microscopic forms of life coexist with, on, and within humans (e.g., in the human gut); though some of these are harmful, many are benign, and the majority may be symbiotic and thus beneficial.

As humans have increased in number, so too has their impact on the environment.3 This is true even if additional people live in poverty. Poor people still require food, clothing, housing, and fuel. However the environmental impact of wealthy people is much greater, whether one considers diet (higher on the food chain, greater wastage of calories), housing (quality and size) or energy usage (electricity, travel).

Technological improvements (e.g. electricity powered by renewable energy, the driving of hybrid electric cars) only partially alleviate this impact. The gains from increased efficiency are eroded by the Jevons paradox: the tendency to use more of a product as it becomes more efficient (and hence often less costly).

Global human population is now well over seven billion and continues to climb by a million people every few days, as it has for several decades. Between 2,000 years ago and AD 1650, global population increased by only about 200 million, a period during which two serious setbacks occurred: the Justinian plague in the sixth century and the Black Death in the fourteenth century. In 1900 global population was only 1.6 billion. The rate of global population growth peaked in the late 1960s, but in absolute terms it is still very high, increasing by at least 70 million a year, and probably over 80 million. Almost all of this growth is among poor people in low-income countries.
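As a rough back-of-envelope check of these figures, the implied growth rate and doubling time can be computed from the population size and annual increase. The sketch below uses assumed round numbers (7.6 billion people, 80 million added per year), not census data:

```python
# Back-of-envelope population arithmetic (assumed round numbers, not census data).
population = 7.6e9        # assumed world population, people
annual_increase = 80e6    # assumed net annual increase, people per year

growth_rate = annual_increase / population   # fraction per year
doubling_time = 0.693 / growth_rate          # "rule of 70": ln(2)/r for exponential growth

print(f"growth rate ~ {growth_rate * 100:.2f}% per year")   # ~ 1.05%
print(f"doubling time ~ {doubling_time:.0f} years")         # ~ 66 years
```

Even at this historically modest rate of just over one per cent a year, unchecked exponential growth would double the population in roughly two generations, which is why the absolute annual increase remains so consequential.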

In comparison to the size of a person, the surface of the earth is enormous. The fact that surface area is fixed suggests that the expansion in human numbers will one day be forced to slow. However, well before space limitations force any such tapering, many resources upon which humans depend will become scarce, depleted or degraded. This will necessitate radically new ways of life or a slower rate of population increase. It may even cause a steep decline in global population size.

A conscious human understanding of the relationship between population and environment is likely to be many thousands of years old and to far predate written records. This is difficult to prove, but considerable evidence can be detected. Writing consistent with this understanding survives from ancient Babylon, Rome, Greece, and China.4 Ecclesiastes (5:11) records that "when goods increase they are increased that eat them."5 In China, Hung Liang-Chi (1744-1809), independently of Thomas Malthus (1766-1834), wrote about population outstripping food supply, survival of the fittest, and reliance on the natural "checks" of flood, drought, plague, pestilence, and warfare to limit population growth.6

An unconscious understanding of this relationship can be deduced by considering the phenomena of migration and ancient forms of fertility restriction. Hundreds of thousands of years ago human ancestors and species related to humans, such as Neanderthals, migrated from Africa. Today, large-scale human migrations continue, including from Africa. The reasons for ancient and modern migration are easily understandable: people were and are looking for opportunity and reward, and they thought, hoped, and believed that these might exist over the horizon. Often they were correct.

Written records survive from comparatively recent times of abundant food species available to humans, especially in marine and coastal areas, lightly populated by modern standards. For example, stocks of cod were so rich and dense in some parts of the North Atlantic that they seemed as though they could be walked on.7 The paleontologist Tim Flannery describes, but also laments, how human populations often mismanaged and even squandered abundant resources upon arriving in an uninhabited ecosystem. A good example is the moa, a large flightless bird that existed by the thousands when Polynesians first migrated to Aotearoa (New Zealand) around 1200.8 Within a hundred years humans were well established in that country, but the moa was extinct. Indeed, throughout the Pacific, many species of bird perished fairly soon after human colonization.9 Flannery characterizes behavior such as the overhunting of the moa as "future eating," because such profligacy reduced the resources available to future human generations.8

In some cases the local elimination of species may have been intended, such as the removal of large carnivores that threatened humans and their domesticated animals. In other cases, such as the extinction of the "megafauna," elimination may have been unintended but still inevitable, given the growing sophistication of hunting and the fragility and scale of the environmental resources required to support such large animals.

Nonetheless, future historians are likely to lament the wastefulness of our own time, especially in the way that we are consuming fossil fuels and several other nonrenewable resources.10 We are also critically reducing biodiversity (including highly prized species used for food, such as fish)2 and polluting, to the cost of our descendants, the atmosphere and oceans, with greenhouse gases, particulate matter, and other toxic substances such as mercury, plastics, and persistent organic pollutants.11

The Debate: Are Humans the Exceptional Animal?

Readers with backgrounds in ecology, geography, or anthropology may find the introduction above entirely unexceptionable. Yet debate about the significance of the human impact on the environment has in recent decades become extremely contentious, especially in the social sciences. For example, in 1982 two demographers summarized the treatment of population in the then recent demography literature by stating: "From a high point some 10-15 years ago, intellectual concern about population has steadily waned to a position where it falls now somewhere between ocean mining and acid rain."12

For much of the following two decades a widespread view prevailed, not only in demography but also in economics, that the issues raised by those concerned about limits to population and economic growth were invalid. A representative view was expressed in The Economist:

It is argued that predictions of ecological doom, including recent ones, have such a terrible track record that people should take them with pinches of salt instead of lapping them up with relish. For reasons of their own, pressure groups, journalists and fame-seekers will no doubt continue to peddle ecological catastrophes at an undiminishing speed. These people, oddly, appear to think that having been invariably wrong in the past makes them more likely to be right in the future. The rest of us might do better to recall, when warned of the next doomsday, whatever became of the last one.13

This lessening of anxiety permeated public opinion. A global backlash flared against family planning, fueled in part by the clumsy, corrupt, and coercive practices of many "family planners."14 Yet in the scientific literature, apprehension about the danger of too many people abated only slightly. Some writers continued to warn that a high rate of population growth, especially in poor and badly governed nations, was harmful to economic development,15 although this debate was far less evident after 1980 than in its heyday in the 1950s.16,17

Perhaps the most significant evidence of this ongoing fear was that in 1992 more than 1,700 leading scientists, including over 100 Nobel Laureates, signed the World Scientists' Warning to Humanity, which argued that human population growth threatened the viability of the global environmental support mechanisms for humanity and called for urgent action to slow human population growth.18 In contrast to the urgency of this warning and the stature of the signatories, the paucity of press attention paid to it was shameful, revealing, and unsettling.

In 1993 fifty-eight science academies issued a similar statement, nominating human population growth as a critical issue if civilization is to flourish. The African Academy of Sciences dissented: "Whether or not the Earth is finite will depend on the extent to which science and technology is able to transform the resources available for humanity. There is only one earth-yes, but the potential for transforming it is not necessarily finite" (emphasis added).19

Yet in 1994 lethal genocidal conflict erupted in Rwanda, then (as now) the most densely populated nation in Africa. Within a few months about 800,000 people were killed, representing almost 10 percent of the Rwandan population.20 Though many analysts have focused on the rivalry between the two main ethnic groups (Hutus and Tutsis) as the "cause" of this conflict, that is a simplistic explanation. A perfect storm ignited this genocide. Land scarcity in Rwanda21 combined with a scarcity of urban employment opportunities and declining prices for its major export, coffee.22 Unlike many developing countries, Rwanda received very few remittances (i.e., foreign exchange transferred home by Rwandans working internationally). The high fertility rates in this largely Catholic nation led to a "bulge of youth,"23 many of whom were unemployed, unmarried, and squashed together in Rwanda's main city, Kigali.

These young men became the main implementers of the genocide. It is widely accepted that male youth are the most violent of any human age group, but of course youth bulges elsewhere have not led to conflict on any scale similar to that of Rwanda at that time. Without absolving the "genociders" of their collective responsibility, the Rwandan butchery can also be viewed as a systemic phenomenon, an example of an "irruptive" population trajectory. This phenomenon, a comparatively steep rise in population followed by a sharp decline, is well known in ecology. It was first described for some herbivores in ecosystems that lack predators such as wolves.24 It is also common in some invertebrate populations, especially of introduced species.25
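The shape of an irruptive trajectory can be illustrated with a toy simulation. Everything below is a hypothetical sketch: a population living off a slowly renewing resource grows steeply, overshoots, and then crashes toward a far lower level. The parameters are illustrative and calibrated to no real ecosystem:

```python
# A minimal, hypothetical model of an "irruptive" population trajectory:
# steep rise, overshoot of the renewable resource base, then sharp decline.

def simulate(steps=100):
    resource, population = 1000.0, 10.0   # initial resource stock and population
    trajectory = []
    for _ in range(steps):
        consumption = min(resource, population)   # one unit per head, if available
        food_per_head = consumption / population
        resource = resource - consumption + 20.0  # fixed renewal per step
        # grow while food per head is ample (> 0.8), shrink when it is scarce
        population = max(1.0, population * (1.0 + 0.25 * (food_per_head - 0.8)))
        trajectory.append(population)
    return trajectory

traj = simulate()
# The peak greatly exceeds both the starting and the final population.
print(max(traj), traj[-1])
```

While the initial stock lasts, the population compounds steadily; once the stock is exhausted it must live off the much smaller renewal flow, and numbers fall by a large fraction of the peak, the pattern the ecological literature calls an irruption.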

However, the application of the idea of irruption to human societies and populations is far less accepted, for several reasons. One is that most humans do not like to consider themselves animals. We prefer to think of ourselves as a special case, liberated from the biological constraints that burdened our ancestors. Our collective capacity to reason, to invent, and to migrate helps to disguise human dependence upon environmental (including ecological) resources. But this dependency remains fundamental and unchangeable. Indeed, the more we collectively assert our independence from nature, the more vulnerable we become.

Many authors point to a deep human capacity for denial, perhaps one that is "hard-wired."26 Such denial is also evident in the human capacity to ignore the suffering of people who are far away in distance or time or unlike us in other ways. However, humans also have a disconcerting capacity to become conditioned to ignore the suffering of people close to them;27 this is unsettling in the context of the future.

Many other examples exist of human conflict partially rooted in resource scarcity; indeed, some might argue that it is hard to find human conflicts that do not have an environmental basis, whether the Israeli-Palestinian struggle,28 Darfur (Sudan),29 the 2003 invasion of Iraq, or the German invasion of Russia in World War II, when the Nazis desperately strove to secure the oilfields of the Caucasus. On the other hand, a vigorous literature denies any substantial relationship between resources and conflict, focusing instead on political, religious, and ethnic elements claimed to be unrelated to control over resources. A prominent recent example of this disconnect is the 2003 invasion of Iraq, when the U.S. government and some of its allies strenuously and repeatedly denied any oil-related motivation for the invasion.

The History of Contraception

In recent history, and perhaps even now, there has been a widespread perception that contraception is a relatively recent addition to human knowledge.30 In fact, there is substantial evidence of ancient forms of fertility restriction, well before the era of oral contraception, and even before widespread knowledge of the "fatal secrets" (including withdrawal and condoms) that underpinned the first marked decline in fertility in modern Europe, observed in France in the eighteenth century.31 Coitus interruptus was not invented by the French. There are numerous references to this practice in Jewish, Christian, and Islamic texts, though the method appears to have lost favor in the early Christian period. Mohammed approved the use of al-azl, noting that the man's wife should also give her permission.32 Its practice was widespread in Europe, though officially frowned upon. To this day it retains its Latin name, at least in English. Herbs have also been used for millennia, both as contraceptives and to induce abortion, though with varying success.32,33

Many customs have been followed that do not rely on pills, potions, or douches. In seventeenth-century England (where excellent records survive), the average age of marriage reached thirty, and it was higher among poorer than wealthier families.34 Western societies had no monopoly on spacing out and sparing children. The Koran recognizes that prolonged lactation delays fertility.32 More than 1,000 years after that book was written, Charles Darwin observed that prolonged breastfeeding likely contributed to small families.32 But where birth could not be prevented, infanticide and neglect were common.32

The anthropologist Virginia Abernethy describes a variety of customs and practices that helped to regulate human populations, including in isolated valleys of New Guinea, keeping populations within the limits of "human carrying capacity." In some valleys with a comparative scarcity of people, widows could easily remarry; in others, with a comparative oversupply of people, various taboos limited opportunities for sexual intercourse.33 Abernethy also describes how the Roman Empire, under Constantine (the first Christian emperor), turned away from pronatalist legislation at a time when unemployment was rising, agricultural production was falling, and there was a growing sense that "the world was full."35 In many parts of Africa, prolonged breastfeeding (up to four years among the Bushmen), postpartum abstinence, and pathological sterility modified fertility.36 In parts of the Himalayas and Tibet, polyandry has long been practiced, another custom that reduces fertility.

Perhaps the most extraordinary method of reducing human fertility is sub-incision, a traditional ritual practice long observed among adult Indigenous men in the most arid and harsh environments of Australia. Early European commentators developed rather fanciful psychoanalytic interpretations of this custom,37 and there is no clear evidence that Indigenous men understood its purpose beyond bonding and initiation. However, it is almost certain that sub-incision had a powerful contraceptive effect, by creating an artificial hypospadias (a condition in which the urethra opens along the shaft of the penis).38 The coincidence of this custom with the driest part of Australia provides strong circumstantial evidence that sub-incision coevolved in the human population as a mechanism to limit human environmental impacts and to maintain prolonged and tolerable human existence.39

Malthus, the Green Revolution, and the Cornucopians

In the West the debate concerning Limits to Growth40 and the tension between population and food supply is often summarized as Malthusianism, an eponymous term derived from Thomas Robert Malthus. However, Malthus was but one of many intellectuals who recognized the general principle. Western forebears of Malthus included Benjamin Franklin, Adam Smith, and the pioneering political scientist Giovanni Botero (1544-1617). Similar principles were recognized by non-Western thinkers, including Japanese, Chinese, and Arab scholars.6

Simply expressed, the principle that often bears Malthus's name states that population growth is constrained by the availability of food and that, in the absence of deliberate checks on population growth (such as delayed marriage or the use of contraception), other "checks" will occur, such as famine, epidemic, or war. Though Malthus appears to have referred only to food, his thinking can be applied to resources in general, such as the land, water, energy, and other raw materials required to grow and distribute food. Caldwell notes that Jevons, in the mid-nineteenth century, did indeed extend the ideas of Malthus beyond food.41
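The arithmetic of the principle is stark even in a deliberately stylized example. The starting figures below are hypothetical; the point is only the contrast between geometric and arithmetic growth:

```python
# Hypothetical illustration of the principle often credited to Malthus:
# a population growing geometrically must eventually outrun a food supply
# growing only arithmetically, however comfortable the starting surplus.

population = 1.0   # millions of people
food = 2.0         # millions of rations per period (a 2x surplus at the start)

periods_until_shortfall = 0
while food >= population:
    population *= 2              # geometric: doubles each period
    food += 2.0                  # arithmetic: fixed increment each period
    periods_until_shortfall += 1

print(periods_until_shortfall)   # prints 4
```

Even beginning with twice as much food as is needed, the crossover arrives within a handful of doubling periods; changing the starting surplus only delays, and never prevents, the shortfall.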

Such thinking has long been controversial in both West and East. Many nations saw power and prestige flowing from large populations, irrespective of the health, wealth, and vitality of those populations. Mao Tse Tung, the first Communist ruler of China, held such views,42 at least before the terrible famine of 1959-1962, in which about 30 million people perished.43 Remnants of this "mercantile" thinking persist today. For example, Ugandan President Museveni has recently called for a larger Ugandan population, which he equates with success.44 Yet such high population growth in Uganda and some other countries of East Africa is likely to delay achievement of the Millennium Development Goals45,46 and may, if it continues long enough, lead to other examples of irruptive population trajectories.

Even in his day, Malthus was controversial. Malthusian thinking influenced British economic and population policy, including its response to the Irish famine of the hungry 1840s. Marx was very critical of Malthus, starting a long tradition of suspicion among the Left of any advocate or policy that stresses population rather than distribution or production. Malthusian thinking began to be challenged in Europe as early as the "optimistic 1860s" but persisted far longer in less-industrialized, low-income, and famine-prone countries such as India.47

However, support for Malthus periodically strengthened, including from John Maynard Keynes, the most influential economist of his period, from the 1930s until his death soon after he had cofounded the World Bank. Bowen, defending Malthus in 1930, wrote: "It did not occur to those who contributed to this torrent of abuse that a fire which could not be put out must be the very fire of truth itself."48 A chief justification used by Hitler for the expansion of the German Reich was his domestically popular call for more Lebensraum, or living space. Hitler also craved oil security. However, Hitler's policies had two fatal flaws. The first was that Lebensraum could only be acquired at the expense of the living space of other peoples, such as the Poles and the Russians. The second was that the resources available in Germany in the late 1930s were in fact adequate for a relatively high standard of living.

Although the expansion of the European empires that preceded the Third Reich was not justified in terms of Lebensraum or carrying capacity (a concept not yet applied to human populations),24 there should be little doubt that territorial expansion and conquest both provided a relief valve for millions of Europeans and enabled the appropriation of vast extra-European resources from the colonies. This contributed to a rapid increase in European living standards.

In the first decades after World War II, concern about the Malthusian check of famine rose. In 1959 the then comparatively new Food and Agriculture Organization of the UN (FAO) warned that population was outstripping food. Such concerns were widely held, not only by Paul Ehrlich (whose book title, The Population Bomb,49 was taken by permission from a pamphlet earlier published by the Dixie Cup magnate Hugh Moore14), but also by U.S. President Lyndon B. Johnson and many other writers.

Yet as the global rate of population increase crested at just over 2 percent in 1969, relief was in sight, in the form of the Green Revolution. Though much attacked by its critics, the Green Revolution led to a remarkable increase in the productivity of much cultivated land, for example through new strains such as dwarf wheat. It is true that these new crops were comparatively homogenous, fertilizer-hungry, and pesticide-reliant. There is also no doubt that some actors used the Green Revolution to increase inequality, for example by encouraging indebtedness and perhaps by overcharging for fertilizer. Further, the enormous and rapid increase in per capita food supply did not eliminate global undernutrition, though it did lay the foundation for a reduction in both proportional and absolute terms, which continued until about 2000. But the failure of global society to achieve a steeper and more substantial decline in hunger is not the fault of the technologies of the Green Revolution; it reflects the failure of numerous human actors to collectively remedy the many other factors that perpetuate inequality. A co-factor in the failure to better capitalize on the Green Revolution was the continued high rate of population increase, especially in the 1980s, the lost "decade of development." The lagging demographic transition not only reduced the global per person abundance of food but also hindered economic development in low-income countries,46 of which Rwanda is the most tragic example.

In 1970 the agricultural scientist Norman Borlaug was awarded the Nobel Peace Prize for his seminal role in fostering the Green Revolution. Borlaug (later a signatory of the World Scientists' Warning to Humanity) was well aware of the consequences of rapid population growth. In his acceptance speech, he warned:

"The Green Revolution has won a temporary success in man's war against hunger and deprivation; it has given man a breathing space. If fully implemented, the revolution can provide sufficient food for sustenance during the next three decades. But the frightening power of human reproduction must also be curbed; otherwise the successes of The Green Revolution will be ephemeral only" (emphasis added).50

It is sobering to realize that in late 1984, almost midway through the period of respite forecast by Borlaug, Ronald Reagan became the first in a series of U.S. presidents to specifically deny the importance of population.51 The Catholic U.S. president John F. Kennedy has been credited with advancing the global agenda to slow population growth in the early 1960s, particularly through U.S. support for key UN resolutions on population.52 Between Kennedy and Reagan, U.S. presidents Johnson, Nixon, and Carter were explicitly Malthusian in their outlook. In 1968 Johnson authorized the emergency shipment of grain to India, then facing famine, on condition that India increase its family planning program.53

Toward a Theory for Human Carrying Capacity

The science of ecology largely concerns the interaction of many species, a mixture of predators and prey, all sharing different degrees of competition and cooperation in a context of always-limited resources. All ecologists recognize that there are limits to the growth of any one species and also to the total biomass supported by any area and climate. Yet within ecology the theory of carrying capacity has fallen from favor, perhaps because more precision was expected of the concept than it could deliver.24 Ideas about human carrying capacity (the maximum population supportable by a region for an indefinite period) are even more contested, not least because the concept can be and has been used to justify coercion,24 invasion, and "ethnic cleansing." Nevertheless, problems with the precise definition of human carrying capacity do not justify abandoning the entire concept, for its denial would imply that there are no limits to population growth, which is untenable, despite legitimate debate concerning the proximity of those limits.

One way to conceptualize global human carrying capacity is as an emergent phenomenon arising from the interaction of different forms of wealth. Humans will remain perpetually in need of resources from the physical environment, such as food, water, and energy. But people are not ciphers, programmable to live in perfect harmony irrespective of their social milieu. Though some groups (such as the citizens of Hong Kong or Tokyo) are able to live largely peaceful, cooperative lives in intense proximity, these forms of group behavior cannot easily or quickly be transplanted to the entire globe.

Ample evidence for this includes the high fraction of wealth devoted to military purposes, largely insulated from spending cuts even during recessions. History teaches that war and conflict are part of the human condition. Natural capital (both nonrenewable resources such as fossil fuels and uranium, and renewable resources such as forests, water, and fisheries) and the degree of social cohesion are important determinants of the human carrying capacity of any region. However, the resources that generate human carrying capacity can be used not only to support a given number of humans but also to provide wealth for any given population size. For example, the carrying capacity of Rwanda at the time of the genocide in 1994 would have been higher had there been only one main ethnic group. But total homogeneity would not have indefinitely postponed the Rwandan calamity. The ethnically homogenous (and also largely Catholic) country of Ireland suffered an even greater proportional population crash between 1848 and 1852, following the cool, wet summer of 1848, when the potato blight Phytophthora infestans flourished. A more plausible mechanism by which genocide might have been deferred in Rwanda is an earlier introduction of education, health care, and family planning, supported by the Rwandan Catholic Church and promoted by good government. This in turn would likely have facilitated economic takeoff, generating a virtuous circle: the establishment of export industries, more tourism, and increased skills enabling significant remittances. Instead, the people of Rwanda were effectively trapped, with diminishing marginal productivity that heralded the barbarity of the genocide.

In addition to the foundations of natural and social capital, human carrying capacity can be expanded by ingenuity and by inherited wealth: the quality and quantity of built and financial resources. Another example concerns the Caribbean island of Hispaniola, of which about two-thirds is occupied by the Dominican Republic, with the balance forming the more densely populated Haiti. The average income in the larger eastern nation is about seven times that of Haiti. There are, of course, multiple interlocking reasons for this difference, but an important component is the rate of population growth.

With a sufficient expansion in resources (such as an infusion of capital), both nations could support more people at a higher living standard. The tiny, natural-capital-poor island of Hong Kong shows that ingenuity, cooperation, and abundant built and financial resources can sustain a densely settled population at a high living standard. However, Hong Kong cannot be extrapolated to the entire world. Many of its natural resources are supplied from beyond its boundaries, purchased with capital generated by the export of manufactured goods, ideas, and services. Similarly, the wealth and total population of Europe expanded in the nineteenth century due in part to the European import of natural capital from its numerous colonies; again, a strategy that cannot be applied globally.

The concept of human carrying capacity is necessarily imprecise; no simple number exists for how many people the earth could support. Any answer depends not only on the human, financial, natural, social, and built resources available to a society but also on that society's preferences regarding wealth, numbers, and population growth rates.

A clearer understanding of these issues will reduce the chance that nations fruitlessly pursue wealth via a high birth rate. It also reduces the chance of population crashes, not only on a local and regional scale, but perhaps even on a global or subglobal scale.


Doctors, like ecologists, recognize that endless growth is impossible. Indeed, trends in that direction, beyond an optimum, are harmful to a patient, whether as obesity or cancer. Some of the most eminent economists, including John Stuart Mill (1806-1873), have recognized that limits to growth will in the future impose a steady-state economy.54 Yet the prospect of limits to growth continues to be vigorously resisted. This denial, and that of the fundamental dependency of human well-being on limited environmental and ecological resources, is every day leading the world closer to an eco-social precipice. Climate change, costlier energy, and the consequences of food insecurity and inequality could well push us over this precipice.

If we continue on this pathway, we risk a massive irruptive population trajectory, manifesting via a "blowback" world,55 in which nuclear-armed terrorist groups and rogue states strike out, leading to confusion in the "fog of war." A more plausible future, however, is the gradual intensification of our existing "enclave world," in which "seas" of order increasingly strengthen the barriers (financial, legal, military, and psychological) against the "islands" of disorder. The size and influence of these islands continue to grow, whether manifest as pirates from Somalia, favelas in Rio de Janeiro, terrorists from Afghanistan, or train-derailers in the growing chaos and insurgency of northeast India.

To prevent these futures, we need to work toward a genuinely fairer world, in which we "muddle through" to a viable future. This is not a utopian vision of "Health for All," and the term "muddle through" should not be taken to mean a relaxed, laissez-faire pathway. Rather, it recognizes that, even with a massive campaign to accelerate the technological and sustainability transition, we will still need considerable agility and some luck. Irruptions are inevitable, but they need not become overwhelming. Complacency, however, would be a fatal error.


1. C. H. Lineweaver, Y. Fenner, and B. K. Gibson, The galactic habitable zone and the age distribution of complex life in the Milky Way, Science 303 (2004 ): 59-62.
2. Millennium Ecosystem Assessment, Living beyond our means, Natural assets and human well-being (Washington, DC: Island Press, 2005).
3. P. R. Ehrlich and J. P. Holdren, Impact of population growth, Science 171 (1971): 1212-1217.
4. J. E. Cohen, How many people can the earth support? (New York: WW Norton and Co., 1995).
5. P. Demeny, Demography and the limits to growth, Population and Development Review 14 (1988): 213-244.
6. L. Silberman, Hung Liang-Chi: A Chinese Malthus, Population Studies 13 (1960): 257-265.
7. M. Harris, Lament for an ocean: The collapse of the Atlantic cod fishery; a true crime story (Toronto: M&S, 1999).
8. T. F. Flannery, The future eaters (Melbourne: Reed Books, 1994).
9. J. Diamond, Easter Island revisited, Science 317 (2007): 1692-1694.
10. C. A. S. Hall, W. John, and J. Day, Revisiting the limits to growth after peak oil, American Scientist 97 (2009): 230-237.
11. J. Rockstrom, W. Steffen, K. Noone, et al., A safe operating space for humanity, Nature 461 (2009): 472-475.
12. G. McNicoll and M. Nag, Population growth: Current issues and strategies, Population and Development Review 8 (1982): 121-139.
13. Anonymous, Plenty of gloom, The Economist 345 (1997): 19-21.
14. J. Kasun, The war against population: The economics and ideology of population control (San Francisco: Ignatius Press, 1988).
15. M. King, Health is a sustainable state, The Lancet 336 (1990): 664-667.
16. A. C. Kelley, The population debate in historical perspective: Revisionism revised, in Population matters: Demographic change, economic growth, and poverty in the developing world, edited by N. Birdsall, A. C. Kelley, and S. W. Sinding (Oxford; New York: Oxford University Press, 2001), 24-54.
17. R. R. Nelson, A theory of the low-level equilibrium trap in underdeveloped economies, American Economic Review 46 (1956): 894-908.
18. Union of Concerned Scientists, World scientists' warning to humanity (Cambridge, MA: Union of Concerned Scientists, 1992).
19. African Academy of Sciences, The African Academy of Sciences on population, Population and Development Review 20 (1994): 238-239.
20. P. Uvin, Tragedy in Rwanda: The political ecology of conflict, Environment 38 (1996): 7-15.
21. C. Andre and J.-P. Platteau, Land relations under unbearable stress: Rwanda caught in the Malthusian trap, Journal of Economic Behavior and Organization 34 (1998): 1-47.
22. M. Chossudovsky, Economic genocide in Rwanda: The globalization of poverty (Penang: Third World Network, 1997), 111-122.
23. C. G. Mesquida and N. I. Wiener, Human collective aggression: A behavioral ecology perspective, Ethology and Sociobiology 17 (1996): 247-262.
24. N. F. Sayre, The genesis, history, and limits of carrying capacity, Annals of the Association of American Geographers 98 (2008): 120-134.
25. D. Simberloff and L. Gibbons, Now you see them, now you don't! Population crashes of established introduced species, Biological Invasions 6 (2004): 161-172.
26. R. Ornstein and P. Ehrlich, New world, new mind (London: Methuen, 1989).
27. S. Cohen, States of denial: Knowing about atrocities and suffering (Cambridge, UK: Polity, 2001).
28. D. E. Orenstein, Population growth and environmental impact: Ideology and academic discourse in Israel, Population and Environment 26 (2004): 40-60.
29. United Nations Environment Programme, Sudan: Post-conflict environmental assessment (Nairobi: UNEP, 2007), /UNEP_Sudan.pdf (accessed October 13, 2011).
30. M. Raymond, The birth of contraception, Nature 444 (2006): 685.
31. E. van de Walle and H. V. Muhsam, Fatal secrets and the French fertility transition, Population and Development Review 21 (1995): 261-279.
32. M. Potts and M. Campbell, History of contraception, in Gynecology and obstetrics, vol. 6, edited by J. J. Sciarra (Philadelphia: Lippincott Williams and Wilkins, 2002), 1-27.
33. J. M. Riddle, Contraception and abortion from the ancient world to the Renaissance (Cambridge, MA: Harvard University Press, 1992).
34. E. A. Wrigley and R. S. Schofield, The population history of England 1541-1871 (Cambridge, MA: Harvard University Press, 1981).
35. V. D. Abernethy, Population politics: The choices that shape our future (New York: Plenum Press, 1993).
36. J. Bongaarts, O. Frank, and R. Lesthaeghe, The proximate determinants of fertility in sub-Saharan Africa, Population and Development Review 10 (1984): 511-537.
37. P. Singer and D. E. Desole, The Australian subincision ceremony reconsidered: Vaginal envy or kangaroo bifid penis envy, American Anthropologist 69 (1967): 355-358.
38. D. A. M. Gebbie, Reproductive anthropology-Descent through woman (Chichester, UK; New York: John Wiley & Sons, 1981).
39. J. B. Birdsell, Some environmental and cultural factors influencing the structuring of Australian aboriginal populations, American Naturalist 87 (1953): 171-207.
40. D. Meadows, D. Meadows, J. Randers, et al., The limits to growth (New York: Universe Books, 1972).
41. J. C. Caldwell, The global fertility transition: The need for a unifying theory, Population and Development Review 23 (1997): 803-812.
42. W. D. Borrie, China's population struggle: Demographic decisions of the People's Republic 1949-1969 (review), Demography 11 (1974): 702-705.
43. J. Becker, Hungry ghosts: China's secret famine (New York: Henry Holt, 1996).
44. W. Wakabi, Population growth continues to drive up poverty in Uganda, The Lancet 367 (2006): 558.
45. A. C. Ezeh, B. U. Mberu, and J. O. Emina, Stall in fertility decline in Eastern African countries: Regional analysis of patterns, determinants and implications, Philosophical Transactions of the Royal Society B 364 (2009): 2991-3007.
46. M. Campbell, J. Cleland, A. Ezeh, et al., Return of the population growth factor, Science 315 (2007): 1501-1502.
47. J. C. Caldwell, Malthus and the less developed world: The pivotal role of India, Population and Development Review 24 (1998): 675-696.
48. E. Bowen, Malthus, a re-evaluation, The Scientific Monthly 30 (1930): 465-471.
49. P. R. Ehrlich, The population bomb (London: Ballantine, 1968).
50. D. Tribe, Feeding and greening the world (Wallingford, UK: CAB International in association with the Crawford Fund for International Agricultural Research, 1994).
51. J. L. Finkle and B. Crane, Ideology and politics at Mexico City: The United States at the 1984 international conference on population, Population and Development Review 11 (1985): 1-28.
52. J. L. Finkle and C. A. McIntosh, The new politics of population: Conflict and consensus in family planning, Population and Development Review 20 (1994): 3-34.
53. J. Califano, Governing America (New York: Simon & Schuster, 1981).
54. H. Daly, Beyond growth (Boston: Beacon Press, 1996).
55. C. Johnson, Blowback: The costs and consequences of American empire (New York: Metropolitan Books, 2000).