When to intervene?

Published online April 24, 2012.

In our interconnected global community, how does identity influence one’s actions?

“First they came for the Socialists, and I did not speak out– Because I was not a Socialist.
Then they came for the Trade Unionists, and I did not speak out– Because I was not a Trade Unionist.
Then they came for the Jews, and I did not speak out– Because I was not a Jew.
Then they came for me– and there was no one left to speak for me.”

This famous quotation comes from public lectures given by Protestant pastor Martin Niemöller, a critic of Adolf Hitler who was imprisoned in Nazi concentration camps for seven years. Like many others, he expressed lifelong regret at having failed to act sooner as the Nazis murdered millions. His faith differed from that of most of those who were persecuted, but the painful lesson he learned was that one’s identity should not dictate one’s actions, or the lack thereof. Unfortunately, this lesson remains relevant today.

Still, questions remain. When do you have a responsibility to help someone? When are other people’s problems also your problems? In the face of obvious wrongdoing or a natural disaster, is it always better to do something than nothing? The answers to these questions are not obvious, even if Niemöller’s words ring true. Injustices, atrocities and accidents occur daily, but as an increasingly interconnected global community, we have not figured out when and how we are supposed to act, either as individuals, organizations, or governments. Many argue that national boundaries should dictate who and what we are responsible for, but upon closer examination, this argument falls apart.

Nationality is one of the most common social categories we use to define our identity, and for good reason. Our nationality, our citizenship, plays a large role in determining where and how we live. We look toward nation-states to dictate the behavior of individuals and governments, and national boundaries are also used to assign rights, privileges, and obligations. Furthermore, nationalism is not just a facet of our identity, but is deeply embedded in the international system. The norms and rules of sovereignty have long prevented one country from wandering willy-nilly into the affairs of another (which is not to say that this happens infrequently).

For this reason, human rights advocates, non-governmental organizations, and international bodies like the International Criminal Court, often viewed as proxies for “western” governments, not to mention governments themselves, are frequently lambasted for meddling in the affairs of countries like the Democratic Republic of Congo, Rwanda, and Uganda. Governments of countries on the receiving end of intervention complain loudly about the imposition on their sovereignty. Citizens too are repulsed by the idea and actions of foreigners who behave as if they know and understand a place or problem better than the people who live there.

But it is not clear why national boundaries alone should dictate our rights and responsibilities. Physical boundaries are becoming increasingly porous, and arguably, irrelevant. What happens halfway around the world is not only visible, but also something in which individuals far and wide can have a stake. Following the tsunami in Indonesia or the earthquake in Haiti, individuals raised hundreds of millions of dollars, channeled not through governments but rather through non-governmental and international organizations.

It is clear that individuals can make a difference, but the question is when should they? It would be silly to suggest that we should only care about things that happen in countries where we hold citizenship. Why? At least in part because the selection of nationality as the key factor for determining whether or not to act is arbitrary. If we should only care about “people like us” or stay out of “other people’s” affairs, an argument that begins with one’s citizenship as the relevant identity may quickly reduce to a sub-national identity, or worse, race, religion, ethnicity, gender, or class.

It is surely not the case that we should only care about or attempt to redress injustices if the offended party shares our race, ethnicity, hometown or income level. An argument that lists nationality as the key determinant of whether or not we have a right or responsibility to act is no different and no better than one listing any of our other identities as the deciding factor. Each one of us has many different, and largely socially constructed, identities. For example, I am an American, born in the state of California, in a town called Palo Alto, to a Mexican father and an American mother. I was baptized and confirmed in a Lutheran church. I have light skin. I am a woman. Should any of these categories, any of these identities, limit who or what I care about? Under what conditions should any of these identities dictate how I act?

If identity (of any variety) should not be the determinant that dictates our rights and responsibilities to act, what should be? We do not have an answer to this question. What we do have is the creation of social categories around which it is easy to mobilize but also easy to persecute, the creation of “us” and “them”, “foreigners” and “locals”. Such a framing is neither productive nor sustainable.

Perhaps information, knowledge, or understanding should be a prerequisite for action. Much of the critique about “meddling” in other people’s affairs stems from the fact that the meddling is often poorly informed. The road to hell is paved with good intentions, as the adage goes. First do no harm, says another popular mantra. Unfortunately, the simplicity of these axioms is misleading. We often do not know whether our actions will, on balance, be more helpful or harmful – it is often impossible to measure one’s impact, even years after the fact.

Yet if we fail to act, we are in danger of becoming bystanders to massive atrocities. Many who looked on as the Rwandan genocide unfolded became exactly that – bystanders whose crimes were those of omission. So too were those who looked away as the Nazis summarily wiped out over six million people. More recently, we have faced crises in Libya, Bahrain, Syria, and beyond, as regimes have clobbered and battered their populations into submission. Rebel groups like the Lord’s Resistance Army (LRA) continue to terrorize with abandon. The fundamental questions remain. Should we act? When? How?

The King and Queen-makers

Published online February 28, 2011

Driving through the countryside or city streets in Uganda or Rwanda, one is greeted by the same sight over and again – children. Youngsters in colourful uniforms fill the sidewalks and paths every morning and afternoon as they trek to and from school.

Jogging in the early morning down Kigali streets I have more than once been embarrassingly out-run by little girls in dress shoes and backpacks, screeching gleefully as they dash past. Meanwhile, the smaller children toddle curiously around the home, and babies find themselves securely strapped to the backs of their busy moms. You don’t have to look up demographic figures to know that one word characterizes the population: young.

In a region long defined by civil war, violence and dictatorship, youth is the new and hopeful quality permeating society. The wars that wracked the region for the past several decades have drawn to a close, one by one – the Ugandan civil wars of the 1970s and 1980s, the 20-year terror of the Lord’s Resistance Army in Northern Uganda, the Rwandan genocide of 1994, and the Congo wars that followed. As the worst episodes of violence recede, how will newfound security affect the political, social, and economic opportunities and beliefs of the new generation? How will the youth relate to the decisions of leaders whose lived experiences are increasingly distant from their own?

The children and young adults of today will live profoundly different lives than those of their parents and grandparents. While conflict continues in eastern Congo, peace and cautious hope have come to most of the region. Nearly half of Rwanda’s population today was born after 1994. 52% of Rwandans and 61% of Ugandans are less than 20 years old. Nearly three quarters of all Ugandans have lived under President Yoweri Museveni for their entire lives.

Most Ugandans and Rwandans, therefore, know only stories of the terrible wars that once ravaged society. The scars, visible or not, are everywhere, but the memory is increasingly derived from history passed down by those who lived through it. As these children come of age, they face very different challenges than their parents before them. The vast majority will attend primary school, and will read and write in English. Many will graduate from secondary school, and an increasing number will obtain a university degree. Unlike their parents, most will not fear for their lives, but for their livelihoods.

Yet for now, those who govern the countries in which these children grow up – individuals who were intimately involved in the conflicts of the past several decades – continue to make calculations, judgments, and risk assessments based on the experiences they have survived, as leaders before them did. National security is at the top of the agenda for every government, but the price one is willing to pay for security is shaped by experience. For the older generation, there may be no price too high. For the younger generation, the choices may not be so clear-cut.

It is difficult to assess the extent of the divide between today’s youngsters and the generation that preceded them. Often votes are a good indication of political and policy preferences, but the post-conflict generation is only just coming of age. Surveys too can help, but ultimately we are left to some speculation.

Recent surveys in Rwanda show that both the young and old continue to place a high value on national security. Overall, 44% of Rwandans said that “strong defence forces” should be the top national priority, with a similar percentage across all age groups, according to the World Values Survey. In the U.S., by contrast, while 38% of all Americans surveyed believe strong defence forces is the most important national priority, only 20% of those under 30 list national defence as the top priority. The vastly different security challenges facing each country have surely shaped these preferences.

In Rwanda, an extraordinarily large percentage of people not only support strong defence forces as the top national priority but would also contribute to this goal – 95% of all Rwandans and 96% of 15-29 year-olds surveyed said they would be willing to fight for their country. In the U.S., only 41% of 15-29 year-olds were willing to do so. 91% of Rwandans also expressed a preference for greater respect for authority in the country. All this suggests that so far, there is little evidence of a generational difference in security preferences. Nevertheless, it is important to keep in mind that most of the peacetime generation is still too young to be included in any survey. We are likely still observing the preferences of an adult population for whom the remnants of conflict may still be too fresh, and continued violence in eastern Congo too close.

In Uganda, evidence is mixed regarding whether the old and young have different preferences when it comes to national priorities, but there appear to be greater differences than in Rwanda. There are obviously serious economic challenges facing Ugandans, which may trump security concerns for the ordinary citizen — 64% of 18-29 year-olds were unemployed in 2008, according to an Afrobarometer survey. For most Ugandans, “improving economic conditions for the poor” is the most important national priority. Only 17% of 18-29 year olds listed maintaining order in the nation as the highest priority. Interestingly, young people expressed greater fear of political intimidation or violence than the very old in Uganda – 36% of young people said they had “a lot” of fear of political violence. And worryingly, the majority of Ugandans believe political competition often or always leads to conflict.

Uganda and Rwanda are both societies in transition  — transition away from conflict, transition toward greater political participation, transition out of poverty. How today’s children will view the behaviour and policies of leaders whose life experiences are increasingly distant from their own is yet to be seen. It may be too soon to detect generational differences in any scientific way, but ready or not, the youth bulge is coming into its own. Young people already make up the lion’s share of the population in the region. In just a few years they will be the king and queen-makers, or breakers. Watch this space.

The UNSC: Performance and Perceptions

Published online February 16, 2012.

The actions (or inactions) of the United Nations elicit varied reactions. Supporters hold the institution in high esteem, and believe it has the power to promote peace, human rights, justice, and social progress, as the preamble of its charter suggests. Even the most ardent enthusiast will admit that the UN is not a perfect institution, but argue instead that it is as close to a functioning world government as we can hope for in this period of our history.

To others, the UN is a bumbling, impotent and ineffectual player in the realm of international relations. It is an expensive bureaucracy that both overreaches and underperforms. Critics cite inaction during crimes against humanity on the one hand, and intrusions on state sovereignty on the other. They argue that decisions are dictated not by the community of nations but by the community of a few superpowers.

On either side of this chasm, and everywhere in between, are people who do not understand how the organization works. What most people see, if they see anything at all of the UN, are its loud condemnations, the roaring silence of its passivity, or its troops scattered across the world’s “troubled” spots.

The future of the organization depends not only on its performance, but also on global public opinion. The recent failure of the United Nations Security Council (UNSC) to back the Arab League-sponsored resolution on Syria brings to light afresh the challenges the organization faces. The halting process of decision-making at a time when hundreds of Syrian civilians are being slaughtered by the Assad regime is reminiscent of past faltering by the Council. Inability to gather consensus on resolution wording, much less action, has hampered the UNSC on many occasions.

The UN’s conduct during the Rwandan genocide is only the most vivid example of a lumbering bureaucracy, with at least as many interests as members, stumbling at the feet of those who clamor for its attention, resources, and even salvation. An excruciating recollection of the UN’s behavior in 1994 calls our attention to the power of a few words. What is said and not said, done or not done, leaves a mark on history, on leaders, and on ordinary citizens that cannot be removed.

It is not at all surprising, therefore, that according to the World Values Survey, only 8% of Rwandan respondents thought that policies regarding international peacekeeping should be handled by the United Nations. The remaining 92% thought peacekeeping should be handled by national governments or regional organizations. This response differed drastically from those in almost all other countries surveyed. On average, nearly 50% of people in nearly 50 countries around the world thought the UN, and not regional organizations or national governments, is best placed to handle international peacekeeping. Rwanda’s experience with peacekeeping operations has clearly damaged trust in the UN in this arena.

On the other hand, a greater percentage of respondents in Rwanda than in any other country think the UN is best placed to handle refugees. Only 10% of Rwandans thought national governments should handle policies related to refugees, while the vast majority, 73%, thought the UN should handle refugees. Meanwhile, 32% of respondents from all other countries thought national governments should handle refugees while only 48% thought the UN was best. A more thorough investigation could uncover the extent to which first-hand experience with UN intervention, in all of its various forms, shapes public opinion regarding what the organization can and should do.

In the meantime, the most recent failure of the UNSC to take a firm position toward a regime that is clearly brutalizing its population raises questions about the limits of an international institution whose explicit and primary goal is world peace. It is precisely the collapse of these talks that chips away at people’s faith that the UN, or any organization for that matter, can promote peace beyond mere lip service.

Students in a class I am helping teach this term learned this lesson the “hard” way in a UNSC simulation last weekend. For two days the fifteen delegations, including at least a few students destined to be diplomats and policymakers themselves one day, met to draft and pass a resolution on the international community’s response to Iran’s nuclear program. They took turns making statements to the council, drafting the resolution, and debating each word and paragraph. It soon became apparent that the delegations were not all equal. The permanent five (P5) had the unique power to make or break a deal. After nearly twenty hours of debate, the council came to the final vote.

In a bizarrely parallel universe to the actual meeting and vote of the UNSC taking place the same day, the students’ resolution was vetoed by China, a member of the P5. A collective “boo!!” filled the room. But it was over. After days of work, side deals, compromises, and urgent pleas, the resolution had failed. Meanwhile, in New York, the real Chinese and Russian delegations killed the Syrian resolution. The palpable disappointment of the students was only a whisper of the hundreds of disappointments, much larger and all too real, of both UN action and inaction.

Those who have been burned before, in ways small or large, have no choice but to alter their behavior domestically and make do with the politics in New York. The UN has proven time and again that it is not an organization that leaders or publics can rely on when times get tough. Whether or not a resolution can be reached depends neither solely nor even primarily on the gravity of the situation at hand, on the peril at which lives are placed, or on the number of lives in danger. Instead, alliances, precedents, and power creep into the corners of debates between great and small countries, and the diplomats that represent them.

The ongoing perils of childbirth

Published online February 1, 2012.

A shortfall in the supply of health services is limiting further improvements in maternal health

Fertility rates in Rwanda have been falling steadily over the past several years, but this year close to 400,000 Rwandan women will become pregnant and give birth. Next door in Uganda, nearly four times as many women will become pregnant, approximately 1.5 million. If recent trends hold, nearly 10,000 of these women will lose their lives during or shortly after their pregnancy. Many of them will suffer from bleeding and infections that can be treated or prevented.

Surveys show that pregnant women in both Rwanda and Uganda seek antenatal care at very high rates. Nearly 98% of women in Rwanda and 95% in Uganda have at least one antenatal visit during their pregnancy. These women want information about their pregnancy, and seek out health services that they believe will help them have healthy babies. But often the health system fails to provide these women with the information they need to take care of themselves, and far too many mothers lose their lives because they do not receive emergency care in time. Rwanda has been showing steady progress in improving maternal health, but Uganda has fared poorly.

Both Uganda and Rwanda continue to have high levels of maternal mortality, defined as the death of a woman while pregnant, or within 42 days after the termination of pregnancy (excluding accidents). Between 1985 and 1995 in Uganda, maternal mortality was estimated at 527 deaths per 100,000 live births. The following decade, from 1996 to 2006, maternal mortality was estimated at 435 deaths per 100,000. Although these figures suggest a slight decrease over the past twenty years, the margins of error around these estimates are such that we cannot say with any confidence that maternal mortality rates changed at all between 1985 and today. Thus, it appears pregnant women in Uganda today are as likely to die in childbirth as they were 25 years ago, when the National Resistance Movement came to power.
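For the statistically minded, here is a rough sketch of why the two estimates cannot be distinguished. It treats death counts as Poisson-distributed, a textbook approximation; the death and birth counts are hypothetical, chosen only to match the published rates (real survey margins are wider still, owing to clustering and recall error):

```python
import math

def mmr_ci(deaths, live_births, z=1.96):
    """Approximate 95% confidence interval for a maternal mortality
    ratio per 100,000 live births, treating deaths as Poisson."""
    rate = deaths / live_births * 100_000
    se = math.sqrt(deaths) / live_births * 100_000
    return rate, rate - z * se, rate + z * se

# Hypothetical counts consistent with 527 and 435 per 100,000:
for label, deaths, births in [("1985-1995", 264, 50_000),
                              ("1996-2006", 218, 50_000)]:
    rate, lo, hi = mmr_ci(deaths, births)
    print(f"{label}: {rate:.0f} per 100,000 (95% CI {lo:.0f}-{hi:.0f})")
```

The two intervals, roughly 464-592 and 378-494, overlap substantially, which is precisely why no confident claim of decline can be made.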

Meanwhile, maternal mortality in Rwanda has fallen significantly, although rates in Rwanda have for some time been higher than those in Uganda. Between 1995 and 1999, maternal mortality in Rwanda was estimated at 1071 deaths per 100,000 live births, one of the highest rates of maternal death in the world. Between 2000 and 2004, however, it had dropped to 750. The most recent estimates should be available in the next year or so, and are likely to show even further decline.

Rwanda may have made greater strides than Uganda in reducing maternal mortality in the past decade or so, but both countries face significant challenges in improving maternal health. There is a long way to go. The good news is that unlike many types of preventive health behaviors, such as getting immunizations or sleeping under a bednet, seeking help during pregnancy has become very common, even natural. In other words, the demand for health care during pregnancy appears higher than for many other health issues. Unfortunately, while demand is high, supply of care during pregnancy is weak.

Although nearly all pregnant women seek antenatal services at least once during their pregnancy, not all clinics and health facilities are equipped and ready to meet their needs. In fact, most health facilities lack the basics when it comes to antenatal care. The Service Provision Assessment Survey 2007 found that only 31% of health facilities in Rwanda had all the items required for infection control, including running water, soap, latex gloves, and disinfectant, and only 28% had all the essential supplies for basic antenatal care, including iron and folic acid tablets, tetanus vaccines, and equipment to measure blood pressure. A mere 11% had all the medicines required to treat pregnancy complications, including antibiotics, antimalarial drugs, and medication to treat common sexually transmitted infections.

To make matters worse, very few women were given enough information to take good care of themselves at home during their pregnancy. Only 8% of women in Rwanda were told about signs of pregnancy complications, while 35% of women in Uganda were so informed. It is perhaps not surprising that only 35% of Rwandan women and 47% of Ugandan women attend the recommended four antenatal visits. When women arrive at clinics that often lack power, water, and the equipment and information needed to help them with their pregnancy, there may be little incentive to keep going back.

Of course, the news is not all bad. On the contrary, the improvements that have been made in maternal health, particularly in Rwanda, are extraordinarily impressive. In just five years, between 2005 and 2010, the percentage of mothers whose delivery was assisted by a trained and skilled provider increased from 39% to 69%. The percentage of mothers who delivered in a health facility jumped just as impressively, from 28% to 69%. The increase in births under the watch of a skilled provider has likely played a large role in the reduction of maternal mortality. An estimated 15% of all pregnant women will encounter life-threatening complications, and trained nurses, midwives, and physicians can help make sure these complications do not become fatal.

The fact that pregnant women appear to seek out services and information at high rates is a great opportunity for public health, but this opportunity is squandered if health facilities are poorly equipped to provide care. While Rwanda has made strides in improving the supply of care, there is less evidence of improvement in Uganda. The results speak for themselves.

Science in the time of cholera (and nodding syndrome)

Published online January 11, 2012.

In August 1854 a terrible illness tore through a London neighborhood, killing hundreds in a matter of days. The terrifying disease emptied the body of fluid until vital organs shut down, after which point the petrified soul would succumb to the illness. Death often arrived less than twelve hours after the first signs of an upset stomach. Londoners of the day had a name for this illness, but did not understand its cause. They called it cholera.

Though cholera outbreaks had hit London before the mid-1800s, the Broad Street Pump outbreak of 1854 is now perhaps the best known. It was during this scourge that physician John Snow was able to demonstrate that cholera was not an airborne disease, as was the popular and professional opinion at the time, but rather a waterborne disease. This insight proved critical to improving public health in London and beyond. Londoners had been emptying their waste into the Thames, often just upstream of intake pipes for water companies. Their water and their city stank. But because disease was thought to be airborne, they doused smelly sidewalks in chloride of lime in an attempt to purify the air. They made few attempts to purify the water so obviously contaminated with their own waste.

When cholera inevitably struck, they applied all manner of remedies, most of them useless at best. Castor oil, opium, and leeches were all prescribed to treat cholera, not just by ordinary folks, but also by doctors. Worse still were treatments such as laxatives or bleeding. The extreme dehydration caused by cholera was often “treated” by attempts to remove still more fluid from the body.

In hindsight, both the cause of and the treatments for cholera are straightforward, if not obvious. Cholera is a waterborne illness that spreads when one person ingests the cholera-infected waste of another person. The treatment for the extreme dehydration that ensues is most fundamentally rehydration – consuming copious amounts of fluid to replace those that are lost. Yet at the time, the facts and observations did not fit together into a single theory about the cause of cholera. When cholera struck a household, sometimes it struck everyone, sometimes just a single person. In a neighborhood, some homes would be hard hit, while others escaped untouched. Whether you survived or not seemed random.

So it is with another illness in our midst – nodding disease. Nodding disease sounds like a folksy and tabloid-inspired syndrome. Its name describes the telltale symptom of the disease, a rhythmic head nodding in children. The fact that, unlike many diseases, its name betrays nothing about its likely causes demonstrates just how little we know about its transmission. For example, HIV (human immunodeficiency virus) is named for the virus that causes the disease, AIDS. The name malaria comes from the Italian mala aria, meaning “bad air”, so named because the illness we now know is caused by a parasite was originally thought to be airborne.

But despite its odd name, nodding disease is far from folksy or fake. It is often fatal. First reported in Tanzania in 1962, nodding disease has since spread throughout what is now South Sudan, and has been rapidly spreading in northern Uganda as well.

The pattern of incidence of nodding disease and its symptoms are puzzling, as were those of cholera in the early nineteenth century. First, the onset of nodding disease appears to occur almost exclusively in children between the ages of 5 and 15.

Second, nodding is reportedly often triggered by the presence or eating of familiar foods, or when a child becomes cold. Unfamiliar foods, such as chocolate candy bars, do not induce nodding. Third, when untreated, those with nodding syndrome cease developing both physically and mentally. They are often stunted and experience mental retardation. Fourth, most children affected come from very poor families. There are now thousands of children in South Sudan and northern Uganda who experience symptoms of nodding disease, and the incidence of the syndrome appears to be increasing.

Several theories regarding the cause of the syndrome have been mooted, but none proven. For the past several years, teams of experts from the U.S. Centers for Disease Control (CDC) and World Health Organization (WHO) have travelled to South Sudan and northern Uganda in an attempt to better understand the causes of nodding disease, and possible treatments. Their work suggests that nodding disease is a new epilepsy syndrome, and that the characteristic head nodding is caused by seizures that lead to temporary lapses in neck muscle tone.

The vast majority of children experiencing symptoms of nodding disease are also infected with a parasite called Onchocerca volvulus, which causes river blindness. The high prevalence of this parasite in victims of nodding disease means that the most plausible (published) theory about the cause of nodding disease links the syndrome to O. volvulus, but how and why remain unclear. Moreover, a number of children both in and outside the region are infected with the parasite but do not acquire nodding disease, so the link between the two is not straightforward.

So far, therefore, we have accumulated a series of facts about the mysterious syndrome, which have yet to be pieced together into a coherent theory. We have many more tools at our disposal than did the Londoners of the 1800s, but answers to pressing medical and public health questions do not usually come without time and resources. Nodding disease is a terrifying prospect for those living in South Sudan and northern Uganda not only because of the debilitating effect it has on children, but also because families and communities do not understand why their children are falling ill in the first place. A confusing array of facts, theories, and observations is unnerving both to those in the midst of the outbreak and to those who see its spread as a very serious health issue for the region.

Misunderstanding the causes of nodding disease can have disastrous consequences, as was the case with cholera some 150 years ago. So far, anti-epilepsy treatments appear to be helping children experiencing nodding disease, but supplies of these treatments are often scarce, and determining the ultimate cause of epilepsy in these children should be a high priority for health officials. Cases of epilepsy are often documented at high rates in hospitals in the region, and there is thought to be a link between epilepsy and cerebral malaria as well. In Arua Regional Referral Hospital, in northwestern Uganda at the border with Sudan and DRC, 7 percent of all outpatient children over age 5 in April 2009 were diagnosed with epilepsy. In 2004/05, 74 percent of all cases in the hospital’s mental ward (nearly 4,500) were diagnosed as epilepsy.

Clearly, epilepsy, whether nodding disease or otherwise, is a condition that deserves the utmost attention from public health and medical professionals. The sooner we understand the causes of this new breed of epileptic seizures, the sooner we can take steps to both treat it and prevent its spread. In the absence of a compelling theory about its cause, however, fear and futile treatments are likely to ensue.

2012: the raw and promising new year

Best wishes to you and yours as we bring 2011 to a close and ring in the new year. Thanks for reading and sharing, and I look forward to another year with you in 2012.

*               *               *

An excerpt from my final column of 2011 for The Independent (Rwanda Edition):

Shuffling through memories of the past twelve months, one is reminded of the heaving, tumultuous and heady days that made up the molding of global and local politics, innovation, and society. Almost every year feels exceptional at its end, and this one is no different. Exceptional for the unexpected uprisings, reassuring surprises, and most of all, the untimely, or perhaps just sobering, deaths.

A remarkable feature of the human brain is that emotion triggers extraordinary powers of memory – emotional events, traumatic or ecstatic, are captured in a different way from ordinary occurrences. I have many such memories this year. I can recall vividly the walls and tables of a classroom at the moment I heard that Tunisia’s Ben Ali stepped down, the living room and footage on Al Jazeera of Mubarak’s fall, the computer screen announcing Bin Laden’s death, and the Twitter feed of my phone as I scrolled through news of Gaddafi’s brutal demise early one morning, all in 140 characters or less. I also recall the unusually grey and rainy Palo Alto morning marking the first day in 57 years of a world without Steve Jobs, just a few days after the passing of Wangari Maathai. I see clearly the words of Christopher Hitchens’s last column staring back at me, in stark and final relief.

There are of course many other memories, moments captured with friends and family, as well as moments alone, preserved not as events in their entirety, but as a series of snapshots. At the end of every year, as now, there is more time to sit and shuffle through them. It feels like an exceptional year, and the past ten have felt like an exceptional decade.

The pace of progress, innovation, and change makes each decade, and increasingly, each year, feel remarkably different from the previous. In the first decade of the twenty-first century, we experienced tremendous economic growth worldwide, a sharp break from the previous several decades. By the mid-2000s, nearly every single country in the world experienced positive economic growth. The number of new infections of HIV is falling by the year, and deaths due to HIV peaked in sub-Saharan Africa and worldwide in 2004/5. Around the same time, Google went public, and together with Facebook, is now a household name in the global village. Mobile phone use has increased exponentially worldwide. In 2000, there were 12 mobile phone users for every 100 people. Today, there are around 69 mobile phone subscribers for every 100 individuals around the globe.

Change, therefore, is brazenly constant. Anyone who suggests otherwise is either deluding themselves or not paying attention. This is as true in Africa as in the rest of the world, although many both in and outside of the continent have been slow to recognize that the former has not, in fact, been standing in place while the latter dashes on.

The churning and surging marketplace for ideas is open. The stepping-stones placed by yesterday’s innovators serve as launching pads for vaulting into the next year and decade. Even in the destruction strewn by mad and ordinary men lie the pieces that will build society anew. One can pick them up, or stargaze at glittering towers and soaring skylines far from home.

Entering the new year, we are without many of those who began 2011 with us just one year ago. The most memorable deaths on the news circuit were violent, painful, or both, untimely or just-in-time. The world is short a few tyrants, but short a good many great and beautiful minds too. Their exit is a reminder of the inexorable march forward that spares no one. There is no standing still, but there are choices, and our own expectations.

Here is to the raw and promising new year.

What is the (global) village gossiping about?


Published online December 22, 2011.

Accessing people’s thoughts and interests from Asia to Africa is just a click away

It used to be that education primarily took place in a classroom. These days, the chalk and blackboard are fading away and steadily being replaced, or at least complemented, by new technology. Even in some of the world’s hardest-to-reach places, cell towers and solar-charging stations are re-inventing the learning and communication experience. Alongside the traditional classroom teacher are laptops and cell phones, paving the way toward a whole new way of seeing the world.

A world of data is at your fingertips, quite literally. The advent of personal computers and increasing interest in making information open and accessible to all mean that we can now answer many questions faster and more accurately than we ever thought possible. Information on everything from economic growth to weather patterns to flu outbreaks is just a Google search away. Data and data sources are not without their flaws, but we can often see broad patterns much more clearly across and within countries than we once could. The question is, how can we take advantage of new and ever increasing sources of information? Perhaps one of the most novel uses of data pieces together the wisdom of the crowd. In particular, Internet search terms are an amazing guide to all sorts of phenomena we care about, including public opinion on politics and policies, investment interests, and even trends in infectious disease.

What kind of information are people searching for? What are the questions to which they seek answers? One can of course look at broad trends in search engine search terms across countries, something similar to looking at words and topics that are “trending” on Twitter, but one can also look for more specific information. How many people in the U.S., Europe, or Asia look for information about Rwanda, for example? What kind of information do they look for? Google Insights for Search can help answer these kinds of questions, and reveal interests from potential investors, tourists, and others that can be useful to the local business community, government, civil society, and individuals.

If you look at the most frequent search terms related to “Rwanda” used by those living in the United States, France, or even China, you’ll find that most are related to the genocide or the movie, Hotel Rwanda. Within the U.S., searches for “Rwanda genocide” spike every April and May, although the spikes are becoming smaller over time. This is some indication that while the world still heavily associates Rwanda with genocide, this association is becoming weaker with time. Searches for “Rwanda safari” or “Rwanda gorillas” increased greatly in 2005 and 2007 respectively, and most of these searches came from individuals living in the United States or the UK.
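For readers who want to inspect such series themselves, here is a minimal sketch using pytrends, an unofficial Python client for Google Trends (the service into which Google Insights for Search was folded). The library and the exact queries are my illustration, not tools used for this column:

```python
# pip install pytrends  (unofficial Google Trends client)
from pytrends.request import TrendReq

pytrends = TrendReq(hl="en-US", tz=360)

# Weekly interest in "Rwanda genocide" among U.S. users, 2004-2011.
pytrends.build_payload(["Rwanda genocide"],
                       timeframe="2004-01-01 2011-12-31",
                       geo="US")
trend = pytrends.interest_over_time()

# The April/May anniversary spikes show up as each year's maximum.
print(trend.groupby(trend.index.year)["Rwanda genocide"].max())
```

Shrinking yearly maxima in this series would be consistent with the weakening association described above.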

Meanwhile, searches about Rwanda in the East African region show a very different pattern. The top three search terms about Rwanda from those living in Uganda and Kenya are all related to jobs, and primarily come from three cities, Kampala, Nairobi, and Mombasa. Meanwhile, searches from within Rwanda about Uganda focused on news outlets, such as the Daily Monitor, New Vision, and “news Uganda” more generally. The most common searches in Rwanda about Kenya include Kenya Airways, the Daily Nation, and Kenyan universities.

Search trends can be useful for businesses and entrepreneurs, but they are also a cheap and easy way to do public opinion polling. In the U.S., search trends over the past couple of months have tended to mirror official polling trends for presidential candidates in the Republican Party, for example. If you look over time, you can see the rollercoaster levels of support for candidates such as Rick Perry, Mitt Romney, Herman Cain, and Newt Gingrich. In the U.S., regular and nationally representative polls are conducted throughout the campaign period, but the more informal “search” polling can be very informative as well, and far less expensive.

One challenge for using this type of data in countries like Rwanda and Uganda is that relatively few people are online, although the number of internet users is growing by the day. In Rwanda, approximately 13 percent of people accessed the Internet in 2010, up from 7.7 percent in 2009, according to the International Telecommunication Union. More and more people are using their mobile phones, rather than computers, to access the Internet, which makes it easier to get online. Although there may not be enough people using Google to get a good measure of public opinion in Rwanda, this will very likely be possible in the not-too-distant future.

Already, one can observe trends in public interest in politicians among those living in capital cities. Searches for “Besigye”, Ugandan President Yoweri Museveni’s archrival, spiked within Kampala in November 2005, a few months prior to the heated 2006 presidential election, and spiked again to a lesser degree in February 2011, during the most recent election. It appears there was much more interest in Kizza Besigye leading up to the 2006 election (even with considerably fewer people online) than during the time leading up to the most recent elections, a trend which was reflected in Besigye’s support on election day as well. Online searches for Besigye spiked again in April, during the Walk-to-Work protests, but unfortunately for the repeat presidential candidate, by then the election had already passed. Despite the limited connectivity of the population living in Uganda, general election trends were evident in people’s online behavior.

[Figure: searches for "besigye" in Uganda, 2004-2011]

Finally, search terms can be useful for tracking trends in infectious disease. When people fall sick, they often turn to the Internet for information about their symptoms or illness. Tracking search terms can thus identify and follow outbreaks of particular types of illnesses. Google Flu, for example, uses data on search terms to estimate trends in the spread of the flu virus. Again, such estimates are best for countries in which the majority of the population has access to the Internet, but as Internet connectivity increases in countries like Rwanda and Uganda, crowd-sourced data on infectious disease may help health officials identify and address outbreaks.
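The core of the Google Flu idea fits in a few lines: fit the official illness rate, which is published with a reporting lag, against search volume, which arrives almost immediately, then use search volume alone to estimate the most recent weeks. Below is a toy sketch with invented numbers; Google’s actual model pooled many queries and worked on log-odds scales:

```python
import numpy as np

# Invented weekly figures: relative flu-related search volume, and the
# official influenza-like-illness (ILI) rate, which arrives with a lag.
search = np.array([12.0, 15, 22, 38, 54, 61, 47, 30, 18, 14])
ili = np.array([1.1, 1.3, 1.9, 3.2, 4.5, 5.0, 3.9, 2.6, 1.6, 1.2])

# Fit a one-variable linear model on the weeks already reported...
slope, intercept = np.polyfit(search[:-2], ili[:-2], deg=1)

# ...then "nowcast" the two most recent weeks from search volume alone.
print(intercept + slope * search[-2:])
```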

The wisdom of the crowd has long eluded policymakers, investors, and even public health experts because it is costly to collect information from a large number of people, and people often have incentives to misrepresent their interests and beliefs. Using search trends as one measure of people’s interests, opinions, and concerns, however, is one way to crowd-source information gathering in a relatively inexpensive and expedient manner.

Analyzing Africa: The Audacity of Despair

A new, defiant image

Published online at The Independent, Rwanda Edition, December 17, 2011

In 2000, the cover of The Economist pictured a boy wielding an AK47 inside the outline of the African continent, surrounded by black. “The hopeless continent,” the cover ominously read. At the time, a combination of factors led the magazine, and a whole host of bystanders, to throw up their hands in despair and mentally close the door on hope for the future of “Africa.” A decade later, The Economist, whose cover this week reads “Africa rising,” and many others are waking up, wide-eyed, to the tremendous growth and progress that has been taking place on the continent all along. Progress has not been even, or without crushing reversals along the way. But given the history of development across the globe, it is entirely unclear why we should have anticipated linear progress, or lament its absence. Political, social, and economic development will carry on with or without handwringing at one extreme, or ululations at the other.

There have been at least two common mistakes in assessing progress (or the lack thereof) in “Africa,” which together have made for some rather wrongheaded analyses. First, there is a danger in conflating levels of development with development itself. It is obvious to all that levels of per capita income and education, for example, are lower on average in Africa than anywhere else, and mortality levels higher. The issue of levels, however, is entirely different from change over time. Contrary to popular belief, improvement in both human and economic development was occurring in Africa before the dawn of the new millennium, just not everywhere. This leads me to the second analytic pitfall – the “Africa is a country” problem.

It is obvious to all that Africa is not a country but a continent, but analysis nonetheless often treats Africa as if it were one political, economic, or social unit. It is not. There is tremendous variation across the continent in both levels of development and rates of improvement over time. A failure to acknowledge the divergent paths countries have taken leads to the kind of essentialisation one tends to regret.

It is all too easy to essentialize. The mind recalls the most extreme cases, and remembers those that support prior beliefs. So in 2000, near the height of the HIV/AIDS epidemic, with flooding, drought, the Second Congo War, political crisis in Sierra Leone and a waffling UN Security Council, it was easy to create an image of Africa that was tearing itself to pieces. “Africa was weak before the Europeans touched its coasts. Nature is not kind to it,” wrote The Economist. “This may be the birthplace of mankind, but it is hardly surprising that humans sought other continents to live in.” Ouch.

As noted, it is true that on levels of development, that is, income per capita, literacy, infant mortality, and many other measures, sub-Saharan Africa fares far worse than other regions, but all of this ignores the changes that have been taking place. In the 1990s, for instance, despite much pessimism, a number of countries held multi-party elections, a wave that started with Benin in 1991. While these countries would not become flourishing liberal democracies overnight, the 1990s would mark the beginning of the end of dictatorship as we know it.

There was also an effort to improve access to education, and the percentage of children completing primary school grew in a number of countries, including Benin, Burkina Faso, Cape Verde, The Gambia, Guinea, Guinea-Bissau, Liberia, Mali, Malawi, Togo, and Uganda, albeit occasionally starting at very low levels. Gains in education were not achieved everywhere, and schooling declined in some countries, but this fact only further demonstrates the variation in performance across African countries.

The best news is that although improvement in education varied, improvements in health over the past several decades have been nearly universal. Since 1960, child mortality has fallen in every single African country for which there is data, with the possible exception of Somalia. Even in a country like the Central African Republic (CAR), notorious for its poor governance, under-5 mortality fell by half over the past fifty years, from 300 to just over 150 deaths per 1000 births. In 1960, nearly one in three children born in CAR would not live to see their fifth birthday; today, six out of seven will survive childhood. Moreover, in spite of the devastating HIV/AIDS pandemic, which has claimed millions of lives, the hardest hit African countries are rebounding, and child and maternal mortality rates are again declining in countries like Botswana, Namibia, South Africa, and Zimbabwe.

Economically, the performance of African countries has been diverse for decades, with some countries consistently growing and others wallowing in economic misery. A number of African countries experienced periods of negative economic growth throughout the 1970s, 1980s, and into the 1990s, which, along with population growth throughout, meant that several had the same or even lower levels of per capita income in the 1990s than they had at independence.

Still, many countries began to see positive economic growth in the 1990s or earlier, including countries as diverse as Angola, Burkina Faso, Cameroon, Congo (Brazzaville), Ethiopia, Ghana, Guinea, Mali, Mozambique, Rwanda, Senegal, Uganda, and Zambia. Some of these economies are reliant on commodities such as oil and minerals, but service and other sectors comprise an increasing share of the economy in many countries, and regional trade has grown as well.

Average levels of development give Africa a bad name, but initial conditions were different from most of the rest of the world, and rates of improvement have often equaled or exceeded those in the developed world. As interest in Africa is piqued by double-digit economic growth figures and opportunities for investment, we will continue to see discussion of a part of the world most people inadvertently essentialize. Fortunately, I think the audacity of despair that has pervaded western thinking on Africa has left little in its wake other than egg on some faces. The audacity of hope has now come to the fore.

HIV in colonial Africa

Online this week in The Independent (Rwanda Edition): How public health efforts likely contributed to the early spread of HIV.

The Tragic Amplifier

Published online December 8, 2011.

This year marks the 90th anniversary, approximately, of the introduction of human immunodeficiency virus (HIV) into the human population. It also marks thirty years since HIV was first scientifically recognized in 1981. Since the 1920s, this virus has spread across the globe and become the HIV/AIDS pandemic we are all too familiar with today. Most people consider the 1980s to be the beginning of the HIV/AIDS pandemic, but the virus had been prevalent in populations living in parts of central Africa for decades before it became a global nightmare.

New evidence from epidemiologist and international health expert Jacques Pepin suggests that human efforts to improve public health in central Africa were critical in facilitating the early spread of HIV, which has since claimed nearly 30 million lives. In the past two decades, massive coordination, mobilization, innovation, and investment have managed to slow the epidemic and save millions. As we mark World AIDS Day on December 1, 2011, HIV/AIDS is a reminder to us all of the tremendous power of human folly, but also of human triumph.

The Origins of AIDS, by Pepin, is a remarkable new book that pieces together the emergence of HIV in the human population, and its subsequent spread across the globe. HIV is the human version of simian immunodeficiency virus (SIV), which has been present in chimpanzee populations of central Africa for hundreds of years. Human contact with chimpanzees led to at least one transmission of SIV to a human in the early 1920s, where it became HIV, most likely in a hunter or a cook living in central Africa, where the majority of SIV-carrying chimpanzees live. This transmission alone was extremely unlikely to have triggered an HIV epidemic; indeed, chimpanzee-to-human transmission could have occurred on separate occasions prior to the 1920s without the virus spreading far. An infected hunter may have passed HIV to his family members, but in all likelihood, the virus would have stopped there. Why did HIV begin to spread beyond a few infected individuals in the early 1920s?

Pepin argues that heterosexual transmission, which is the predominant mode of transmission of HIV today, could not alone have led to an outbreak of HIV on a scale that would trigger a pandemic. Thus, there must have been some kind of “amplifier” that allowed for very rapid transmission of HIV to many people at a time. And what was the most likely initial culprit in the amplification of the virus? Colonial public health campaigns involving widespread use of unsterilized syringes and needles.
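The logic of an amplifier can be made concrete with a toy branching-process simulation. The numbers below are invented for illustration and the model is mine, not Pepin’s: person-to-person transmission averaging less than one new infection per case dies out on its own, but adding rare mass-transmission events, such as a reused syringe at a busy clinic, lets the same chain take off:

```python
import numpy as np

rng = np.random.default_rng(0)

def outbreak_size(r_person, p_amp=0.0, amp_size=0, cap=100_000):
    """Total cases from one index case: each case infects
    Poisson(r_person) people directly and, with probability p_amp,
    triggers an amplification event infecting amp_size more."""
    total = frontier = 1
    while frontier and total < cap:
        direct = rng.poisson(r_person, frontier).sum()
        amplified = rng.binomial(frontier, p_amp) * amp_size
        frontier = int(direct + amplified)
        total += frontier
    return total

# Sub-critical person-to-person spread alone: chains fizzle out.
print(np.median([outbreak_size(0.8) for _ in range(200)]))
# The same spread plus occasional amplification: a runaway epidemic.
print(np.median([outbreak_size(0.8, p_amp=0.05, amp_size=20)
                 for _ in range(200)]))
```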

In the 1930s and 1940s, colonial administrations in French Cameroon, the Belgian Congo, and elsewhere began massive public health campaigns to treat various infectious diseases, including yaws, syphilis, malaria, leprosy, and sleeping sickness, using syringes and needles which were not sterilized regularly, if at all (oral tablet versions of treatments were not available for these diseases at the time). Although there are no blood samples from this time period still in existence (the oldest blood sample in which HIV has been detected dates back to 1959, taken from a man living in Leopoldville, Congo, now known as Kinshasa), it is well documented that other less lethal viruses, like Hepatitis C, were transmitted via syringes in Cameroon, Gabon, and the Belgian Congo, among other colonies. It is not difficult to imagine that HIV could have been passed quickly through a population via syringe as well.

One clinic to treat sexually transmitted diseases (STDs) in Leopoldville treated up to 1000 patients a day by the mid-1950s, with documented evidence that medical equipment was not sterilized between patients. To make matters worse, HIV was likely introduced into Leopoldville/Kinshasa at a time when there was a dramatic gender imbalance due to colonial policies. Urban areas like Leopoldville were often the equivalent of “work camps” in which wives and children were not welcome, which resulted in widespread prostitution, further facilitating the spread of HIV through heterosexual transmission.

HIV, which first spread through non-sterile syringes, often in clinics aimed at treating sexually transmitted diseases among men and sex workers in urban areas, was maintained at a steady prevalence through heterosexual transmission among the same population. In the colonial period, female sex workers, or “free women”, had only a few regular clients each year, but by the time of independence, female prostitutes would often see up to 1000 clients per year. This new type of prostitution greatly facilitated the transmission of HIV to populations beyond urban areas, and the virus spread along major trade routes and into cities in central and eastern Africa, including Kigali.

By 1984-85, Kigali, which at the time had a high ratio of males to females, and thriving prostitution, had the highest recorded HIV prevalence in the world, with 80 percent of prostitutes, 50 percent of STD patients, and 15-20 percent of blood donors, factory workers, and hospital employees testing positive for HIV. By 1987, HIV prevalence in urban areas was 17.8 percent, and it had jumped to 27 percent by 1996.

From central Africa, HIV soon spread to Haiti, before being transmitted via multiple routes to the United States and beyond. Today, 34 million people are living with HIV/AIDS, and another 29 million have perished. That the spread of this virus was likely facilitated, and perhaps only possible, with the help of human technology and early public health campaigns should give us pause, and remind us of the terrifying potential for destruction due to human folly. As Pepin writes, “When humans manipulate nature in a way that they do not fully understand, there is always a possibility that something unpredictable will occur.”

Turning the tide on the spread of HIV/AIDS has taken decades, and millions have tragically lost their lives in the process. But the HIV/AIDS epidemic also demonstrates the amazing power of human innovation and cooperation that can take place on a global scale. Today, there are 6.6 million people receiving life-saving antiretroviral treatment, and both AIDS-related deaths and new HIV infections are declining in most parts of the world. The time, research, energy and money that have gone into tackling HIV have been phenomenal. If anything, we are now in danger of devoting too few resources to other health challenges that must also vie for the attention of the global health community and domestic health budgets.

HIV/AIDS is an extraordinarily painful reminder of the good intentions that can pave the road to hell, and of the unique capability of humans to create as well as destroy.

HIV/AIDS: Human folly and triumph

Today is World AIDS Day. HIV has taken the lives of an estimated 29 million people around the world, and currently around 34 million people are infected. The effort of many individuals, organizations, and governments has led to a turnaround in the pandemic: infection rates and deaths due to AIDS are falling in most parts of the world. Still, there is a long way to go, and many people still do not have access to life-saving drugs.

A new book by Jacques Pepin, The Origins of AIDS, provides a remarkable account of how HIV initially spread among populations in central Africa, and later became the pandemic we know today. His sobering finding is that human efforts to treat and prevent disease with the use of non-sterilized syringes in colonial Africa very likely facilitated early and rapid HIV transmission. I discuss his work in this week’s column, excerpts of which are below.

HIV/AIDS: Human folly and triumph (published in this week’s Independent Rwanda Edition)

This year marks the 90th anniversary, approximately, of the introduction of human immunodeficiency virus (HIV) into the human population. It also marks thirty years since HIV was first scientifically recognized in 1981. Since the 1920s, this virus has spread across the globe and become the HIV/AIDS pandemic we are all too familiar with today. Most people consider the 1980s to be the beginning of the HIV/AIDS pandemic, but the virus had been prevalent in populations living in parts of central Africa for decades before it became a global nightmare.

New evidence from epidemiologist and international health expert Jacques Pepin suggests that human efforts to improve public health in central Africa were critical in facilitating the early spread of HIV, which has since claimed nearly 30 million lives. In the past two decades, massive coordination, mobilization, innovation, and investment have managed to slow the epidemic and save millions. As we mark World AIDS Day on December 1, 2011, HIV/AIDS is a reminder to us all of the tremendous power of human folly, but also of human triumph.

*                   *                   *

Turning the tide on the spread of HIV/AIDS has taken decades, and millions have tragically lost their lives in the process. But the HIV/AIDS epidemic also demonstrates the amazing power of human innovation and cooperation that can take place on a global scale. Today, there are 6.6 million people receiving life-saving antiretroviral treatment, and both AIDS-related deaths and new HIV infections are declining in most parts of the world. The time, research, energy and money that have gone into tackling HIV have been phenomenal. If anything, we are now in danger of devoting too few resources to other health challenges that must also vie for the attention of the global health community and domestic health budgets.

HIV/AIDS is an extraordinarily painful reminder of the good intentions that can pave the road to hell, and of the unique capability of humans to create as well as destroy.
