By David DiSalvo
It’s a huge understatement to say that panic is part of human nature. We’re all wired to anticipate threats and experience nervous system overdrive when they arrive – our species wouldn’t have made it this far if we didn’t. But what happens when the anticipation itself is enough to trigger heart-pounding panic? And stranger still, why do threats as rare as they are vague cause more panic than threats that surround us every day?
Those are a couple of the questions that infectious-disease epidemiologist Philip Alcabes set out to investigate in his newly released book, Dread: How Fear and Fantasy Have Fueled Epidemics from the Black Plague to Avian Flu. What he couldn’t have known, however, is that his book would begin hitting bookstore shelves just as swine flu began consuming the public consciousness – providing a more than timely example of a dread-catalyzing threat with mass-panic potential.
Dr. Alcabes spent some time talking with me about epidemics real and imagined, how we respond to threat-inspiring messages in the media, and why our attention is riveted by remote threats while tangible ones close to home go largely unnoticed.
We’re right in the middle of what appears to be a full-blown epidemic just as your book is hitting the shelves. What’s your take on what we’re seeing in the news?
All epidemics are stories. They often have a widespread disease at their core (often but not always, as the epidemics of “cyber-stalking” and school shootings attest). But the numbers of the sick, dying, and deceased aren’t the main aspect of the story. There have been 50-odd deaths associated with the new flu strain as we speak. Do 50 deaths make for an epidemic? That’s less than the death toll on American highways and roads on the average day. It’s less than the toll taken by malaria in Africa in any one-hour period of any day. It’s sad, and it’s a frightening reminder of the randomness of nature’s deadly bite. But 50 deaths from accident, incident, or infection don’t always constitute an epidemic for us.
In fact, the numbers of cases of swine flu and the flu death rate are both quite low in comparison to the normal situation with seasonal flu, the “bug” that comes around every winter. If this were January, we might not even have noticed this outbreak, as it would have been hidden by the far larger and more lethal outbreak of plain-vanilla flu. In fact, if in any given winter the death rate from flu were as low as it’s been in this springtime outbreak, we’d be relieved and call it a mild flu season.
But would you agree that it’s truly an epidemic – the “real thing”?
Still, I would agree that this is an epidemic — simply because that’s what people say. In fact, as we speak, the W.H.O. has raised the “pandemic alert” to 5 on a scale of 6. Our officials are leading the way in making sure that this small outbreak (it has affected a handful of countries, with about 2500 cases in Mexico so far, 90-odd in the U.S., and scattered clusters elsewhere) is indeed defined as an epidemic. Possibly a pandemic.
The question I ask myself is, why is it so important to us to see this small, thus-far mild outbreak of flu as a scary situation? Why should W.H.O. feel the need to act?
In part, it’s because we’ve been primed for this. Our health agencies (the W.H.O. most notably) have been telling us for years that a flu pandemic is “inevitable.” All those agencies needed a case-in-point to justify their dire warnings, otherwise the “pandemic preparedness” campaign might have gone the way of the prior “bioterrorism preparedness” campaign (2002-2004): simply withered away from lack of interest.
But more deeply, the preparedness rhetoric influenced our thinking. Repeatedly gesturing toward the terrible 1918 flu outbreak, in which tens of millions of people died worldwide, authorities and flu researchers reminded us to think of 1918 when we think of flu. The result, as we see now, is that the few facts available about the new flu serve as the basis for projections of our horror fantasies. People (again, including W.H.O. officials) talk about the inevitability of a “pandemic,” about the likelihood that there will be more cases and more deaths.
So, if by “real thing” you mean, is this a public health problem, I’d say yes. People are sick with a contagious disease. More might fall ill. It demands attention from public health authorities.
But if you mean, is this the disaster that is being depicted, I would say not yet, and probably not ever. The problem is that once the fantasy scenarios start being painted, the facts become scenery on the stage. It’s the fear that drives the drama. We’ll undoubtedly see more fear-driven pronouncements. I hope we’ll also see good public health.
We’re hearing some health officials say that this flu is a harbinger of diseases to come — an evolved mutant virus combining multiple strains.
This is, simply, influenza. What flu does is switch back and forth between species, recombining genetic elements, mutating here and there, “reinventing itself,” to use the term of art. I suspect that calling it “swine” flu gives it a certain pernicious cachet, “swine” being associated with filth in the language. But it seems important to us to label this virus with its own name, not just as flu but “swine flu,” as if it had some special status. I think the naming helps us to be frightened.
Is it a harbinger of the future? Well, I don’t have that particular crystal ball. A lot of people who call themselves flu researchers and whom the media refers to as “experts” are fond of making predictions about pandemics, as if they could see the future. This has gotten us into trouble at least once, in the swine flu immunization fiasco of 1976 (when hundreds of Americans were sickened by a flu vaccine and over 30 died, yet there was no serious outbreak). And it gets us into trouble when, as with “bioterrorism,” we spend a fortune protecting ourselves from a chimeric threat.
But how do we plan to protect public health unless we make some predictions about possible outcomes?
I think we have to draw a distinction between sensible planning for sound public health programs based on observable facts, and so-called predictions that are really just projections of horror fantasies. We have to be careful with this flu outbreak, because, as I said earlier, there are a lot of fantasies afoot, and because many of them hark back to 1918. We have to remember that the world is a very different place than it was in 1918. We have to do good public health to ease suffering and control disease — but we don’t want to get into the business of divining the future. We should stick to what we know, and can see, and what we know how to do about what we can see.
When you started writing Dread, was anything in particular going on in the world that influenced you? And who did you have in mind as the audience?
I felt I was seeing a shift in the way people think about epidemics, a dramatic personalization of epidemic risk. I saw this in New York, where I live, when West Nile fever came in 1999 – its first appearance in the western hemisphere was in Queens, NY. It made headlines, especially because the city administration mounted a major insecticide-spraying campaign that was highly visible. But West Nile was really rare. There were only forty-seven cases that year in NYC – far, far fewer than the number of people hurt or killed in traffic accidents in the city each year (over 70,000), which you don’t hear much about as a health problem. Seven of those forty-seven West Nile cases died, which made the outbreak more serious and of course sad, but no less rare. Still, people worried about every mosquito bite (some would come up to me, roll up their sleeves, and ask if this particular mosquito bite was from a “West Nile mosquito”). A couple of years later, when the postal anthrax outbreak killed five people around the eastern seaboard, there was a similarly personal response, even though the disease was even rarer than West Nile disease. It wasn’t, “what can the City do to slow the outbreak down?” but “what can I do to protect myself?”
And that was in New York – a town that’s seen a few things.
Exactly. This is not a town where people are hysterical about danger. In fact, the anthrax outbreak took place in the shadow of the World Trade Center disaster – it was just a few weeks later. We lost 2,700 of our neighbors, co-workers, family members, etc., on September 11th; it was the worst disaster in city history, and in a situation like the World Trade Center attack, it really did matter in a life-or-death way exactly what you did and where you were. But most New Yorkers didn’t ask about whether they were doing things “right” or “wrong” when the towers fell. There was widespread fatalism, you could say. By contrast, these small outbreaks of really rare diseases – West Nile, anthrax – made people inspect their personal living arrangements. “Should I be ironing my mail, or would it be better to microwave it?” “Can I go to a park or is my kid going to contract Lyme disease and die?”
I wanted to write about the disconnect between the epidemic that epidemiologists study – I was trained as an infectious-disease epidemiologist – and the one that people see. It was particularly intriguing to try to explain how this personalization of epidemic risk allowed for the growth of the “biopreparedness” industry, the cabal that was warning us that evildoers were about to wreak mayhem using smallpox or tularemia or some other germ.
And I also wanted to see how the personalized epidemic related to the way I had seen people respond to AIDS in the ‘80s and ‘90s. So many people talked about AIDS as a crystallization of a particular set of preconceived anxieties. This is something Susan Sontag had been concerned about when she wrote “AIDS and Its Metaphors” – but that was in 1989, and she thought things would be different by the time AIDS became treatable, and better understood. Yet, by the early 2000s, AIDS was still always “about” something: gay rights; new sexual modes, like Internet hookups; the failed War on Drugs; policy makers’ unwillingness to talk about homosexuality; “racial/ethnic disparities”; and so forth.
So the book really had to be for a general audience, then.
Yes, an audience interested in thinking a little bit analytically, interested in examining what they hear and read, how they process information, how they decide what to do. And an audience that had some interest in history. The book had to establish the historical background to the way we think today, or it wouldn’t make sense.
In the book you talk about our long-standing historical fear of epidemics and that we pay very little attention, relatively speaking, to the actual diseases responsible for the most deaths – tuberculosis being one example. Why do you think we’re so focused on what might happen and less on what’s really happening?
The easy answer is that the future is terrifying but we always imagine that we can deal with what’s in front of us. The slightly more complex answer is that it’s particularly easy to deal with what’s in front of us when it isn’t really in front of us at all. Most of the mortality from TB and malaria takes place in poor countries in the tropics and subtropics, out of sight and pretty far from the experience of most people in the developed world.
And the still more complicated answer is that when we think about epidemic illness we’re guided (or blinded, sometimes) by what we expect. Tuberculosis and malaria together kill about 3 million people a year, but they kill that many people year in and year out. We accept it, expect it.
And most of the people who are dying don’t run in Western social circles.
No doubt, it’s easier to accept because those aren’t “our” people who are dying. So that’s another piece of the epidemics puzzle. How are our fears shaped not just by what might happen, but by what might happen to us?
So, I don’t want to belittle the efforts to provide adequate treatment for tuberculosis or malaria in countries where they are rife, or efforts on the part of some foundations to fund better prevention activities. But there’s a difference in how we pay attention to what is happening to someone far away, how we pay attention to what is happening to people now, and how we construct our fears of the future.
Aren’t a lot of us scared that what’s happening “over there” will eventually find its way “over here”?
That’s true – our attention often turns to that possibility. Part of the how-we-pay-attention story has to do with messages. When we think a health problem tells us something about what we’re doing wrong, it’s easier for us to call it an “epidemic threat.” This is unfortunate for Africans whose kids could be saved from malaria with insecticide-treated bednets or from filariasis or schistosomiasis with simple water purification systems. Those diseases don’t seem to speak to us about what’s wrong with our modern lives. Obesity, asthma, allergies, childhood bipolar disorder… these epidemics seem to have messages for us. Our appetites are unrestrained, or our culture is toxic, or we aren’t good parents – whatever the message is that we read into these new epidemic threats, it’s about us and our lives.
So the last piece of the story is that worrying about the future is a way of validating the anxieties that epidemics touch on. Malaria seems to tell us nothing about ourselves (of course, it does: it tells us that we don’t do enough for the Have-nots of the world – but that’s not an anxiety that troubles our sleep much). AIDS tells us that we’ve been injudicious or overly tolerant or inadequately informed about condoms – whichever message appeals. But the obesity epidemic, which seems to carry messages about sloth, addiction, or the “toxic” food supply, is all about what might happen in the future. That’s very powerful.
During the last few avian flu scares, and now with swine flu, the 1918 flu epidemic was brought up repeatedly in the media as evidence that “this can happen again.” Would you classify this as media sensationalism, or is there some legitimacy to the comparison?
I hope we can get away from the spell of 1918. Influenza is a serious public health problem. Even in the U.S., with all our know-how, there are thousands of deaths from flu each year and thousands more from flu-related illnesses, especially pneumonia. Much more needs to be done about the consequences of real flu outbreaks—not the ones the experts claim to foresee, since those haven’t happened, but the ones that people really experience.
When people say that another cataclysmic flu epidemic like 1918’s is inevitable, I hear it as a statement of feeling vulnerable and inadequately protected. The response to that sensation shouldn’t be to try to predict a grimmer future in detail. The future is just too complicated and murky to know what it will look like in its specifics. And nature has always had surprises for us. Better to look at the illness and suffering we see today and try to understand how to reduce it. There is much we still need to know about how changes in commerce, animal husbandry, food production, wealth distribution, the landscape, water use, et cetera create opportunities for animal-human interactions that can promote flu-virus spread. That seems more pressing right now than trying to describe exactly what form a pandemic of flu might take.
But, still, avian flu in particular persists in our armageddon disease scenarios.
Well, with avian flu, the situation over the past decade should remind us that influenza viruses can take advantage of complicated new pathways—there have been over four hundred human cases of H5N1 flu, about two-thirds of them fatal, even though it’s basically a bird virus. The concepts put forth by the One World—One Health movement, which looks at the complex interactions of microbes, the environment, animal raising, food processing, etc., not just at the simple situation of one virus infecting humans—are particularly compelling. I hope we can learn, from avian flu, to take a broad-minded, inclusive view of how to deal with human and animal viruses generally, and lift the burden of suffering caused by viruses passing back and forth among species (human and animal). I don’t think media sensationalism has too much to do with that; it’s more a matter of expansiveness of vision and political will.
Walk into any supermarket in the U.S. and it’s immediately apparent that we’re obsessed with hygiene. Bleach spray, bleach wipes, hand sanitizers, disinfectants, and on and on are everywhere. But what are we really accomplishing with all of this? Are we really heading off an “outbreak”?
It all goes to warding off germs. Germs are a big deal to us, and have been for a century now. Why this is, I discuss in some detail in the book. Here’s the short version: germs provide a focus for our fears, uncertainties, and anxieties.
Not that germs don’t exist. They do. They can be harmful. If there is MRSA going around, say, it’s a good idea to wash your hands thoroughly and often. What does all the antisepsis – the fancy word for “germ killing” – do for us in the larger, population-level picture, though? Personal hygiene does, under certain circumstances, stop outbreaks. Commercial kitchens, hospitals – places where there is a very high potential for rapidly broadcasting potentially dangerous microbes – can do a lot to reduce the likelihood of outbreaks if everyone is careful about hygiene.
So germ phobia isn’t really such a bad thing?
I wouldn’t necessarily say that. It’s not such a crazy thing, but germ phobia has much less impact on population health generally. To prevent outbreaks, you have to have a broader, more ecological view. This is what the One World – One Health people are advocating. You can’t just think about interrupting the pathway leading from germs to people, as if we humans were bowling pins and germs were the ball. The really dramatic downturn in infectious disease mortality that happened in the developed world in the 20th century wasn’t because of antiseptics or even antibiotic medications or vaccines; it was because women gained some control over their reproductive lives so that family size dropped drastically, because food supplies stabilized and nutrition improved, because sanitation systems gave people access to clean water and disposed of sewage, and because housing standards increased. Antibiotics and vaccines were adjuncts, not the main story. Hand washing probably helped, but it wasn’t primary in any way.
And there’s a price to pay for the bowling-alley sort of control that antisepsis, or even antibiotic medication, gives: we encourage the potentially harmful microbes to evolve. Bacteria are the evolutionary champions. Anything we do that encourages them to evolve is only going to make things worse for us in the long run.
We’re inadvertently fostering the revenge of the microbes.
Yes. Antibiotics and anti-microbial sanitizers solve a problem in the short term but they run the risk of encouraging worse problems down the road. That’s the MRSA story – we’ve seen a rapid evolution of antibiotic-resistant bacteria, MRSA most prominently. And it’s a common story: there are also drug-resistant malaria strains, drug-resistant viruses. In fact, some people who study diseases think that all this hygiene is also harming public health in the long run by promoting the rise of allergies, asthma, and disruptions of the immune system.
But there’s one more aspect to the hygiene story: it’s reassuring to feel that we’re fighting off germs, but it’s a false assurance. We live in a world of microbes (in fact, a veritable world of microbes lives inside each of us). Most diseases don’t happen simply because a bacterium or virus barrels down the lane and hits one of us, and no epidemic happens just because one microbe gets loose. It’s always a lot more complicated than that. The hygiene obsession makes us think health is a simple thing when really it’s very complex.
You talk in the book about the recent shift in concern to what we might call social epidemics – obesity for example. It seems like use of the word “epidemic” in these cases is more than just poor word choice by reporters – we’re really invested in labeling these things as epidemics to underline their severity. If this gets the message across, is it such a bad thing?
I’ll answer the question with a question: what’s the message we’re trying to get across when we call something an epidemic? When we talk about obesity, crystal-meth addiction, teen suicide, or other social concerns as epidemics, we’re saying that this is a crisis, it seems scary to us. And we’re also saying that we think that someone is doing something wrong.
In fact, we rarely use the word “epidemic” anymore except when we want to send the message that someone is doing something we don’t like. We say that there’s an economic “downturn,” a housing “slump.” We talk about natural disasters, like floods or earthquakes. But obesity, we say, is “epidemic.”
Now, it’s not crazy to talk about obesity as harmful, since that’s a real possibility: some people can develop diabetes, hypertension, or other conditions as a result of being obese. But are we saying nothing more than, “if you’re overweight, see if you can slim down”? I think there’s more to it than that. If we talk about a flu epidemic there’s a virus to blame, but with obesity, there’s no germ. Instead, we seem to be expressing our sense that our food supply is bad for us, or that people are eating compulsively, or that our appetites have gotten the better of us. In America, we talk about an “epidemic” of crystal-meth addiction (back in the ‘80s, it was the “crack epidemic”) because we’re always ambivalent about chemically induced altered states. We really believe, as the DuPont ads used to say, in “better living through chemistry,” and the success of the pharmaceutical industry corroborates that. But we’re also suspicious of unrestrained appetites, and worried that one demon rum or another is leading us into improprieties.
You make it sound as though there’s an accusation implicit in the use of the word “epidemic.”
Often there is. Sometimes it’s made explicit for us: we’ve become complacent about AIDS, U.S. officials tell us, meaning that we’re engaging in “risky behavior” (read: having a little more fun with sex than the health sector approves of). Physicians in the U.S. and Britain say we’re complacent about childhood obesity, meaning that we’re not bringing up our children right.
That’s why I think we have to be careful to ask what the point is, when someone talks about an epidemic. What is it we’re supposedly doing wrong? Who is being blamed?
The CDC and NIH aren’t immune to breeding hysteria, as you point out in the book. But as with the obesity question above, is there a legitimate role for amping up the alarmism about a potential epidemic if this approach gets peoples’ attention?
The question in my mind is, why do officials want to get our attention? What do they expect us to do? Or, what is it that they expect us to give up?
Often, it seems that officials want to generate panic about some supposed health threat for reasons that have nothing to do with actually reducing harm and suffering. Sometimes it’s to shake loose more funding from Congress. That’s what seems to have happened when the CDC cooked up the Airplane Man hysteria, which I write about in the book. The officials insisted that there was a health emergency because this guy who had TB had traveled on commercial air flights. But it turns out that CDC deals with situations like this fairly often, very quietly. This time, they were worried about the appropriation they get from Congress for TB control, so they made the case high-profile.
Does a political agenda factor in, perhaps?
Oftentimes, yes, there is a political agenda. I wrote about the “Killer HIV” case in New York, when the city’s health commissioner created a public panic about a supposedly drug-resistant HIV strain, and it turned out that he was aiming to soften up policy makers for a change, which he was about to propose, to the state’s HIV Confidentiality Law. And often, it’s just to reassure the public that the officials are doing something. If you, the official agency, tell people there’s a threat but that you’re taking action to ward it off, then every day that nothing bad happens your agency looks like it’s successful. This is what we saw with the bioterrorism scare. There was never a real possibility of epidemics created by purposive spreading of germs, but officials tried to get us to believe they were hard at work to protect us, and their evidence was that there hadn’t been an epidemic!
This isn’t some kind of bad behavior on the part of officials. I do wish they would be more straightforward with the public. But they are themselves in a tricky position. As we saw with the CDC’s TB Bureau and the Airplane Man case, agencies can only get the funding they need by issuing a stream of alerts, and then showing that they are taking action to ward off each imagined threat. They aren’t being asked to assess whether what they have done really accomplished any health improvement.
So official agencies’ attention getting is a two-edged sword. It might alert us to a genuine problem so that we can determine how to address it, but it also puts everyone on edge, keeps the public in a state of hypervigilance, expends a lot of tax-levy funding for no practical purpose, and, as we saw with the bioterrorism scare, allows government agencies to decrease funding to useful harm-avoidance programs, increase surveillance, and limit civil rights in the name of averting “epidemics.”
If you were going to give someone advice on how to reasonably think about disease, as opposed to always fearing the worst, what would it be?
At the end of Tony Kushner’s Angels in America plays, the character Prior says, “The world only spins forward… More life.” That’s not a bad exhortation, and we might think of it as advice.
How does this relate to your question? Life is never without risk. Everybody dies. We should stop thinking that, by studying the supposed mistakes that others have made we are going to keep ourselves healthy forever. In the meantime, why not live deeply? And be more tolerant of other people’s desire to live their lives deeply, even dangerously, knowing that life is necessarily short.
Eat, drink and be merry for tomorrow we die.
Sure, but this isn’t a plea for hedonism. On the contrary, I wish we were much less involved with our personal self-esteem and individual sense of well-being, and a lot more forthright in trying to limit human suffering.
But we don’t need to spend all our energy trying to detect risk and stamp it out. Almost nothing you do or your neighbor does is going to result in a great plague that harms you or your family. On the other hand, nature remains “red in tooth and claw,” as Tennyson put it: not in the least benign, always in the midst of tearing into what we thought was a comfortable existence. Sometimes it’s a flood or airplane crash that does that, sometimes some previously unheard-of (and un-thought-about) disease, sometimes it’s one of the old stand-bys like tuberculosis. Of course we should do what we can to keep people healthy. But we shouldn’t be taken in by the illusion that just because we switch from processed to organic foods, abstain from casual sex, forswear crystal-meth use, or curtail our television watching and get to the gym, we can somehow fool nature. Life is still finite, and you almost never know when your number is up. So: More life.
Link to Philip Alcabes’ website
Link to an interview with Philip Alcabes on The Daily Show with Jon Stewart
Link to Dread on Amazon