America and the World since September 11

Michael Sherry. Magazine of History. Volume 25, Issue 3. July 2011.

The hijacked airliners slamming into Manhattan’s World Trade Center, Washington’s Pentagon, and a field in Shanksville, Pennsylvania, on September 11, 2001, seemed to rip the fabric of American history. In the aftermath of the attacks, panicked national leaders quickly asserted that everything had changed. The nation was in a war “unlike any other we have ever seen,” President George W. Bush announced, one waged against the terrorist organization al-Qaeda, which carried out the 9/11 attacks, and its apparent leader, Osama bin Laden. Vice President Richard Cheney reportedly now rode in his armored car with “a duffel bag stocked with a gas mask and a biochemical survival suit.” George Tenet, from his office as Director of the Central Intelligence Agency, simply said, “We’re fucked.” “Holy Fucking Shit Attack on America,” screamed the satirical newspaper The Onion on September 26, both channeling and parodying the national mood. To be sure, reactions to 9/11 among Americans were not monolithic, with proximity to the attacks and relationships to the dead the most obvious sources of variation. Still, the shock Americans felt on 9/11 was unmatched by any national catastrophe since the assassinations of John Kennedy, Martin Luther King, Jr., and Robert Kennedy in the 1960s.

The responses of high officials underline what historians understand about catastrophe: it is less the event itself than the context in which it occurs and the reactions it generates that determine whether it changes history. History is in good part the story of catastrophes, but most are not game-changers. Over one thousand people died at a New York City dock when the steamer General Slocum caught fire in 1904, another 1,517 when the British liner Titanic sank on April 15, 1912, and 844 were killed when the Eastland rolled over at a Chicago dock in 1915. But none of those events changed history, in part because shipwrecks (usually of lesser magnitude) were commonplace and because no widely perceived enemy had caused the disasters.

But it was difficult to lodge the 9/11 attacks in the familiar. Even the precedent most cited—the Japanese attack on Pearl Harbor on December 7, 1941—was far back in time and far different in circumstance. Many Americans had expected war with Japan, just not Pearl Harbor as the site of an attack. No significant attack by foreign agents on the continental United States had occurred since the War of 1812, not to mention one carried out by such bizarre and apparently unforeseen means—a terrorist hijacking of civilian airliners, rather than the assault by enemy bombers or missiles periodically imagined by Americans since the 1920s. The shock was nearly universal among Americans and foreign observers, not just at the death toll (nearly three thousand), but also at the iconic nature of the buildings attacked and their naked vulnerability to assault, gruesomely evident when the twin towers collapsed. For the shock it induced, 9/11 ranks with the stock market crash of 1929, Pearl Harbor, the U.S. atomic attack on Hiroshima on August 6, 1945, and few other days in modern U.S. history. And for the consequences it unleashed, 9/11 may also rank with those history-bending events. This essay evaluates that possibility: how much discontinuity, and of what sort, was triggered by the 9/11 attacks?

Situating Shock in Historical Context

The case for discontinuity is obvious and powerful. The attacks immediately gave direction to a rather listless Bush presidency. They prompted Bush to offer an extravagant promise on September 14 at the National Cathedral in Washington: the nation’s “responsibility” was now nothing less than “to rid the world of evil.” They led pundits and politicians to offer a blunt embrace of America’s imperial role in the world. They sparked the U.S. war against al-Qaeda and the Taliban regime in Afghanistan (aerial attacks began in October; ground troops followed in January). They ushered in an enormous panoply of security measures, legal and illegal, including but not limited to surveillance, capture, torture, and deportation, measures authorized in part by the USA Patriot Act, which Congress passed hurriedly in October. Soon followed Bush’s declaration of America’s right to conduct preventive war—“If we wait for threats to fully materialize, we will have waited too long,” he told cadets at West Point on June 1, 2002—and his determination to go it alone at war if others would not join America’s side. Then came his decision to invade Iraq in March 2003, even though international inspectors had found little evidence of the Iraqi “weapons of mass destruction” that the administration (and British Prime Minister Tony Blair’s government) darkly warned about. Those actions were phases of a global war on terror of vast scale, enormous expense, murky nature, and much-debated value.

Yet every reaction generated by the attacks had deep roots in the pre-9/11 past, as many historians soon noted. The shock itself was instantly historicized, above all by analogies to Pearl Harbor, in ways that grounded it in the past. Bush’s presidency was hardly the first to be re-booted (and ultimately undermined) by war: something similar happened to Harry Truman’s presidency with the outbreak of the Korean War in 1950. The resort to war in response to attack was characteristic for the U.S., as it is for most nations able to muster a response. The “global war on terror” (a semi-official term introduced by the Bush administration in December 2001) had begun much earlier. Ronald Reagan had lectured the United Nations about the “war on terrorism” in 1986, and his stance was echoed by the Bill Clinton White House in the wake of a February 1993 terrorist explosion under one of the World Trade Center towers. That war had been fictionalized for movie audiences, who watched George Clooney and Nicole Kidman barely avert a nuclear explosion in New York in The Peacemaker (1997). An invasion of Iraq, intermittently bombed by the U.S. under Clinton, was already under consideration by the Bush administration before September 11. Bush’s extraordinary pledge to “rid the world of evil” had deep roots in the traditions of American exceptionalism; many presidents had promised that the U.S. would bring the world liberty and peace (if not the end of “evil”). The muscular presidency rolled out after 9/11 had been developing for decades; Vice President Cheney had for a quarter-century bewailed restrictions on the executive branch imposed after America’s war in Vietnam and the Watergate scandal. The web of security procedures that enshrouded airports had been tightening since 1970. Bush’s proclamations about unilateral action and preventive war echoed Cold War Republican conservatism and had precedent in American strategic doctrine—the U.S. had never clearly renounced the first use of nuclear weapons in a war, for example.

The major (and not trivial) difference was that Bush elevated the principles of unilateral action and preventive war to the level of explicit, loud, official presidential doctrine. Scale also counts—the new “global war” was bigger and costlier than Clinton’s campaign against terrorism. But the war on terror would have expanded without the 9/11 attacks, and as one careful scholar puts it, “9/11 shocked America but did not transform the politics of national security.” Nor was post-9/11 xenophobia surprising. The state’s surveillance and deportation of foreign-born Middle Easterners and Muslims, like popular expressions of Islamophobia, fit well-worn grooves of American hostility toward alien groups in times of stress. (See Moustafa Bayoumi’s article in this issue.) And continuity emerged when President Barack Obama, though discarding Bush’s strutting style and egregious practices, continued (sometimes at Congress’s insistence) most of his policies, carried on his wars, expanded the covert U.S. war in Pakistan, and joined a multilateral intervention in Libya in 2011.

The development most intensely defended and attacked as a break from the past was the Bush administration’s torture, rendition, and other abuse of alleged terrorists and prisoners captured by or handed over to American forces. Many imprisoned at the U.S. naval base at Guantánamo Bay, Cuba, for instance, were subjected to waterboarding, electric shock, sexual degradation, and other abusive techniques. Defenders touted these practices as part of a “New Paradigm” and a resort to the “dark side,” as Vice President Cheney put it on Meet the Press shortly after 9/11, necessitated by a new kind of war. Critics too judged these practices a sudden break from the past, though in their eyes a sickening one that abandoned an American tradition of humane practice and adherence to international law. (See Martin Flaherty’s article in this issue.) Yet like many nations and groups at war, U.S. forces had in the past tortured (and been tortured by) enemies—some examples include campaigns against Native Americans, the waterboarding of Filipinos during the Philippine-American War of 1899-1902, the handling of Japanese personnel during World War II, and the Phoenix program, which organized assassinations and torture of National Liberation Front forces during the Vietnam War. There were also precedents for post-9/11 abuses in America’s “war on crime,” which had greatly expanded before 9/11 and included its own abusive practices, sometimes carried out by the same personnel who later went abroad to find and guard prisoners.

What was new after 9/11 was not these practices, but their authorization by the highest officials, the brazenness with which those officials simultaneously defended and denied abuse (“we do not torture,” Bush repeatedly said), their resort to suspect legal reasoning, and their willful obliviousness to legal and practical considerations. Those condoning torture ignored knowledgeable career military officers, lawyers, and judges, as well as Federal Bureau of Investigation officials, who resisted the administration’s dissolution of restraint and warned that torture was a dubious, even misleading, way to gain intelligence, one sure also to inflame Muslim sensibilities against the U.S. Perhaps political leaders ignored such advice because their purpose was less to gain useful intelligence than to display American power. They would show enemies and Americans alike that the shackles were removed from its exercise, as they were from Jack Bauer when he raced the ticking clock in 24, the popular Fox television series that debuted on November 6, 2001, and ran until May 2010. As the practice of torture suggested, the state’s responses to 9/11 for the most part revived and expanded past practices and implemented long-held impulses. The result was abundant change for a nation that, since the Civil War, had evolved within fairly narrow parameters (no revolutions here, even if we do talk of a “civil rights revolution” and the like). But however big the change, it stopped short of rupture.

The Politics of Disruption

Perhaps the biggest discontinuity in state policy was one that most Americans did not foresee, desire, or immediately notice, for it crept up on them: two long wars far from U.S. shores. Both found the U.S. bogged down against shadowy enemies, allied with fragile new regimes, and caught up in wars that it had helped trigger. Never before had the U.S. waged extended war in the Middle East, or any overt war in Afghanistan. Nor was there precedent for waging two simultaneous wars, except perhaps World War II, when the U.S. waged two substantially distinct wars against Germany and Japan. But this discontinuity was at a considerable distance from 9/11; the attacks that day neither compelled nor immediately unleashed the new American wars. This suggests again how little discontinuity inheres in an event itself. Discontinuity was not so much done to us as done by us.

That discontinuity also involved an important political change. The Republican Party, long the party of war talk more than war itself, now became the biggest proponent of war. Since 1917, Democratic administrations had led the nation into every major war it fought, until President George H. W. Bush’s Gulf War in 1991. Even the Gulf War, however, had adhered roughly to the post-Vietnam Caspar Weinberger-Colin Powell doctrine of the 1980s. The U.S., that is, would go to war only if its national security was clearly threatened, victory and exit were readily foreseeable, and national and international support were broad and sustainable. In going to war in Afghanistan and Iraq, the George W. Bush administration either ignored that doctrine or naively assumed that it applied, plunging into the very “nation-building” that Bush and others had accused Clinton of engaging in, but with the help of only reluctant, arm-twisted, well-bribed, or insignificant allies in a “coalition of the willing.” The ensuing wars came at great cost to Americans (by the end of 2010, over 4,000 U.S. military personnel had died in the Iraq War and nearly 1,500 in the Afghanistan War) and far greater cost to the populations warred over (estimates vary wildly, but at least 100,000 Iraqis and tens of thousands of Afghans had died by 2011). Contributing mightily to a swift shift from surpluses to deficits in the U.S. federal budget, these wars raised anew the danger of “imperial overstretch,” about which historian Paul Kennedy had warned in his 1987 book The Rise and Fall of the Great Powers. The more the U.S. wielded its power abroad, the less influence it seemed to have, and that influence diminished further when America’s model of lightly regulated capitalism lost credibility with the Great Recession that began in 2007.

Yet the discontinuity that the Afghan and Iraq wars entailed was not deeply felt by most Americans. The geographic and cultural remoteness of the wars, their hard-to-track course, and the uncertainty of their outcomes deprived them, after their initial stages, of a familiar narrative arc. Moreover, the American burden of these wars fell on a professional force (and its civilian contractors) largely insulated from the rest of the nation, while the greater suffering of Iraqis and Afghans faded from American media attention.

Of course, state policy is only one way to measure continuity and discontinuity, with public sensibilities being another. Indeed, perhaps the biggest discontinuity was simply the pervasive sense of discontinuity—the fear that the war on terrorism was unlike any war in the American past, that 9/11 was a gash across American history, and that history had been shut down and restarted, which is how I experienced the days after 9/11. An explosion of conspiracy theories about who or what lay behind the attacks was one measure of the shock; perhaps, conspiracists speculated, 9/11 was an inside job carried out by the Bush administration or Jewish schemers. Another measure lay in the vast and varied memorial practices undertaken after 9/11. The photos of the disappeared posted in Manhattan, the moving service at the National Cathedral on September 14, the mournful sounds of Samuel Barber’s Adagio for Strings in repeated broadcasts, and the biographies of the dead posted in the New York Times “Portraits of Grief” richly expressed the memorial impulse long growing in American life. They gave voice to both grief and fear that the world had been ripped asunder.

Yet that memorial impulse dissipated with surprising speed. It was soon swallowed up in Byzantine political struggles over what to do with the World Trade Center site. It was curtailed by the rapid disappearance from the public sphere of troubling 9/11 images like those of people leaping from the twin towers. It was constrained because the attacks left behind few survivors from the buildings struck to offer memorial testimonies to the event, as survivors of Pearl Harbor and Hiroshima had for decades. People either escaped intact or perished that day. It was shrunk by the emergence of a narrow nationalist frame for understanding 9/11. Despite the many foreigners from some ninety countries killed in the attacks and an outpouring of international support for the U.S., Deputy Assistant Attorney General John Yoo referred to how “a foreign entity can kill 3,000 Americans.” The memorial impulse was also baldly conscripted for political ends by the Republican Party, denuding it of its we’re-all-in-this-together feel. And it was overtaken by accusations, some from survivors of 9/11 victims, that the administration could have foreseen and forestalled the attacks, by investigations into its pre-9/11 conduct, and by debates and court proceedings over its treatment of captives. That is, the story shifted from what happened on 9/11 to how the Bush presidency anticipated and responded to it. Meanwhile, in comparison to previous wars, movies and television dramas about 9/11 and the ensuing wars were few and seen by few. (See Lary May’s article in this issue about post-9/11 film.)

Those wars did keep the memory of 9/11 alive, if only because politicians invoked 9/11 as a justification for them. But that link shriveled when administration claims about ties between Iraqi leader Saddam Hussein and al-Qaeda, and about Hussein’s “weapons of mass destruction,” were discredited. Most Americans viewed these wars at such a literal, psychic, and political distance, widened by administration decisions to keep the caskets of American war dead out of public view, that they served poorly to nourish the memory of 9/11. Perhaps the Iraq War came most alive for many Americans early in 2004 with the stories, photographs, and video images of American torture of Iraqis at Abu Ghraib prison in Baghdad. But the issues raised by those revelations tracked only loosely back to 9/11, while the feckless quest to capture or kill Osama bin Laden gradually faded from view (until his spectacular assassination by an elite U.S. Navy SEAL team in Pakistan in May 2011). As the wars abroad turned sour, 9/11 receded: “It was as if Americans forgot 9/11 in order to forget the wars that followed it,” as historian Michael Allen has put it to me. By 2004, 9/11 was simultaneously everywhere in American life at some immeasurable psychic level—mental health experts spotted widespread but hard-to-quantify damage—yet distant from the lived reality of most people. As memorialization shrank and scattered, so too did the sense of shock it expressed.

Most Americans were left with a powerful sense of disruption, but also without the daily reality that would give it palpable meaning. To be sure, as with most grand generalizations, this one has exceptions. To those who witnessed the destruction of 9/11 firsthand, who lost loved ones in it, or who suffered injury or broken health in the mammoth rescue and clean-up efforts, the disruption was acutely lived, perhaps never to be lost. To those deported (probably several thousand), jailed, or under suspicion by neighbors and law-enforcement agencies, there was no easy return to normality. For those who marched off to war and those who knew them (in addition to regular forces, three-quarters of a million National Guard and reserve personnel were called to active duty during the decade), disruption was sharp and often repeated, since many endured additional tours of duty in the war theaters. For those who viewed the destruction fleetingly—millions of residents, tourists, and visitors passed through Lower Manhattan in the months after 9/11—disruption was something less, if no less meaningful. For those, far more numerous, who saw and heard about it from afar, it was a horror movie, frightening, yet not much experienced in their lives. Perhaps the most common experience of disruption involved the new anti-terror measures imposed on air travel and other public activities. But those measures added new layers to an existing system and were, as they were designed to be, quickly normalized. For most Americans, life resumed in familiar fashion within weeks or months of 9/11, albeit a normal life shadowed by fear.

That was what people in power wanted. For all that the Bush administration broadcast its message of a changed nation engaged in a war “unlike any other,” it simultaneously insisted that little need change. As the airline industry buckled and the holiday season approached in 2001, Bush urged Americans to “get down to Disney World,” shop at malls, and hop on airliners. Many Americans believed that to shop and travel as normal would constitute, as Bush put it on November 8, “the ultimate repudiation of terrorism.” In other ways, too, the administration insisted on normality—in its refusal to raise taxes to pay for the “global war on terrorism,” for example, and its sharp-edged partisan politics. The result was a schizoid stance by the administration, leaving Americans in a twilight zone: everything had changed, and nothing really had. This gap between the profession of shock and the performance of normality shaped the post-9/11 landscape. It was perhaps the biggest discontinuity to emerge after 9/11.

Conclusion

Still, the historian—at least this historian—is left surprised by how little changed after 9/11. Of course, historians are expert at the parlor game of finding continuity and discontinuity; we can spin it either way, as any reader of this essay could. But the continuity is notable. As historian Marilyn Young put it in 2003, “Neither compassion nor spiritual renewal occurred” after 9/11, despite hopes that they would. “Instead, as is sometimes the case with individuals facing crises, the country became even more itself, almost to the point of caricature.” How could such a catastrophic event bring about such limited change? One answer might be the fundamental conservatism and stability of the nation’s institutions, politics, culture, and customs. Resistance to the more extreme steps of the Bush administration also limited change. That resistance came from courts, military lawyers, advocacy groups, Muslim Americans, and others, though less often from national politicians; a notable exception was the aged, Shakespeare-quoting Senator Robert Byrd, his voice quivering as he denounced the Iraq War—“We flaunt our superpower status with arrogance”—and other administration initiatives. The continued pursuit of familiar interests also moved history back into familiar grooves, while new challenges—Hurricane Katrina in 2005, the Great Recession in 2007—changed the subject. And of course there was no national mobilization of people and resources on the wrenching scale that World War II entailed.

But there was no such mobilization because there did not have to be. The threat was frightening but limited, and no further major terrorist attacks on the U.S. (though significant ones elsewhere) ensued to ramp it up. What gave World War II its gravity as a history-changing event was not only the attack on Pearl Harbor, but the near-global scale of Axis assaults on American allies, interests, and possessions before and after December 7, and the huge mobilization required to defeat Germany and Japan. However shocking and monstrous, 9/11 was a rogue disaster, not one in a long chain of catastrophes. With great relief we can say that 9/11 was not 12/7.

On March 1, 2004, President Bush proudly took possession of a pistol taken from Saddam Hussein when the Iraqi leader was captured on December 13, 2003. Later, Bush eagerly showed the pistol to White House visitors and made sure it would be displayed at his presidential library in Dallas. It was a fitting token of his tough-talk Texas style of squaring off against terrorists. But it was also a rather pathetic emblem of the course he and the nation had traveled since 9/11. Instead of the scalp of Osama bin Laden, he had only the pistol of a second-rate tyrant who had nothing to do with the 9/11 attacks. Instead of Iraqi “weapons of mass destruction,” he could only show off one small gun. The media barely noticed Bush’s fondness for that pistol by the time he left office, another indication that 9/11 was leaving behind a dark but indistinct mark on American life. The war against terrorism “will end in a way and at an hour of our choosing,” Bush had pledged on September 14, 2001. But that was not to be. While the killing of Osama bin Laden by American forces on May 1, 2011, elicited satisfaction from many Americans, it came too long after the 9/11 tragedy to provoke full-throated triumphalism, even from some survivors. “I just can’t find it in me to be glad one more person is dead, even if it is Osama bin Laden,” commented Harry Waizer, who escaped the World Trade Center after the attack. “You know the dead are still dead,” Waizer noted. “So in that sense, there is no such thing as closure.”