
The Phantom Menace: The Myth of American Isolationism

By Peter Beinart

In an op-ed last year in The Washington Post, former Sens. Joe Lieberman and Jon Kyl warned of “the danger of repeating the cycle of American isolationism.” That summer, Post columnist Charles Krauthammer heralded “the return of the most venerable strain of conservative foreign policy: isolationism.”

New York Times columnist Bill Keller then fretted that “America is again in a deep isolationist mood.” This November, Wall Street Journal columnist Bret Stephens will publish a book subtitled The New Isolationism and the Coming Global Disorder.

What makes these warnings odd is that in contemporary foreign policy discourse, isolationism—as the dictionary defines it—does not exist. Calling your opponent an “isolationist” serves the same function in foreign policy that calling her a “socialist” serves in domestic policy. While the term itself is nebulous, it evokes a frightening past, and thus vilifies opposing arguments without actually rebutting them. For hawks eager to discredit any serious critique of America’s military interventions in the “war on terror,” that’s very useful indeed.

TO GRASP HOW little basis today’s attacks on “isolationism” have in reality, it’s worth understanding what the term “isolationism” actually means. Merriam-Webster defines it as “the belief that a country should not be involved with other countries.” The Oxford dictionaries call it “a policy of remaining apart from the affairs or interests of … other countries.”

When critics decry isolationism today, they usually map that dictionary definition onto a particular historical period: the 1920s and 1930s. Warnings about isolationism almost always come with the same historical morality tale: America turned inward in the interwar years, and the world went to hell. That’s what makes “isolationism” scary. Like “socialism,” it’s a euphemism for “Hitler and Stalin are coming.”

The problem is that isolationism—as commonly understood—not only doesn’t fit American foreign policy today, it doesn’t even fit American foreign policy in the 1920s and 1930s. There are plenty of valid critiques of how the United States comported itself on the world stage between World War I and World War II. But the claim that America detached itself from other countries is simply not true. In 1921, for instance, President Harding summoned the world’s powers to the Washington Naval Conference and pushed through what some have called the first disarmament treaty in history. In 1924, after Germany’s failure to pay its war reparations led French and Belgian troops to occupy the Ruhr Valley, the Coolidge administration ended the crisis by appointing banker Charles Dawes to design a new reparations-payments system, which Washington muscled the European powers into accepting. American pressure helped to produce the 1925 Treaty of Locarno, which guaranteed the borders between Germany and the countries to its west (though not, fatefully, to its east). In 1930, President Hoover played a key role in the London Naval Conference, which placed further limits on naval construction.

Dr. Seuss drew many anti-isolationism cartoons during the early 1940s. (PM Magazine/Dr. Seuss)

Again and again during the interwar years, the U.S. deployed its newfound economic power to shape politics in Europe. And this overseas engagement wasn’t limited to America’s government alone. Although the United States severely limited European immigration in the 1920s, Americans built the avowedly internationalist institutions that would help guide the country’s foreign policy after World War II. The Council on Foreign Relations was born in 1921. The University of Chicago created America’s first graduate program in international affairs in 1928. And during the interwar years, American travel to Europe expanded dramatically. To be sure, the U.S. in the interwar years was more comfortable intervening economically and diplomatically than militarily. But despite the Neutrality Acts meant to keep the U.S. out of another European war, the Roosevelt administration began sending warplanes and warships to Britain two years before Pearl Harbor. By early 1941, long before America officially entered the war, its ships were already hunting German vessels across the Atlantic.

The only sense in which the United States in the interwar years truly remained apart from other nations lay in its refusal to make binding military commitments, either via the League of Nations or through alliances with particular nations. America wielded power economically, diplomatically, and even militarily, but it jealously guarded its sovereignty. That’s why one influential history of the era dubs U.S. foreign policy between the wars “independent internationalism.” (The last prominent spokesperson for that form of independence was Sen. Robert Taft of Ohio, who during the early Cold War opposed NATO because it required that America pledge itself to Europe’s defense, but who endorsed an all-out war with China to reunify Korea under Western control.) The popular “characterization of America as isolationist in the interwar period,” argues Ohio State University’s Bear Braumoeller in a useful review of the academic literature on the period, “is simply wrong.”

IF CALLING AMERICA isolationist in the 1920s and 1930s is wrong, calling America isolationist today is absurd. The United States currently stations troops in more than 150 countries. Its alliances commit it to defend large swaths of Europe and Asia against foreign attack. Recent presidents have dropped bombs on, or sent troops to, Kuwait, Iraq, Afghanistan, Bosnia, Kosovo, Somalia, Sudan, Syria, Libya, Pakistan, and Yemen. Last month, President Obama sent 3,000 American troops to battle an Ebola outbreak in West Africa. And while Americans fiercely debate particular military interventions and foreign-aid programs, the general presumption that the United States should play a leading role in solving problems far from our shores is largely uncontested in the American political mainstream.

Just how uncontested becomes clear when you examine the foreign policy evolution of Rand Paul, the man frequently held up as the leader of his party’s isolationist wing. As a Senate candidate in 2009, Paul mused about reducing America’s military bases overseas. In 2011, soon after entering the Senate, he suggested eliminating foreign aid. He has also repeatedly insisted that only Congress, and not the president, can declare war (a position that Barack Obama championed when he was in the Senate as well).

Even these views did not make Paul an isolationist. He has never questioned America’s membership in NATO, for instance, or its security alliance with Japan, the cornerstones of America’s post-World War II global role. But in Paul’s early days on the national political stage, his foreign policy instincts did diverge substantially from the ones that held sway in official Washington.

What has happened since shows just how hegemonic America’s globalist consensus actually is. For starters, Paul’s efforts to dial back American interventionism went nowhere. His Senate bill to end foreign aid to Egypt, Pakistan, and Libya got 10 votes. A later bid to reduce America’s overall aid budget from $30 billion to $5 billion garnered 18 votes. This at a time when, according to Bill Keller, America was in “a deep isolationist mood.”

Moreover, Paul’s own views have become markedly more conventional. After first saying that the U.S. should not “tweak” Russia for its aggression in Ukraine, Paul later called for imposing harsh sanctions on Moscow, reinstalling missile-defense systems in Poland and the Czech Republic, and boycotting the Winter Olympics in Sochi. On ISIS, Paul has followed a similar path. After expressing initial skepticism about the value of air strikes, he now says, “If I had been in President Obama’s shoes, I would have acted more decisively and strongly against ISIS.”

Were Paul really an isolationist, his approach to the Middle East would be straightforward: Extricate America from the region and stop giving its people reasons to hate us. But he has explicitly repudiated that view. “I don’t agree that absent Western occupation, that radical Islam goes quietly into that good night,” he said in a speech last year. “Radical Islam is no fleeting fad but a relentless force.” Paul has even attacked Obama for “disengaging diplomatically in Iraq and the region.”


Instead, over the last year, Paul has developed an approach patterned on the internationalist thinking that influenced foreign policy elites during the Cold War. In a speech last February, Paul said the United States should contain jihadist Islam the way George Kennan envisioned containing Soviet Communism. For Kennan, containment represented an alternative to both isolationism and war. It required buttressing partners that could halt the expansion of Soviet power without trying to roll it back, since that would risk war. Whether one can usefully transfer the concept of containment to the current “war on terror” is questionable. But in invoking Kennan, Paul was expressing a preference for steady, cautious, long-term American engagement in the Middle East—hardly what you’d expect from an isolationist.

Besides containment, Paul’s other watchword is “stability.” “What much of the foreign policy elite fails to grasp is that intervention to topple secular dictators has been the prime source of that chaos,” he said last month. “From Hussein to Assad to Qaddafi, we have the same history. Intervention topples the secular dictator. Chaos ensues, and radical jihadists emerge. … Intervention that destabilizes the region is a mistake.”

Against both liberal interventionists and “neoconservatives” who support intervention to produce more democratic, pro-Western regimes, in other words, Paul wants the United States to support the Arab world’s traditional, comparatively secular autocrats, because at least they keep the region under control. His core argument with hawks such as John McCain and Lindsey Graham is not over whether America should withdraw from the Middle East. It’s over whether America should use its influence there to prop up the old order or usher in something new. That’s why Paul now peppers his speeches with quotes from Colin Powell, Robert Gates, and Dick Cheney circa 1991, policymakers who cut their teeth in the more risk-averse but still undoubtedly internationalist Republican Party of Henry Kissinger and George H.W. Bush. As Jason Zengerle recently pointed out in The New Republic, Paul’s foreign policy has become a fairly standard brand of realism, with some anxiety over unchecked presidential power thrown in.

Critics see this as cynical. Paul, as numerous articles have noted, has grown more hawkish as he’s courted the donors he needs to fund his likely presidential campaign. But the fact that Paul is, by necessity, drawing closer to a foreign policy consensus he once challenged is evidence not of that consensus’s weakness, but of its strength.

THAT CONSENSUS WITHIN the political class is not built upon big-dollar donations alone. There are certainly differences between how party elites want the United States to behave around the world and what ordinary citizens desire. But contrary to much media commentary, isolationism is not only largely absent from foreign policy discourse in Washington. It’s also largely absent from foreign policy discourse among the public at large.

Last December, a poll by the Pew Research Center found that, by 52 percent to 38 percent, Americans wanted the U.S. to “mind its own business internationally,” the largest gap in a half-century. The poll sparked a torrent of journalistic anxiety. “American isolationism,” fretted a Washington Post headline, “just hit a 50-year high.”

But upon closer examination, it becomes clear that Americans don’t actually want their country to “mind its own business” overseas at all. The same Pew poll that supposedly revealed Americans to be isolationists also found that, by a margin of more than 40 percentage points, they believe that “greater U.S. involvement in the global economy is a good thing.” Fifty-six percent of respondents told Pew the United States should “cooperate fully with the United Nations.” Seventy-seven percent agreed that, “in deciding on its foreign policies, the U.S. should take into account the views of its major allies.” And a clear majority opposed the idea that “since the U.S. is the most powerful nation in the world, we should go our own way in international matters.” In that same vein, a recent study by the Chicago Council on Global Affairs found that 59 percent of Americans want the U.S. to maintain its overseas military deployments at current levels. It also found that when told how much the U.S. spends on defense and foreign aid, Americans urge cutting the former but want the latter to go up.


How can a public that endorses greater economic globalization, far-flung military bases, extensive coordination with American allies and the United Nations, and higher foreign aid also say it wants the U.S. to “mind its own business” internationally? The answer lies in the way Washington elites have defined America’s international “business.” In recent years, America’s highest-profile overseas behavior has been its military interventions, either directly or via proxies, in Afghanistan, Iraq, Libya, Syria, and, at one point, potentially Ukraine. When Pew conducted its poll in late 2013, it was those interventions that Americans rejected, not international engagement, or even military action, per se.

The Chicago Council poll teased out the distinction. Like Pew, it uncovered an ostensibly high level of isolationism: Forty-one percent of respondents said it would “be best for the future of the country” if “we stay out of world affairs.” But when the council dug deeper, it found, “Even those who say the United States should stay out of world affairs would support sending U.S. troops to combat terrorism and Iran’s nuclear program. However, many of the conflicts in the press today—for example, in Syria and Ukraine—are not seen by the public as vital threats to the United States.” It’s no surprise, therefore, that since September, when the ISIS beheadings convinced many Americans that the chaos in Iraq and Syria might threaten them, the percentage supporting military action in those countries has shot up.

In important ways, in fact, the standard claim that elites must overcome the ingrained isolationism of ordinary Americans gets things backward. When it comes to working through the U.N. or paying heed to America’s allies, the public is more sympathetic to international cooperation than are many Beltway insiders. In official Washington, for instance, it is virtually taken for granted that America must remain the world’s lone superpower. By contrast, ordinary Americans, according to Pew, overwhelmingly want America to play a “shared leadership role” with other countries. Only 12 percent want America to be the “single world leader,” the same percentage who want America to play “no leadership role” at all.

GIVEN THE OVERWHELMING evidence, both from politicians and the public, that isolationism in America today is virtually nonexistent, why do so many high-profile commentators and politicians depict it as a grave threat? One clue lies in a word that these Cassandras use as a virtual synonym for isolationism: “retreat.” If the subtitle of Bret Stephens’s forthcoming book is The New Isolationism and the Coming Global Disorder, its title is America in Retreat. In their op-ed warning of a new “cycle of American isolationism,” Lieberman and Kyl employ variations of “retreat” or “retrench” six times.

But “isolationism” and “retreat” are entirely different things. Isolationism has a fixed meaning: avoiding contact with other nations. Retreat, by contrast, only gains meaning relatively. The mere fact that a country is retreating tells you nothing about the extent of its interactions overseas. You need to know the position it is retreating from.


Herein lies the rub. In general, the isolationism-slayers are far more comfortable bemoaning American retreat than defending the military frontiers from which America is retreating. That’s because those frontiers, which reached their apex under George W. Bush, were both historically unprecedented and historically calamitous.

To realize how historically unprecedented they were, it’s worth remembering how much more circumscribed America’s military ambitions were under Ronald Reagan. He could not have imagined sending ground troops to invade Afghanistan or Iraq. For one thing, both countries were clients of the Soviet Union. For another, the bitter legacy of Vietnam made sending hundreds of thousands of troops to overthrow a government half a world away inconceivable. During his eight years in office, Reagan invaded only one foreign country: Grenada, whose army boasted 600 troops. In his final year in the White House, when some administration hawks suggested he invade Panama, Reagan adamantly refused. The idea struck him as far too risky.

Equally inconceivable was the idea of deploying American troops on former Soviet soil. One of the disputes that initially led hawks to label Rand Paul an isolationist was the Kentuckian’s 2011 opposition to admitting the former Soviet republic of Georgia into NATO, an issue that put him in conflict with fellow GOP rising star Marco Rubio. But if Paul is an isolationist because he opposes an American military guarantee to defend Georgia, what does that make James Baker, who in 1990 reportedly promised Mikhail Gorbachev that if Moscow allowed Germany to reunify, NATO would not expand “one inch” further east: not even into East Germany, let alone the rest of Eastern Europe, let alone the former Soviet Union itself?

Between Reagan’s presidency and Obama’s, America’s military frontier advanced to fill the gap left by the collapse of Soviet power. Aspects of that expansion turned out well. George H.W. Bush reestablished Kuwait’s sovereignty in the first Persian Gulf War; Bill Clinton helped stabilize southeastern Europe by waging war to stop Slobodan Milosevic’s rampage through Bosnia and later Kosovo; countries such as Poland, Hungary, and the Czech Republic have prospered under NATO protection.

But in Afghanistan and Iraq, America’s forward march turned catastrophic. More than twice as many Americans have died in those two wars as in the September 11 attacks that justified them. A 2013 study by Linda J. Bilmes of Harvard’s Kennedy School of Government estimates that they will ultimately cost the United States between $4 trillion and $6 trillion. As a result, she argues, their financial legacy “will dominate future federal budgets for decades to come.”

Obama has made mistakes in his retreat from those wars. (I’ve been particularly critical of him for disengaging diplomatically from Iraq while Nuri al-Maliki was pushing his country’s Sunnis into the arms of ISIS.) But the notion that Obama should not have retreated—that he should have defended a historically unprecedented military frontier in wars that were causing America debilitating long-term fiscal damage and snuffing out thousands of young American lives, against insurgencies that posed no direct or imminent threat to the United States—is hard to forthrightly defend. Which is why hawks rarely defend it. Instead, they equate retreat with isolationism and isolationism with a fictionalized account of the 1920s and 1930s. And, presto, Obama becomes a latter-day Neville Chamberlain while they become heirs to Winston Churchill rather than to a guy named Bush.

Hawks worried that Barack Obama, or Rand Paul, or the American people have not defended American interests forcefully enough in Iraq, Syria, Ukraine, or Iran can make plenty of legitimate arguments. Calling their opponents “isolationists” isn’t one of them. It’s time journalists greet that slur with the same derision they currently reserve for epithets like “socialist,” “fascist,” and “totalitarian.” Then, perhaps, we can have the foreign policy debate America deserves.

Peter Beinart is an associate professor of journalism and political science at the City University of New York.


The Importance of Being Exceptional
From Ancient Greece to Twenty-First-Century America
By David Bromwich

TomDispatch.com, October 23, 2014

The origins of the phrase “American exceptionalism” are not especially obscure. The French sociologist Alexis de Tocqueville, observing this country in the 1830s, said that Americans seemed exceptional in valuing practical attainments almost to the exclusion of the arts and sciences. The Soviet dictator Joseph Stalin, on hearing a report by the American Communist Party that workers in the United States in 1929 were not ready for revolution, denounced “the heresy of American exceptionalism.” In 1996, the political scientist Seymour Martin Lipset took those hints from Tocqueville and Stalin and added some of his own to produce his book American Exceptionalism: A Double-Edged Sword. The virtues of American society, for Lipset — our individualism, hostility to state action, and propensity for ad hoc problem-solving — themselves stood in the way of a lasting and prudent consensus in the conduct of American politics.

In recent years, the phrase “American exceptionalism,” at once resonant and ambiguous, has stolen into popular usage in electoral politics, in the mainstream media, and in academic writing with a profligacy that is hard to account for. It sometimes seems that exceptionalism for Americans means everything from generosity to selfishness, localism to imperialism, indifference to “the opinions of mankind” to a readiness to incorporate the folkways of every culture. When President Obama told West Point graduates last May that “I believe in American exceptionalism with every fiber of my being,” the context made it clear that he meant the United States was the greatest country in the world: our stature was demonstrated by our possession of “the finest fighting force that the world has ever known,” uniquely tasked with defending liberty and peace globally; and yet we could not allow ourselves to “flout international norms” or be a law unto ourselves. The contradictory nature of these statements would have satisfied even Tocqueville’s taste for paradox.

On the whole, is American exceptionalism a force for good? The question shouldn’t be hard to answer. To make an exception of yourself is as immoral a proceeding for a nation as it is for an individual. When we say of a person (usually someone who has gone off the rails), “He thinks the rules don’t apply to him,” we mean that he is a danger to others and perhaps to himself. People who act on such a belief don’t as a rule examine themselves deeply or write a history of the self to justify their understanding that they are unique. Very little effort is involved in their willfulness. Such exceptionalism, indeed, comes from an excess of will unaccompanied by awareness of the necessity for self-restraint.

Such people are monsters. Many land in asylums, more in prisons. But the category also encompasses a large number of high-functioning autistics: governors, generals, corporate heads, owners of professional sports teams. When you think about it, some of these people do write histories of themselves and in that pursuit, a few of them have kept up the vitality of an ancient genre: criminal autobiography.

All nations, by contrast, write their own histories as a matter of course. They preserve and exhibit a record of their doings; normally, of justified conduct, actions worthy of celebration. “Exceptional” nations, therefore, are compelled to engage in some fancy bookkeeping which exceptional individuals can avoid — at least until they are put on trial or subjected to interrogation under oath. The exceptional nation will claim that it is not responsible for its exceptional character. Its nature was given by God, or History, or Destiny.

An external and semi-miraculous instrumentality is invoked to explain the prodigy whose essence defies mere scientific understanding. To support the belief in the nation’s exceptional character, synonyms and variants of the word “providence” often get slotted in.  That word gained its utility at the end of the seventeenth century — the start of the epoch of nations formed in Europe by a supposed covenant or compact. Providence splits the difference between the accidents of fortune and purposeful design; it says that God is on your side without having the bad manners to pronounce His name.

Why is it immoral for a person to treat himself as an exception? The reason is plain: because morality, by definition, means a standard of right and wrong that applies to all persons without exception. Yet to answer so briefly may be to oversimplify. For at least three separate meanings are in play when it comes to exceptionalism, with a different apology backing each. The glamour that surrounds the idea owes something to confusion among these possible senses.

First, a nation is thought to be exceptional by its very nature. It is so consistently worthy that a unique goodness shines through all its works. Who would hesitate to admire the acts of such a country? What foreigner would not wish to belong to it? Once we are held captive by this picture, “my country right or wrong” becomes a proper sentiment and not a wild effusion of prejudice, because we cannot conceive of the nation being wrong.

A second meaning of exceptional may seem more open to rational scrutiny. Here, the nation is supposed to be admirable by reason of history and circumstance. It has demonstrated its exceptional quality by adherence to ideals which are peculiar to its original character and honorable as part of a greater human inheritance. Not “my country right or wrong” but “my country, good and getting better” seems to be the standard here. The promise of what the country could turn out to be supports this faith. Its moral and political virtue is perceived as a historical deposit with a rich residue in the present.

A third version of exceptionalism derives from our usual affectionate feelings about living in a community on the scale of a neighborhood or township, an ethnic group or religious sect. Communitarian nationalism takes the innocent-seeming step of generalizing that sentiment to the nation at large. My country is exceptional to me (according to this view) just because it is mine. Its familiar habits and customs have shaped the way I think and feel; nor do I have the slightest wish to extricate myself from its demands. The nation, then, is like a gigantic family, and we owe it what we owe to the members of our family: “unconditional love.” This sounds like the common sense of ordinary feelings. How can our nation help being exceptional to us?

Teacher of the World

Athens was just such an exceptional nation, or city-state, as Pericles described it in his celebrated oration for the first fallen soldiers in the Peloponnesian War. He meant his description of Athens to carry both normative force and hortatory urgency. It is, he says, the greatest of Greek cities, and this quality is shown by its works, shining deeds, the structure of its government, and the character of its citizens, who are themselves creations of the city. At the same time, Pericles was saying to the widows and children of the war dead: Resemble them! Seek to deserve the name of Athenian as they have deserved it!

The oration, recounted by Thucydides in the History of the Peloponnesian War, begins by praising the ancestors of Athenian democracy who by their exertions have made the city exceptional. “They dwelt in the country without break in the succession from generation to generation, and handed it down free to the present time by their valor.” Yet we who are alive today, Pericles says, have added to that inheritance; and he goes on to praise the constitution of the city, which “does not copy the laws of neighboring states; we are rather a pattern to others than imitators ourselves.”

The foreshadowing here of American exceptionalism is uncanny and the anticipation of our own predicament continues as the speech proceeds. “In our enterprises we present the singular spectacle of daring and deliberation, each carried to its highest point, and both united in the same persons… As a city we are the school of Hellas” — by which Pericles means that no representative citizen or soldier of another city could possibly be as resourceful as an Athenian. This city, alone among all the others, is greater than her reputation.

We Athenians, he adds, choose to risk our lives by perpetually carrying a difficult burden, rather than submitting to the will of another state. Our readiness to die for the city is the proof of our greatness. Turning to the surviving families of the dead, he admonishes and exalts them: “You must yourselves realize the power of Athens,” he tells the widows and children, “and feed your eyes upon her from day to day, till love of her fills your hearts; and then when all her greatness shall break upon you, you must reflect that it was by courage, sense of duty, and a keen feeling of honor in action that men were enabled to win all this.” So stirring are their deeds that the memory of their greatness is written in the hearts of men in faraway lands: “For heroes have the whole earth for their tomb.”

Athenian exceptionalism at its height, as the words of Pericles indicate, took deeds of war as proof of the worthiness of all that the city achieved apart from war. In this way, Athens was placed beyond comparison: nobody who knew it and knew other cities could fail to recognize its exceptional nature. This was not only a judgment inferred from evidence but an overwhelming sensation that carried conviction with it. The greatness of the city ought to be experienced, Pericles imagines, as a vision that “shall break upon you.”

Guilty Past, Innocent Future

To come closer to twenty-first-century America, consider how, in the Gettysburg Address, Abraham Lincoln gave an exceptional turn to an ambiguous past. Unlike Pericles, he was speaking in the midst of a civil war, not a war between rival states, and this partly explains the note of self-doubt that we may detect in Lincoln when we compare the two speeches. At Gettysburg, Lincoln said that a pledge by the country as a whole had been embodied in a single document, the Declaration of Independence. He took the Declaration as his touchstone, rather than the Constitution, for a reason he spoke of elsewhere: the latter document had been freighted with compromise. The Declaration of Independence uniquely laid down principles that might over time allow the idealism of the founders to be realized.

Athens, for Pericles, was what Athens always had been. The Union, for Lincoln, was what it had yet to become. He associated the greatness of past intentions — “We hold these truths to be self-evident” — with the resolve he hoped his listeners would carry out in the present moment: “It is [not for the noble dead but] rather for us to be here dedicated to the great task remaining before us — that from these honored dead we take increased devotion to that cause for which they gave the last full measure of devotion — that we here highly resolve that these dead shall not have died in vain — that this nation, under God, shall have a new birth of freedom.”

This allegorical language needs translation. In the future, Lincoln is saying, there will be a popular government and a political society based on the principle of free labor. Before that can happen, however, slavery must be brought to an end by carrying the country’s resolution into practice. So Lincoln asks his listeners to love their country for what it may become, not what it is. Their self-sacrifice on behalf of a possible future will serve as proof of national greatness. He does not hide the stain of slavery that marred the Constitution; the imperfection of the founders is confessed between the lines.  But the logic of the speech implies, by a trick of grammar and perspective, that the Union was always pointed in the direction of the Civil War that would make it free.

Notice that Pericles’s argument for the exceptional city has here been reversed. The future is not guaranteed by the greatness of the past; rather, the tarnished virtue of the past will be scoured clean by the purity of the future.  Exceptional in its reliance on slavery, the state established by the first American Revolution is thus to be redeemed by the second. Through the sacrifice of nameless thousands, the nation will defeat slavery and justify its fame as the truly exceptional country its founders wished it to be.

Most Americans are moved (without quite knowing why) by the opening words of the Gettysburg Address: “Four score and seven years ago our fathers…” Four score and seven is a biblical marker of the life of one person, and the words ask us to wonder whether our nation, a radical experiment based on a radical “proposition,” can last longer than a single life-span. The effect is provocative. Yet the backbone of Lincoln’s argument would have stood out more clearly if the speech had instead begun: “Two years from now, perhaps three, our country will see a great transformation.” The truth is that the year of the birth of the nation had no logical relationship to the year of the “new birth of freedom.” An exceptional character, however, whether in history or story, demands an exceptional plot; so the speech commences with deliberately archaic language to ask its implicit question: Can we Americans survive today and become the school of modern democracy, much as Athens was the school of Hellas?

The Ties That Bind and Absolve

To believe that our nation has always been exceptional, as Pericles said Athens was, or that it will soon justify such a claim, as Lincoln suggested America would do, requires a suppression of ordinary skepticism. The belief itself calls for extraordinary arrogance or extraordinary hope in the believer. In our time, exceptionalism has been made less exacting by an appeal to national feeling based on the smallest and most vivid community that most people know: the family.  Governor Mario Cuomo of New York, in his keynote address at the 1984 Democratic convention, put this straightforwardly. America, said Cuomo, was like a family, and a good family never loses its concern for the least fortunate of its members. In 2011, President Obama, acceding to Republican calls for austerity that led to the sequestration of government funds, told us that the national economy was just like a household budget and every family knows that it must pay its bills.

To take seriously the metaphor of the nation-as-family may lead to a sense of sentimental obligation or prudential worry on behalf of our fellow citizens. But many people think we should pursue the analogy further. If our nation does wrong, they say, we must treat it as an error and not a crime because, after all, we owe our nation unconditional love. Yet here the metaphor betrays our thinking into a false equation. A family has nested us, cradled us, nursed us from infancy, as we have perhaps done for later generations of the same family; and it has done so in a sense that is far more intimate than the sense in which a nation has fostered or nurtured us. We know our family with an individuated depth and authority that can’t be brought to our idea of a nation. This may be a difference of kind, or a difference of degree, but the difference is certainly great.

A subtle deception is involved in the analogy between nation and family; and an illicit transfer of feelings comes with the appeal to “unconditional love.” What do we mean by unconditional love, even at the level of the family? Suppose my delinquent child robs and beats an old man on a city street, and I learn of it by his own confession or by accident. What exactly do I owe him?

Unconditional love, in this setting, surely means that I can’t stop caring about my child; that I will regard his terrible action as an aberration. I will be bound to think about the act and actor quite differently from the way I would think about anyone else who committed such a crime. But does unconditional love also require that I make excuses for him? Shall I pay a lawyer to get him off the hook and back on the streets as soon as possible? Is it my duty to conceal what he has done, if there is a chance of keeping it secret? Must I never say what he did in the company of strangers or outside the family circle?

At a national level, the doctrine of exceptionalism as unconditional love encourages habits of suppression and euphemism that sink deep roots in the common culture. We have seen the result in America in the years since 2001. In the grip of this doctrine, torture has become “enhanced interrogation”; wars of aggression have become wars for democracy; a distant likely enemy has become an “imminent threat” whose very existence justifies an executive order to kill. These are permitted and officially sanctioned forms of collective dishonesty. They begin in quasi-familial piety, they pass through the systematic distortion of language, and they end in the corruption of consciousness.

The commandment to “keep it in the family” is a symptom of that corruption. It follows that one must never speak critically of one’s country in the hearing of other nations or write against its policies in foreign newspapers. No matter how vicious and wrong the conduct of a member of the family may be, one must assume his good intentions. This ideology abets raw self-interest in justifying many actions by which the United States has revealingly made an exception of itself — for example, our refusal to participate in the International Criminal Court. The community of nations, we declared, was not situated to understand the true extent of our constabulary responsibilities. American actions come under a different standard and we are the only qualified judges of our own cause.

The doctrine of the national family may be a less fertile source of belligerent pride than “my country right or wrong.” It may be less grandiose, too, than the exceptionalism that asks us to love our country for ideals that have never properly been translated into practice. And yet, in this appeal to the family, one finds the same renunciation of moral knowledge — a renunciation that, if followed, would render inconceivable any social order beyond that of the family and its extension, the tribe.

Unconditional love of our country is the counterpart of unconditional detachment and even hostility toward other countries. None of us is an exception, and no nation is. The sooner we come to live with this truth as a mundane reality without exceptions, the more grateful other nations will be to live in a world that includes us, among others.

David Bromwich teaches English at Yale University. A TomDispatch regular, he is the author most recently of Moral Imagination and The Intellectual Life of Edmund Burke: From the Sublime and Beautiful to American Independence.

Read Full Post »

The Civil War’s Most Famous Clown

David Carlyon

The New York Times September 18, 2014

A clown ran for public office – and no, that’s not the beginning of a joke. On Sept. 15, 1864, America’s most famous circus clown, Dan Rice, accepted the Democratic nomination for the Pennsylvania State Senate. And it was just his first foray into politics: Even while continuing his career as a clown, a state convention later considered him as a candidate for Congress, and, in 1867, he made a brief but legitimate run for president.

Dan Rice, ca. 1870. Credit: David Carlyon

While the idea of a clown running for office sounds like a gimmick, in the 1860s it was taken seriously — because circus itself was taken seriously, as adult fare. Long before it was relegated to children’s entertainment, early circus in this country combined what appealed to grown-up tastes: sex, violence, political commentary and, in a horse-based culture, top-notch horsemanship. George Washington attended the first circus in 1793 in Philadelphia not for family-friendly amusement — a notion that didn’t emerge until the 1880s — but as a horseman keen to see animals and humans working together at a peak level.

Sex and violence enhanced the appeal. Like later burlesque comedians, talking clowns told dirty jokes in a titillating whirl of the scantily clad: Circus acrobats and riders showed more skin — or flesh-colored fabric that seemed to be skin — than could be seen anywhere else in public life.

Walt Whitman approved. Reviewing a circus in 1856 in Brooklyn, he wrote: “It can do no harm to boys to see a set of limbs display all their agility.” (In a favorite mind-plus-body theme, Whitman added: “A circus performer is the other half of a college professor. The perfect Man has more than the professor’s brain, and a good deal of the performer’s legs.”) Meanwhile, fights were a daily occurrence, drawing attention the way fights at soccer matches do now. Violence was so common that Rice’s journal from 1856 noted the rare days when no fight occurred.

And while nostalgia portrays early circus as small and quaint, antebellum tents were some of the largest structures on the continent, seating thousands, while over the winter, circuses played major city theaters.

Dan Rice stood in the center of this lively public arena. Born in New York City in 1823, he burst onto the circus scene in the 1840s with a lightning-quick wit and sharp topical instincts that made him a national favorite. Proclaiming himself “the Great American Humorist,” he combined ad-libs, jokes ancient and new, sexual allusions, comic and sentimental songs, clever parodies of Shakespeare and quips on current events. (He did little physical comedy, which was the specialty of knockabout clowns and acrobats.)

Scholars believe that Mark Twain, who later adopted that Great American Humorist label, used Rice as his model for the clown described in “Huckleberry Finn,” “carrying on so it most killed the people,” as “quick as a wink with the funniest things a body ever said.” Though obscure when he died in 1900, Rice had probably been seen by more Americans than any other public figure. Nor was his renown restricted to the United States: Imitators in England and Germany appropriated his famous name in their own acts.

As the country tumbled toward war, Rice expanded his “hits on the times.” Instead of Bozo, think Jon Stewart or Rush Limbaugh. Or Robin Williams, who shared the same quick wit, verbal virtuosity, and sharp political humor. (In fact, Williams toyed with the idea of playing Rice in a movie.) Rice’s expanded approach extended to his costumes, as he alternated between traditional clown garb decorated in stripes and stars, and a new look of tailcoat, vest, and pants, the Great American Humorist as respectable gentleman, a man with serious opinions on the events of the day.

Once the Civil War erupted, Rice pushed directly into politics, a Peace Democrat condemning Abraham Lincoln and “Black Republicans” from the circus ring. By 1864, it was a natural step for the Democrats of Erie, Pa., near his winter quarters in Girard, to choose the nationally prominent “Col. Dan Rice” as their candidate for the state senate. (The title was self-granted, matching the times’ martial mood.)

Writing from his tour on Sept. 15 to accept the nomination, Rice denied that he worshipped “at the shrine of any political dogma,” but did declare that his “proclivities were formerly with the Whigs.” He condemned Lincoln for violating the Constitution and creating an imperial presidency. Rice wrote: “When I see the great principles of personal liberty and the rights of property being cloven down by the men now running the machine of Government, ‘the ancient landmarks’ of the Constitution ‘which our fathers set’ removed, I feel like crying, in the language of the Holy Writ, ‘cursed be he that removeth them.’”

Historians, adopting the later family-friendly image of circus, assumed that a clown’s campaign for office had to be a publicity stunt. But Rice’s nomination was no joke. Chicago newspapers took it seriously: On Sept. 23, the Republican Tribune opened a two-day attack in its headline, “Dan Rice and Disloyalty.” It complained that Rice filled “his ring talk with disloyal utterances and flings at Lincoln and the war. A trimmer so cautious as this personage who once, it is said, actually gave a performance under the confederate flag, should understand that this style of thing will not pay in loyal communities.” (The “Confederate flag” jab was political spin, because Rice presented his circus in New Orleans when Louisiana seceded.)

Next the Tribune claimed that no one laughed at Rice’s “quips and pasquinades persistently leveled at the President, the war, the government, and the anti-slavery sentiment of the north.” That Rice could make these jokes and still attract customers is another indication that late into 1864, discontent about the war remained strong. The Tribune, in an allusion to Southern sympathizers known as Copperheads, concluded by urging the press on his route to guard that his jokes did not “resemble a certain kind of soda — ‘drawn from copper.’” (Rice, visiting his friend Morrison Foster, Stephen Foster’s brother, apparently met the notorious Copperhead Clement Vallandigham there.)

Even as criticism of abolitionists continued, the crucible of war was burning away belief that the nation’s “peculiar institution” of slavery was acceptable. And as the country changed, so did Rice. In a July 4 speech in Elmira, N.Y., he had declared that blacks “are God’s creatures, and shouldn’t belong to Jeff. Davis, or any other man,” for they “were not made for southern planters to vote on, nor northern fanatics to dote on.” He added a folksy variation on Lincoln’s theme of equality: “Let every tub stand on its own bottom.”

Rice ran an abbreviated campaign. He was still a businessman with a show to troupe. He also knew he faced an uphill battle, running against a Republican incumbent, Morrow Lowry, in a heavily Republican district. Whatever advantage his national renown gave him was offset by the leading families of Girard, who harbored the distaste of small-town gentry for “the show business.” That distaste increased when Rice married into one of those families over their objections, to a woman the same age as his daughters.

A trading card advertising Dan Rice’s circus in 1873. Credit: David Carlyon

Despite such handicaps, in November Rice ran ahead of the Democratic ticket. He attracted 40 percent of the district’s vote, while the presidential candidate Gen. George McClellan got only 36 percent.

Later, like others who had criticized the war, Rice sought to shore up his reputation for patriotism. In 1865 in Girard he erected what was said to be the first Civil War monument, with a ceremony featured on the front page of the Nov. 25 Harper’s Weekly.

He also began peddling a claim that he’d been Abraham Lincoln’s pal, dropping by the White House to cheer up war-weary Abe and advise him on the mood of the country. Blatantly false, the tale thrived thanks to Rice’s national stature and the postwar urge to paper over the bitter divide of the war. The Lincoln fiction survived intact into the 20th century, as a bit of trivia about the president, because it fit a new sentimentality about clowns as sweetly innocuous. It was easier to believe in a clown consoling Lincoln than one attacking him as a tyrant.

Another claim, though one that Rice didn’t make himself, said he’d been the model for Uncle Sam. At first glance it’s unlikely. Thomas Nast, the cartoonist who completed the evolution of that image to the icon we know today, was a fervent Republican who wouldn’t have knowingly based anything on a fervent Democrat like Rice. But it wouldn’t have been unusual to be unconsciously influenced by one of the most famous Americans of the era. In any case Nast drew a cartoon that echoed Rice perfectly, combining the famous clown’s democratic irreverence, his trademark goatee, the top hat he often wore, and a mash-up of his two primary costumes, a clown’s stars and stripes and the fancy wardrobe of a middle-class gentleman. If anyone could be said to have been the model for Uncle Sam, it was Dan Rice, circus clown and political candidate.

Follow Disunion at twitter.com/NYTcivilwar or join us on Facebook.


Sources: David Carlyon, “Dan Rice: The Most Famous Man You’ve Never Heard Of “ and Carlyon, “Twain’s ‘Stretcher’: The Circus Shapes Huckleberry Finn,” South Atlantic Review, 72.4 (Fall 2007); Dan Rice, “Fourth of July Oration,” “Dan Rice’s Songs, Sentiments, Jests, and Stories”; Walter A. McDougall, “Throes of Democracy: The American Civil War Era: 1829-1877.”


David Carlyon is the author of “Dan Rice: The Most Famous Man You’ve Never Heard Of.”

Read Full Post »

Transcripts Kept Secret for 60 Years Bolster Defense of Oppenheimer’s Loyalty

William J. Broad

The New York Times   October 11, 2014

A detonation over the Marshall Islands in 1952 was the first test of a hydrogen bomb. Credit: Underwood Archives, via Getty Images

At the height of the McCarthy era, J. Robert Oppenheimer, the government’s top atomic physicist, came under suspicion as a Soviet spy.

After 19 days of secret hearings in April and May of 1954, the Atomic Energy Commission revoked his security clearance. The action brought his career to a humiliating close, and Oppenheimer, until then a hero of American science, lived out his life a broken man.

But now, hundreds of newly declassified pages from the hearings suggest that Oppenheimer was anything but disloyal.

Historians and nuclear experts who have studied the declassified material — roughly a tenth of the hearing transcripts — say that it offers no damning evidence against him, and that the testimony that has been kept secret all these years tends to exonerate him.

“It’s hard to see why it was classified,” Richard Polenberg, a historian at Cornell University who edited a much earlier, sanitized version of the hearings, said in an interview. “It’s hard to see a principle here — except that some of the testimony was sympathetic to Oppenheimer, some of it very sympathetic.”

J. Robert Oppenheimer. Credit: Associated Press

A crucial element in the case against Oppenheimer derived from his resistance to early work on the hydrogen bomb. The physicist Edward Teller, who long advocated a crash program to devise such a weapon, told the hearing that he mistrusted Oppenheimer’s judgment, testifying, “I would feel personally more secure if public matters would rest in other hands.”

But the declassified material, released Oct. 3 by the Energy Department, suggests that Oppenheimer opposed the hydrogen bomb project on technical and military grounds, not out of Soviet sympathies.

Richard Rhodes, author of the 1995 book “Dark Sun: The Making of the Hydrogen Bomb,” said the records showed that making fuel to test one of Teller’s early H-bomb ideas would have forced the nation to forgo up to 80 atomic bombs.

“Oppenheimer was worried about war on the ground in Europe,” Mr. Rhodes said in an interview. He saw the need for “a large stockpile of fission weapons that could be used to turn back a Soviet ground assault.”

The formerly secret testimony “was immensely relevant to Oppenheimer’s opposition,” he said, adding, “There’s a lot here for historians to digest.”

Robert S. Norris, a senior fellow at the Federation of American Scientists and the author of “Racing for the Bomb,” a biography of Lt. Gen. Leslie R. Groves, the military leader of the World War II project to develop the atomic bomb, said a reading of the formerly secret testimony showed it had little or nothing to do with national security.

“In many cases, they deleted material that was embarrassing,” he said in an interview. “That’s pretty obvious.”

The Energy Department, a successor to the Atomic Energy Commission, offered no public analysis of the 19 volumes and no explanation for why it was releasing the material now. It did, however, note that the step took 60 years. Sidestepping questions of guilt or innocence, it referred to the 1954 hearing as a federal assessment of Oppenheimer “as a possible security risk.”

Steven Aftergood, director of the Federation of American Scientists’ project on government secrecy, called the release “long overdue” and added, “It lifts the last remaining cloud from the subject.”

Priscilla McMillan, an atomic historian at Harvard and author of “The Ruin of J. Robert Oppenheimer,” applauded the release but also expressed bafflement at its having taken six decades, saying her own research suggested that the transcripts held “zero classified data.”

An eccentric genius fond of pipes and porkpie hats, Oppenheimer grew up in an elegant building on Riverside Drive in Manhattan, attended the Ethical Culture School and graduated from Harvard in three years. After studies in Europe, he taught physics at the University of California, Berkeley.

As a young professor, he crashed his car while racing a train, leaving his girlfriend unconscious. His father gave the young woman a painting and a Cézanne drawing.

In the 1930s, like many liberals, Oppenheimer belonged to groups led or infiltrated by Communists; his brother, his wife and his former fiancée were party members.

The physicist Edward Teller. In secret hearings in 1954, Teller said he did not trust Oppenheimer’s judgment. Credit: Associated Press

In the 1940s at Los Alamos in New Mexico, in great secrecy, he led the scientific effort that invented the atomic bomb. Afterward, as chairman of the Atomic Energy Commission’s main advisory body, he helped direct the nation’s postwar nuclear developments.

Oppenheimer’s downfall came amid Cold War fears over Soviet strides in atomic weaponry and Communist subversion at home. In 1953, a former congressional aide charged in a letter to the Federal Bureau of Investigation that the celebrated physicist was a Soviet spy.

Troubled by the allegation, President Dwight D. Eisenhower ordered “a blank wall” erected between Oppenheimer and any nuclear secrets.

No evidence came to light that supported the spy charge. But the security board found that Oppenheimer’s early views on the hydrogen bomb “had an adverse effect on recruitment of scientists and the progress of the scientific effort.” He died in 1967, at 62.

Experts who have looked at the declassified transcripts say they cast startling new light on the Oppenheimer case. Dr. Polenberg of Cornell, for example, expressed bewilderment that 12 pages of testimony from Lee A. DuBridge, a friend and colleague of Oppenheimer’s who discussed the atomic trade-offs and the European war situation, had remained secret for 60 years.

“A difference of opinion doesn’t mean disloyalty,” he said. “It’s hard to see why it was redacted.”

Dr. Polenberg also pointed to 45 pages of declassified testimony from Walter G. Whitman, an M.I.T. engineer and member of the Atomic Energy Commission’s advisory body. “In my judgment,” Mr. Whitman said of Oppenheimer, “his advice and his arguments for a gamut of atomic weapons, extending even over to the use of the atomic weapon in air defense of the United States, has been more productive than any other one individual.”

Asked his opinion of Oppenheimer as a security risk, he called him “completely loyal.”

Alex Wellerstein, an atomic expert at the Stevens Institute of Technology, said in a comment on the secrecy blog of the Federation of American Scientists that years ago he had asked the government to declassify the secret Oppenheimer testimony.

The department’s public silence on his request, he said, made the unveiling look like “the result of an internal interest in the files rather than prodding from an outside historian.”

A few of the declassifications cast new light on what were already famous moments in Oppenheimer’s downfall.

Isidor I. Rabi, a Nobel laureate and veteran of the Manhattan Project who staunchly defended the beleaguered physicist, told atomic investigators that he found the hearing “most unfortunate” given what “Dr. Oppenheimer has accomplished.”

The restored transcript adds a deleted phrase in which Dr. Rabi mentioned the hydrogen bomb, then also known as the Super. It underscored the depth of his fury.

“We have an A-bomb,” he told the hearing, as well as “a whole series of Super bombs.” He added: “What more do you want, mermaids?”

Read Full Post »

Here Is What Can Make a Difference in Race Relations – And It Happened in Major League Baseball Decades Ago

Michael H. Ebner

HNN  October 12, 2014

 

The obituary a few weeks ago of a former major league baseball player – George Shuba – has furnished a useful lesson about race.

Shuba, a journeyman outfielder, played for six years with the Brooklyn Dodgers (1948-1950 and 1951-1955). He never appeared in more than one hundred games in a season, although he did have the distinction of playing in the World Series of 1952, 1953, and 1955. Brooklyn won its first world championship in the last of those years. On the field Shuba is best remembered as a dependable pinch hitter – lifetime batting average of .259, with twenty-five home runs (one of them against the Yankees in the World Series of 1953) – for a team that was regularly in contention for the National League pennant.

Largely forgotten until the publication of his obituary, Shuba is now celebrated for breaking an interracial taboo. He did so by extending his hand to congratulate teammate Jackie Robinson, who had just hit a home run for the Montreal Royals, a minor league team in the International League. The late Jules Tygiel, a peerless researcher, made no mention of the handshake. Arnold Rampersad, in his biography of Robinson, mentions it – and includes a photograph of it – but does not make much of the incident.

Next we turn to the obituary of Steve Gromek (1920-2002), a pitcher for the Cleveland Indians and later the Detroit Tigers. Over a seventeen-year career (1941-1957), he compiled a respectable win-loss record of 128-108. He won nineteen games during 1951 and eighteen in 1954. Gromek also won a World Series game in 1948, filling in for baseball legend Bob Feller who required an extra day of rest.

When Jackie Robinson arrived in the major leagues in 1947, he experienced the sting of racial hostility. A handful of his Brooklyn Dodger teammates unsuccessfully sought to prevail on the management of the Dodgers to drop Robinson from the roster. Pee Wee Reese – the team’s captain and shortstop, later elected to the Baseball Hall of Fame – refused to sign a petition circulated among teammates opposing Robinson’s presence on the roster. The ringleader of that petition effort, Dixie Walker, was ultimately traded away by the Dodgers.

When the Dodgers played in Cincinnati a fan hurled vicious epithets at Robinson. Reese – a native Kentuckian – quietly walked across the infield to Robinson and gently placed his arm on his teammate’s shoulder. The hecklers ceased. While this moment remains much remembered, no known image exists.

This brings us to Larry Doby, the first African American to play in the American League. The young outfielder hit a key home run in the World Series of 1948, securing Gromek’s winning pitching effort. Afterwards Gromek enthusiastically hugged Doby in the clubhouse, an image that made its way into newspapers. Margaret Mackenzie wrote about the episode for the Pittsburgh Courier, a widely read African American newspaper: “That picture of Gromek and Doby has unmistakable flesh and blood cheeks pressed close together, brawny arms tightly clasped, equally wide grins. The chief message of the Doby-Gromek picture is acceptance.”

Years later Gromek, a native of Hamtramck, Michigan – a largely white working-class city surrounded by Detroit – experienced ostracism for the embrace but quickly shrugged it off. Today the Gromek-Doby embrace remains an iconic image in the history of American race relations.

These episodes – each of them situated in the immediate aftermath of World War II – reflect changing racial sensibilities. The Swedish social scientist Gunnar Myrdal, in his landmark book An American Dilemma (1944), anticipated the shifting tableau of race relations in postwar American culture. What is remarkable is that major league baseball – its games played before crowds numbering in the tens of thousands – served as an agent of social change. The re-integration of professional baseball occurred seven years in advance of Brown v. Board of Education.

Michael H. Ebner is professor emeritus of American history at Lake Forest College. He can be reached at ebner@mx.lakeforest.edu.

Read Full Post »

How Even President Obama Gets U.S. History Wrong: We Weren’t a Colonial Power?

by Roxanne Dunbar-Ortiz

HNN October 10, 2014

In a 2009 interview with Al Arabiya Television in Dubai, soon after his first inauguration, President Barack Obama affirmed that the U.S. government could be an honest broker in the Israeli-Palestinian conflict, saying, “We sometimes make mistakes. We have not been perfect. But if you look at the track record, as you say, America was not born as a colonial power.”

One has to query the president: How did the United States begin with thirteen small colonies/states hugging the Atlantic seaboard and end up in the mid-twentieth century with fifty states over much of North America, and a number of island colonies in the Pacific and the Caribbean? Apparently, it was manifest destiny at work.

According to the centuries-old Doctrine of Discovery, European nations acquired title to the lands they “discovered,” and Indigenous inhabitants lost their natural right to that land after Europeans had arrived and claimed it. Under this legal cover for theft, European wars of conquest and domination – and, in some cases, such as the United States, settler colonial states – devastated Indigenous nations and communities, ripping their territories away from them and transforming the land into private property. Most of the land appropriated by the United States ended up in the hands of land speculators and agribusiness operators; much of it, up to the mid-nineteenth century, was worked as plantations by another form of private property, enslaved Africans.

Arcane as it may seem, the Doctrine of Discovery remains the basis for federal laws still in effect that control Indigenous peoples’ lives and destinies, even their histories by distorting them.

From the mid-fifteenth century to the mid-twentieth century, most of the non-European world was colonized under the Doctrine of Discovery, one of the first principles of international law Christian European monarchies promulgated to legitimize investigating, mapping, and claiming lands belonging to peoples outside Europe. It originated in a papal bull issued in 1455 that permitted the Portuguese monarchy to seize West Africa. Following Columbus’s infamous exploratory voyage in 1492, sponsored by the king and queen of the infant Spanish state, another papal bull extended similar permission to Spain. Disputes between the Portuguese and Spanish monarchies led to the papal-initiated Treaty of Tordesillas (1494), which, besides dividing the globe equally between the two Iberian empires, clarified that only non-Christian lands fell under the discovery doctrine.

This doctrine, on which all European states and the United States relied, thus originated with the arbitrary and unilateral establishment of the Iberian monarchies’ exclusive rights under Christian canon law to colonize foreign peoples, and this right was later seized by other European monarchical colonizing projects. The French Republic used this legalistic instrument for its nineteenth- and twentieth-century settler colonialist projects, as did the newly independent United States when it continued the colonization of North America begun by the British.

In 1792, not long after the founding of the United States, Secretary of State Thomas Jefferson claimed that the Doctrine of Discovery developed by European states was international law applicable to the new U.S. government as well. In 1823 the U.S. Supreme Court issued its decision in Johnson v. McIntosh. Writing for the majority, Chief Justice John Marshall held that the Doctrine of Discovery had been an established principle of European law and of English law in effect in Britain’s North American colonies and was also the law of the United States. The Court defined the exclusive property rights that a European country acquired by dint of discovery: “Discovery gave title to the government, by whose subjects, or by whose authority, it was made, against all other European governments, which title might be consummated by possession.” Therefore, European and Euro-American “discoverers” had gained real-property rights in the lands of Indigenous peoples by merely planting a flag. Indigenous rights were, in the Court’s words, “in no instance, entirely disregarded; but were necessarily, to a considerable extent, impaired.” The Court further held that Indigenous “rights to complete sovereignty, as independent nations, were necessarily diminished.” Indigenous people could continue to live on the land, but title resided with the discovering power, the United States. The decision concluded that Native nations were “domestic, dependent nations.”

In fact, Indigenous peoples were not allowed to continue living on their land under Andrew Jackson’s presidency; with the Indian Removal Act that he pushed through Congress, all the Indigenous nations east of the Mississippi were dissolved and their citizens were forcibly relocated to “Indian Territory,” which itself was later dissolved to become a part of the state of Oklahoma.

The Doctrine of Discovery is so taken for granted that it is rarely mentioned in historical or legal texts published in the Americas.

In the era of global decolonization of the second half of the 20th century, Native Americans remained colonized. The official celebration of Columbus is a metaphor and painful symbol of that traumatic past, although the United States did not become an independent republic until nearly three centuries after Columbus’s first voyage. None of Columbus’s voyages touched the continental territory now claimed by the United States.

Native American nations and communities are engaged in decolonization projects, including the development of international human rights law to secure their right to self-determination as Indigenous Peoples, an effort that won the United Nations’ 2007 Declaration on the Rights of Indigenous Peoples, which the Obama administration endorsed. It’s time for the United States government to make a gesture toward acknowledging its colonial past and committing to decolonization. Doing away with the celebration of Columbus, the very face of the onset of colonialism in the Western Hemisphere, could be that gesture. In its place, that fateful date marking the onset of colonialism could be proclaimed a Day of Solidarity and Mourning with the Indigenous Peoples. Retiring Columbus also requires nullifying the Doctrine of Discovery.

The affirmation of democracy requires the denial of colonialism, but denying it does not make it go away. Only decolonization can do that.

Roxanne Dunbar-Ortiz grew up in rural Oklahoma, the daughter of a tenant farmer and part-Indian mother. She has been active in the international Indigenous movement for more than four decades and is known for her lifelong commitment to national and international social justice issues. After receiving her PhD in history at the University of California at Los Angeles, she taught in the newly established Native American Studies Program at California State University, Hayward, and helped found the Departments of Ethnic Studies and Women’s Studies. Her latest book is “An Indigenous Peoples’ History of the United States.”

Read Full Post »

“1898,” McGee, and Progressive Imperialism

José Anazagasty Rodríguez

 

80 grados, October 3, 2014


William J. McGee

The Progressive Era was a period of U.S. history, stretching from the last decade of the nineteenth century through the first two decades of the twentieth, driven by a reformist social movement that achieved a range of social, political, economic, and environmental reforms. This movement made the pursuit of progress its central concern. It was a liberal progressivism, critical of the Gilded Age, that nonetheless never strayed far from the conservative pole of liberalism.

Progressivism, indeterminate as it was, defies any attempt at definition, for it was a heterogeneous, dynamic, and complex movement. It was also a movement that in many ways tried to reconcile opposing tendencies: between the new and the old, the individual and society, scientific rationality and the logic of Protestant Christianity, the promotion of economic growth and the excesses of capitalist development, among other tensions. Even so, many progressives, committed to modernization, tenaciously defended and promoted scientific rationality, demanding efficiency and supporting technocratic intervention in social ordering and control. Some favored state intervention to guarantee economic growth that was efficient yet sensible, opposing monopolies and corporate excess. But the movement also supported the territorial expansion of the United States and its entry into imperialist circles at the end of the nineteenth century.

The origin of the Progressive Era coincided with the genesis of the hemispheric phase of American imperialism. It was in the first years of the Progressive Era that the United States launched itself as an imperialist power, acquiring in 1898 a direct transcontinental empire that included several islands. Yet the connections between progressivism and American imperialism are rarely highlighted by students of U.S. imperial history. Among American historians and other scholars of that nation, an orthodox and dogmatic interpretation prevails that imagines progressivism and imperialism as incompatible. Contrary to this thesis, however, and as William E. Leuchtenburg demonstrated, most progressives favored imperialism, some more than others. Moreover, the ideological content of progressivism and imperialism often coincided, a content also palpable in several American colonial policies. A good example was the 500-Acre Law, imposed on Puerto Rico by the American military-colonial administration, which was grounded in the progressive call to regulate monopolies, an effort embodied in the so-called antitrust laws. Another good example was the management of natural resources in the colonies, such as the rational, scientific ordering of Puerto Rican forests through forestry and silviculture during the Progressive Era, practices associated with Gifford Pinchot, the well-known progressive conservationist. Indeed, the conservationism of the period allows us to examine some of the parallels between the ideological content of progressivism and that of imperialism.

In what follows, through a reading of an essay William J. McGee published in National Geographic Magazine in 1898, before he served as a government official under Theodore Roosevelt, I aim to reveal some aspects of that affinity and of progressive support for imperialism.

The conservation movement, forerunner of modern American environmentalism, split into two main tendencies. One emphasized the efficient use and management of natural resources to guarantee the nation’s sustained economic growth. The other emphasized the restoration and preservation of natural resources for aesthetic, moral, and recreational reasons. The tension between these tendencies has marked American environmental policy ever since, as the history of the US Forest Service illustrates. John Muir was the most important champion of the second tendency, while Gifford Pinchot was the most important champion of the first.

William Joseph McGee, whom I discussed in a previous article, was also an important representative of the first tendency, and both he and Pinchot supported American imperialism, even serving as important actors in the Theodore Roosevelt administration. McGee was an anthropologist, ethnologist, inventor, geologist, and conservationist. He was an ideologue of conservationism within the Roosevelt administration, even taking part in drafting presidential speeches. McGee also served as vice president and secretary of the Inland Waterways Commission, head of the Bureau of Ethnology, and president and vice president of the National Geographic Society.

For McGee, conservation was the most advanced phase of evolution, understood from a Lamarckian perspective. Like Frederick Jackson Turner, McGee considered territorial expansion decisive in the evolution of the United States. That is why he celebrated and justified the acquisition of a direct transcontinental empire at the end of the nineteenth century. It was precisely in 1898, the year of the Spanish-American War, that McGee delivered, before a joint session of the National Geographic Society and the American Association for the Advancement of Science, an address on the territorial growth of the United States in which he explained, praised, and even legitimized territorial expansion. His address was published in National Geographic Magazine later that same year.

According to McGee, the annexation of Hawaii, the Philippines, and Puerto Rico capped a long but uninterrupted history of American territorial expansion, which he described as a race without parallel, given the tremendous and rapid territorial growth it involved. Moreover, McGee asserted that this was a friendly expansionist race of voluntary annexations, involving no conquests inspired by “mercenary motives.” He insisted, further, that the race benefited the inhabitants of the annexed lands as much as it did Americans. He also asserted that the territorial growth of the United States was nothing less than the expression of its “manifest destiny,” a destiny in accord with the natural laws of evolution. In addition, McGee claimed that territorial growth involved the rapid assimilation and “noble conquest” of nature, the overcoming of natural obstacles through technological innovation born of Americans’ inventive character. Finally, each territorial extension, McGee insisted, was marked by positive and significant effects on the national and individual character of Americans.

For McGee, with that epic history of expansion as precedent, there was no reason to think things would be different with “the garden island of Porto Rico” and “the hundreds of Philippine islands.” With these claims McGee mobilized several of the same concepts used by American imperialists, including the idea of the United States as an exceptional and benevolent nation whose express destiny, beyond continually perfecting its own character, was to expand around the globe, conquer nature, and carry the good news of its innovations, progress itself, to the rest of the planet’s inhabitants. But perhaps the most interesting of McGee’s statements was his characterization of American territorial expansion, of imperialism, as a natural process.

According to McGee, if the new territories represented a small extension of land, a mere ripple in the current of national progress, the process and its consequences would resemble previous expansions. Americans, fulfilling their manifest destiny and guided by benevolence, would rapidly incorporate those lands and their inhabitants, transferring great benefits to their peoples while conquering nature and its brakes on human progress through science and technology. For McGee, possession of the islands required Americans to produce devices that would shorten time and annihilate space: a naval force. The then vice president of the National Geographic Society predicted, probably inspired by Alfred T. Mahan, that the United States would become the “naval nation of the Earth.” For McGee, overcoming those maritime obstacles would mean, as overcoming natural forces had meant in previous expansions, the advance of American character at both the individual and the national level. And that, for him, was nothing other than the progress of humanity itself.

In his article McGee turned to numbers and to several tables and graphs to detail the territorial expansion of the United States across its history. For him, each territorial expansion, measured in square miles, was followed by considerable population growth as well as a significant increase in commercial activity. But for McGee the nation’s authentic growth lay not in those territorial, demographic, and commercial indicators but rather in the advance of American initiative, in the progression of its intellectual, physical, and moral vigor, in what he called the “intelligent individuality” of Americans, who labored together to “elevate” humanity and better the world. This emphasis on social bonds and cooperation was characteristic of progressivism. For McGee, the best indicator, though an indirect one, of the growth of that vigor and individuality, where the nation’s true growth resided, was the wealth derived from territorial expansion:

The strength of America is indeed faintly suggested by broad territorial expanse, teeming millions of people, and half the railways of the world; the real strength lies in the immeasurable capabilities of individuals, who have already made noble conquest of nature’s forces; and there are no units for measuring the spontaneous powers of freemen united by common impulse in the common task of elevating mankind and bettering the world. While there is no direct way of measuring the individuality—much less the unity—of the American people, there are certain values indicating this quality even more clearly than area or population; one of these is wealth, individual and collective.

By converting lucrative American territorial expansion into a natural, and therefore normal, process, McGee offered his listeners, members of the National Geographic Society and the American Association for the Advancement of Science, a Lamarckian interpretation of American imperialism, one similar to Frederick Jackson Turner’s “Frontier Thesis.” For McGee, as he himself put it, American progress resided in the “conquest of nature,” not in the “conquest of nations” or in national policies. For the well-known conservationist-progressive, the history of the United States was that of a nation formed in its collision with nature, a history in which Americans not only adapted to environmental circumstances but transformed their surroundings to their advantage, taking, as Lamarckian evolutionism implies, an active part in the mutation of the environment and consequently of their own species. And that transformation was for McGee as subjective as it was material. In the struggle with nature, American society and its national identity were built. There, too, were built the empire and the very future of all humanity, with the United States at the vanguard of its evolution.

The ideological result of McGee’s Lamarckian, Turnerian, and progressive narrative was conspicuous, coherent, and effective: the naturalization of imperialism. And it worked, as is typical of colonialist rhetoric, in two senses. First, McGee reduced imperialism to a natural phenomenon; American imperialism, part of human history, merely followed natural laws. Second, McGee made imperialism a regular phenomenon, something that routinely occurs and is therefore normal or natural. He made it regular, habitual, ordinary. Territorial expansion was for McGee an ordinary expression of the conquest of nature, one in keeping with the laws of evolution. American imperialism, the influential conservationist affirmed, was not a new national policy but the continuation of a natural process, a century old, successful, and customary in the history of the American nation:

He errs who forgets the history of this country. Every citizen of the United States would do well to remember the decades past, and realize that the growth of 1898 marks no new policy, and is but the normal continuation of a course of development successfully pursued for a century.

Read Full Post »


The Latest Issue of Huellas de Estados Unidos

October 3, 2014

The seventh and most recent issue of the online journal Huellas de Estados Unidos. Perspectivas y debates desde América Latina pays tribute to the recently deceased American historian Gabriel Kolko. A committed Marxist, Kolko produced a valuable body of critical work on various themes in U.S. history, notably the development of American capitalism, the so-called Progressive Era, American imperialism, and the Vietnam War. Among his most important books are The Triumph of Conservatism: A Reinterpretation of American History, 1900-1916 (New York, NY: The Free Press, 1963), Anatomy of a War: Vietnam, the United States, and the Modern Historical Experience (New York, NY: The Free Press, 1985), and Century of War: Politics, Conflicts, and Society since 1914 (New York, NY: The New Press, 1994).

Huellas de Estados Unidos honors this great historian with an editorial from the pen of Pablo A. Pozzi and an essay by Leandro Della Mora on Kolko’s analysis of the Vietnam conflict. Also included are works by Kolko himself: «El fin de la guerra de Vietnam, hace 30 años», «La lección de una derrota total de Estados Unidos», and «Usemos la cabeza. Recetas para el peliagudo planeta de hoy».

Beyond the pieces devoted to Kolko, this issue of Huellas includes an interesting selection of essays on a range of topics. The contributions by Leonardo Pataccini and Arno J. Mayer address the thorny subject of Russian-American relations and the Ukrainian civil war. Valeria L. Carbone and Meghan Keneally examine race relations in the United States, a subject inflamed by recent outbreaks of racial violence. Roberto A. Ferrero traces the development of Mexican-American relations, Marcela Croce analyzes the imperialist uses of Hollywood children’s cinema, and Sonali Kolhatkar profiles the figure of Cesar Chavez.

My congratulations to the editors of Huellas de Estados Unidos for the balance of this issue and for the well-deserved tribute to Gabriel Kolko.

Norberto Barreto Velázquez

Lima, October 3, 2014

Read Full Post »

A Forgotten Stage of the Atlantic Slave Trade

by Gregory E. O’Malley 

HNN September 21, 2014

On January 9, 1786, thirty-five “Men, Women, boys and Girls” from Angola climbed aboard a small brig in Kingston’s busy harbor and returned to sea. They had recently survived an Atlantic crossing to Jamaica with hundreds of other captives, but the vagaries of the Atlantic slave market split them off for another voyage. Embarking on this second ocean passage, the smaller group of captives climbed aboard a much smaller vessel, called Mars. The crew also packed the hold with goods, so the Angolans maneuvered around barrels of rum, sugar, and pimento.

The observant among them gleaned from the sun or stars that this new voyage carried them north, instead of west. They surely noticed a change in the weather. Winter gripped North America, and even in Georgia that January, locals remarked at “the severity of it.” The Mars rocked and thrashed in violent waves whipped up by storms out of the northeast. Frigid rains and high seas drenched the deck with water that dripped and sloshed into the hold. Contrary winds caused an unexpectedly “long passage.” Provisions ran low.

The crew headed for the nearest harbor, but one of the Angolan women succumbed to cold or hunger and “died two days before [the Mars] got into port.” Mercifully, the other thirty-four prisoners survived to reach Savannah, Georgia—probably unaware that their intended destination had been a place called Charleston, farther up the coast. The merchant in charge of selling the survivors perceived them as “a very slight made People,” probably because their passage from Jamaica on short rations made them appear so. One man died “a few days after they arrived.” The others recovered enough for sale into American slavery, but it would be eight months after sailing from Jamaica before the last of them sold.

As typically told, the story of the Atlantic slave trade ends after the ocean crossing. A transatlantic slave ship glides into an American port, planters flock to an auction on the pier, and enslaved people presumably march with new owners to nearby plantations. Slave trade histories usually end with such a sale, but for hundreds of thousands of enslaved African people the journey did not end there. Labor-hungry plantation owners were not the only buyers of weary survivors of the Middle Passage; merchant speculators sought human commodities as well.

Port records, merchant papers, and imperial correspondence all suggest that a thriving intercolonial slave trade dispersed as many as a quarter of the African people who arrived in the New World, extending their dangerous journeys to American plantations. Such “final passages,” after the Atlantic crossing, occurred for a variety of reasons. Some colonial markets were too small to attract vessels directly from Africa with hundreds of slaves, but could be profitably targeted by intercolonial traders with a few enslaved people and an assortment of goods; some European empires enjoyed stronger trading positions in Africa than others, creating supply and price discrepancies across imperial borders in the Americas, setting the stage for smuggling; some important sites of American slavery were inland, requiring overland distribution after the Middle Passage. Whatever the reasons, colonial port records document more than seven thousand such shipments originating in British American colonies alone. Thousands more ventures surely occurred—in other regions and in periods not covered by surviving records.

Despite the vast scale of such intercolonial trafficking, historians have been slow to recognize and examine it, a blind spot especially pronounced for the British Atlantic. The oversight may stem partly from the long shadow that Philip Curtin cast on the field. His path-breaking book, The Atlantic Slave Trade: A Census (1969), was framed by a simple and straightforward question: Just how many African people crossed the Atlantic in the slave trade? That question (and his attempt to answer it by synthesizing regional estimates from the extant secondary scholarship) was an essential starting point for slave trade studies. But in some ways, Curtin’s focus on quantifying the transatlantic migration circumscribed the field—in ways both obvious and more surprising.

Most straightforwardly, for decades after Curtin’s book appeared, slave trade scholars focused on the so-called “numbers game,” with one scholar after another revising Curtin’s estimates. Some used census records and demographic modeling; others counted the captives in port records and shipping returns. Such efforts culminated in Voyages: The Transatlantic Slave Trade Database (www.slavevoyages.org), spearheaded by David Eltis, which seeks to document each individual voyage that carried Africans across the Atlantic. It is a prodigious work that documents more than 35,000 slave-trading ventures. The database improves our knowledge of the trade’s scale, organization, and mortality, and it stands as a monument to scholarly collaboration, with dozens of researchers contributing data. Despite these virtues, however, the database is limited to voyages that crossed the Atlantic—omitting the intercolonial trade—perhaps because that is how Curtin framed the question that launched the field.

More surprising perhaps, critics of such quantitative study have also focused on the Atlantic crossing at the expense of other phases of the trade. In recent years, a rich historiography has called for moving beyond the counting of enslaved people crossing the Atlantic to achieve a more humanizing portrayal—one that reckons more with what enslaved migrants endured, how they understood their journeys, and what cultures they carried with them. Marcus Rediker’s The Slave Ship: A Human History (2008) and Stephanie Smallwood’s Saltwater Slavery (2009), for example, focus explicitly on lived experiences aboard slave ships, on putting a human face on the millions of people who had been counted by other slave trade scholars. Yet these works, too, stop after the Atlantic crossing. They describe the infamous Middle Passage, but do not examine the networks of dispersal that forced beleaguered men and women onward—from Barbados to Savannah, from Jamaica to Panama, or from Charleston to the North American backcountry.

Yet hundreds of thousands of enslaved people did move on. Weary, often ill, angry, and often terrified, they arrived in a first American port only to be purchased by intercolonial speculators. American traders bought enslaved people in one port for transshipment to another, adding additional weeks and new dangers to the voyages of captives. Mortality in this intercolonial trade was devastating for people already debilitated by the Middle Passage. Furthermore, dispersal after the Atlantic crossing often separated transatlantic shipmates who shared language, culture, or even ties of kinship. And the importance of such intra-American trafficking extends beyond the devastating experiences of captives. The intercolonial slave trade spread the institution of slavery to new colonies and helped colonial merchants elaborate their trade networks. Many general traders in the Americas (and imperial policymakers) saw such slave trading as vital to opening a broader business with new customers, entangling the profits of slave trading with all manner of other commerce.

There is a certain irony to slave trade scholars focusing only on the Atlantic crossing—an irony captured in the phrase used to describe that journey. For most twenty-first-century readers, “Middle Passage” conjures thoughts of the horrific experiences of African captives in their forced Atlantic crossings, but the voyage was termed “middle” to reflect European, not African, experience. For European traders the transatlantic voyage typically formed the second leg of a three-part journey: a first passage, from Europe to Africa with trade goods; a “middle” passage, from Africa to America with slaves; and a third voyage, from America back to Europe with colonial staples. This “triangle” trade gave the Middle Passage its name. Despite these Eurocentric origins, scholars have claimed the term for the slave trade’s victims. But ironically, “Middle Passage” actually fits the experiences of African migrants better than most scholars have realized. The journeys of enslaved Africans did not begin at their ports of embarkation for the ocean crossing, nor did they end when transatlantic vessels reached the Americas. Instead, people often fell into slavery deep in the African interior, facing a first passage to the Atlantic coast; likewise, many enslaved people spread outward after the Middle Passage, often settling hundreds or even thousands of miles away from their first American landfall. Understanding the African migration experience—and the full profits of slave trading—requires reckoning with these final passages after the Atlantic crossing.

Gregory E. O’Malley is an Associate Professor of History at the University of California, Santa Cruz and the author of “Final Passages: The Intercolonial Slave Trade of British America, 1619-1807” (2014).

Read Full Post »

The frontispiece from the Memoirs of Henry Obookiah, published 1818. Photo: Wikimedia Commons.

The Heathen School: A Story of Hope and Betrayal in the Age of the Early Republic
John Demos

Great failure is often more enduring than we realize. Before the downward spiral, the effort seems to cast the future in its image. It captures a moment and then goes uncommemorated. Yet it does not go away. It is as if the hopes it once contained continue to smolder.

The Paris Commune, the revolutionary socialist government that ruled the French capital in the spring of 1871, was such a failure: virtually erased from the public memory of modern Paris, but an inspiration to generations of socialists before the Russian Revolution and a corresponding source of fear for their opponents. Another such failure was the Foreign Mission School of Cornwall, Connecticut, the subject of John Demos’s new book, The Heathen School, freshly longlisted for the 2014 National Book Award.

The comparison, I concede, seems grandiose. The Commune left thousands, possibly tens of thousands, dead and large swaths of Paris in ruins. The Foreign Mission School destroyed only itself, leaving disillusioned graduates and an embittered and divided local community that threatened, but never executed, violence. It did its damage at a distance.

What unites the Commune with the Foreign Mission School is the bright and defining hope each originally contained and the disappointment each eventually produced. The Commune was a moment when France seemed to augur a new day; the school embodied equivalent optimism for the United States. Cornwall was a visible world of farms, forests, and villages but also an invisible world where God and Satan contested. God’s victory would be America’s gift to posterity.

The Heathen School, as it was called in everyday speech, became an American exercise in revolutionary uplift designed to transform the vast non-Christian world into something that looked like Connecticut. Instead of sending missionaries to the heathen, the school brought the heathen to the missionaries. The school would transform young men into Christians able to become missionaries or to assist them. It was part of an American project to spread republicanism and Protestant Christianity—for Americans regarded the two as inextricably linked—across the globe.

Demos possesses an uncanny ability to see the reflection of a much larger world in the towns of colonial New England and the early republic. In The Heathen School, what Demos discerns is American exceptionalism: the proposition that the United States is a chosen nation whose history diverges from all others and whose destiny will determine the fate of the world. It is an idea still embraced by most American politicians (even when they are smart enough not to believe it) and loathed by most American historians.

Extravagant ideas can alight on modest places. Cornwall is a small town in what was, during the early nineteenth century, the heartland of a New England evangelicalism determined to change the world. Some of the locals were articulate proponents of American exceptionalism and made it the rationale for the school. The United States was, according to Yale College President Timothy Dwight, the place where “Empire’s brightest throne shall rise.” Lyman Beecher of Connecticut—the father of Henry Ward Beecher and Harriet Beecher Stowe, who followed the reforming zeal of evangelicalism into abolition—already knew the answer when he asked, “From what nation shall the renovating power go forth?” There was less a fine line between American benevolence and American imperialism than no line at all.

It later became a cliché that Protestant missionaries to Hawaii, including those associated with the Heathen School, “came to do good and did well,” but the original enthusiasm for uplift was genuine. These were people who thought the millennium might be at hand. The American Board of Commissioners for Foreign Missions, sponsor of the Foreign Mission School, reversed the connection between expanding American trade and spreading the Gospel. “Natives of almost every heathen country” were being drawn from their homes by American commerce, the Board said. If not converted, they would bring the worst of American society back to their lands, corrupting their countrymen and prejudicing them against Christianity. The Foreign Mission School would take non-Christians drawn to the United States by commerce, or those who already lived within its boundaries, educate them, convert them, and send them home to transform their homelands.

The school was thus ancestral to a variety of American projects designed to make foreigners into instruments of conversion, people who would turn their countrymen into people like us. Our current rationale in training military officers and economists is not so different from that for training missionaries. As the sponsors of the Heathen School knew, the results could be disappointing. Frequently, they still are, unless you consider the likes of General Abdel Fattah el-Sisi and Mohamed Morsi, both partially educated on American shores, successful at creating New England in Egypt.

• • •

We tend not to look closely at the societies we expect to transform. We collapse them into largely undifferentiated lumps. This is true now as it was then. The very term Heathen School conveyed the American sense of a vast, indistinguishable mass of non-Christians. The students who came to the school were, however, disparate. Hawaiians dominated the first class, but it also included an Abenaki Indian, a Bengali, and a man named John Johnson, whose father was the child of an “English gentleman” and a “Hindoo woman” and whose mother was “a Jewess of the race of black Jews.” Later Tahitians came, as did at least one more Jew, a student from Timor, a Malay held as a slave in China, a Chinese, and two Greek boys from Malta. The students came from the four corners of the earth, but they were heathens one and all.

Demos breaks the undifferentiated mass into particular people. He concentrates on a small set of individuals—Henry Obookiah, who was Hawaiian, John Ridge and Elias Boudinot, both of whom were Cherokees from Georgia, and Sarah Bird Northrup and Harriet Gold, who were from Cornwall. The desire for salvation ran together with more earthly desires. The result is a book as much about psychology as theology and as much about intimacy as commerce.

In Demos’s books people who think they control events find themselves shaken by those supposedly under their influence. But the Hawaiian Henry Obookiah, who both in a sense created the Heathen School and was its chief product, was not the challenge that brought the imperial dream down.

Events far from New England uprooted Obookiah and deposited him in Connecticut. The internal wars that yielded the kingdom of Hawaii orphaned Obookiah, and the China and Pacific trade, of which the Hawaiian Islands were an integral part, set him in motion. He became a Kanaka, an expatriate Hawaiian sailor, who made his way to New England and arrived at Yale in search of an education. In Demos’s interpretation he was in search of family; he thought he found it in Connecticut.

Obookiah underwent a classic Protestant conversion experience and came “home to New Jerusalem,” entering the church on April 9, 1815. It was Obookiah who formulated a plan to return to Hawaii “to preach the Gospel to my Countrymen” in their own language. He became the most celebrated of the group of Hawaiians who formed the nucleus of the Foreign Mission School’s first class. It was, the American Board believed, the hand of providence that brought Obookiah to Connecticut. The founders felt “confident that this thing is from God . . . [and] will, among others, be a means of evangelizing the world.” Obookiah did seem to be the real thing. He invented an orthography for writing Hawaiian, learned Hebrew, and grew famous, which proved useful for raising money and advancing the cause.

Obookiah died of typhus in 1818, one of those fortunate deaths that frees a person from responsibility for failures to come. As was the custom, his deathbed scene was fully described and his words recorded. Lyman Beecher preached his eulogy. His ghostwritten Memoirs would go through “about a dozen editions,” according to Demos. His goals, though, were largely unfulfilled. In Hawaii the missionaries, accompanied by several of the graduates of the Foreign Mission School, made converts, but the students were by and large a disappointment. In time the Americans took over the islands, enriched themselves, and largely dispossessed the inhabitants, who dwindled in numbers.

When Obookiah died the Hawaiian missionaries had not yet departed, nor had John Ridge, Elias Boudinot, and the other Cherokee students arrived at the Heathen School. After 1818 American Indians would dominate the student body. There was tension between the Indians and the Pacific Islanders; there were issues with truancy, discipline, and uneven academic achievement. But most troubling were relationships between the Cornwall girls and the scholars, or, as officials put it, “the colored boys.”

The desire to save the Indians, and a long history of sexual relations between Indian women and white men, did not prepare Cornwall for consensual sexual relations—in or out of marriage—between its white women and the school’s Indian men. To many readers, this will not come as a surprise, but the history of interracial sex is far more complicated than most Americans believe, and even more complicated than Demos makes it here. In the nation’s first days, it was fairly common and, if not fully accepted in all configurations, not routinely condemned or punished. But as the nineteenth century went on, prejudices against what became known as miscegenation intensified and hardened. The end of slavery—and with it the guaranteed subordination of black men and the coerced availability of black women—alongside worries about inheritance and property transmission and changing ideas about race all made interracial sex less tolerated than it had been earlier in American history. In Cornwall signs of this resistance appeared early.

John Ridge was from a leading Cherokee family and had already been to mission schools within the Cherokee Nation before he came to Cornwall in 1818. His romance with Sarah Northrup would have been utterly conventional had he not been Cherokee and she not been white. He was sick and entered the Northrup home. Sarah and her mother nursed him. He fell in love with Sarah and she with him.

The family sought to disrupt the romance by sending Sarah to her grandparents. The American Board decided it was time for John to return home, but neither distance nor time stilled their passion for each other—a passion that disturbed the social order. John Ridge published a denunciation of racial prejudice that allowed the “most stupid and illiterate white man” to disdain the most polished Indian. With Sarah’s devotion to John remaining strong, and her parents fearful that she would waste away longing for him and become vulnerable to consumption, Sarah’s family agreed to the marriage. It took place in January 1824, after John returned to Cornwall. Although some defended the marriage, much of Cornwall was outraged, and threats of violence accompanied the denunciations. John and Sarah moved to New Echota in the Cherokee Nation.

The marriage of John Ridge’s cousin Elias Boudinot to Harriet Gold bred even greater resentment and brought public demonstrations of disapproval. Harriet’s brothers and sisters and their spouses bitterly opposed the marriage. One of her brothers-in-law, the Reverend Cornelius Everest, wrote, “We weep; we sigh; our feelings are indescribable. Ah, it all is to be summed up in this—our sister loves an Indian! Shame on such love.” A minister from a neighboring town married Elias and Harriet in March of 1826 because the local minister refused to do so. They, too, would depart for the Cherokee Nation.

The school defended racial equality in the abstract, but not the actual fact of the marriages. Its evangelical supporters would not accept intermarriage, and the Ridge-Northrup wedding appears to have precipitated a decline in contributions. The founders had lost faith in their scholars, the last of whom would leave in 1828. Most of the graduates were disappointments to their teachers.

• • •

With the Boudinot-Gold marriage, Demos’s attention shifts to Cherokee country, and he signals the shift with what he calls an interlude. Demos narrates his own journeys paralleling those of his characters. He traveled to Hawaii to find Obookiah’s birthplace. And nearly two centuries after the Ridges and Boudinots settled in New Echota, Demos went for a visit.

We cannot time travel. A stop in Cornwall, or New Echota, or Obookiah’s birthplace leaves the visitor firmly in the present. But the past often lingers; its evidence endures. There are original buildings in Cornwall, fewer in New Echota. And at these sites stories and storytellers meet. Right here, in this house, this happened; here, these people once lived.

The historian’s next step is at once problematic and wondrous. Demos takes it. “In my mind’s eye I can glimpse the scholars passing in and out,” he writes of his visit to Cornwall. Being there “lessened the distance between my own world and that of the school.” Similarly in Georgia he muses that, for Harriet Gold, New Echota was a blank space to be filled in by experience. “So too, in my own case: an equally blank space. Until I have a chance to go there.” He travels to encounter traces of the past that remain visible.

That past was a Cherokee past, and what happened to the Cherokees in the 1820s and 1830s was a disgrace to the United States, but it was not a simple story, and Demos does not try to suggest otherwise. The Cherokee story shadowed, he writes, “on a vastly grander scale, that of the Foreign Mission School—high hopes, valiant efforts, leading to eventual tragic defeat.”

The same sense of mission and providential destiny that created the mission school ultimately did in the Cherokees. This is not to say the American Board destroyed them; many of their missionaries remained ardent supporters of the Cherokees’ attempt to retain their homeland. But the very sense of Christian superiority and providential favor for the United States embedded in the school also inspired those who sought to dispossess the Cherokees. Indians recognized this, and tried to counter it. They sought to separate American providential thinking into its secular and religious strains and pit them against each other. Indians hoped Christians would not evict Christians. They would, and they did.

Both Ridge and Boudinot had reason to doubt the value of the American Board as an ally, and neither thought that the United States would honor existing treaties. Seeing resistance as hopeless, they joined the Treaty Party, which ceded the Cherokees’ homeland to the United States. The Treaty Party had no authority, and the vast majority of Cherokees who followed Principal Chief John Ross opposed them and their treaty, which was ratified, if only barely, by the Senate. In what Demos rightly describes as ethnic cleansing, the Cherokees and their neighbors lost their land, and many lost their lives in government roundups and a forced march west. For enabling this dispossession and dislocation, Ridge and Boudinot would pay with their lives when the surviving Cherokees reached Indian Territory.

The removal of the Cherokees would seem to make the tale of the Heathen School a familiar American story in which race takes center stage. Racial prejudice sought to thwart the marriages of the Ridges and Boudinots and ultimately did in the school itself. Racial prejudice launched the Cherokees on the Trail of Tears. But if race in the United States is a familiar topic, it is also a complicated one, and Demos shows its complications. His great strength as a historian is his ability to move effortlessly from the personal to the national, and when he does so here, a story about heathens and “colored boys” expands to include black slaves.

Many members of the Cherokee elite were slaveholders, and when Sarah Ridge, née Northrup, moved to Georgia, she mutated from a Yankee to a plantation mistress. She was in the eyes of both Cherokees and black slaves a “white lady,” the very status that brought so much trouble in Cornwall. With her husband’s assassination, Sarah was described as having “a dead heart in a living bosom.” Her Cherokee relatives sought to strip her and her children of their inheritance since she was “a white lady and had no clan.” She lived by hiring out her slaves. Her sons grew up quarrelsome and violent. They, along with a sizeable number of anti-Ross Cherokees, stood with the Confederacy, as did, although Demos does not mention it, Boudinot’s son, Elias Cornelius.

Lyman Beecher’s descendants became abolitionists, but the descendants of the leading Cherokee graduates of the Heathen School joined the Confederacy in defense of human slavery. Two of them, John Rollin Ridge and Elias Cornelius Boudinot, eventually fled the Cherokee Nation under threat of death and ended up alienated from both their New England and Cherokee roots. The failures of the Heathen School had only ramified.

Demos draws a parallel between Cornwall’s opposition to interracial marriage in the nineteenth century and the illegality of same-sex marriage in the twenty-first. His intent, I think, is something more than to compare inequities, particularly since, with same-sex marriage now legal in Connecticut, the analogy might produce comforting feelings of growing tolerance. Demos is too good a historian to think the past will be much of a comfort to us. He has crafted the book otherwise. His heroes, Sarah and John Ridge, do not become villains, but they are more than simply victims of racism. Similarly the Cherokees and Hawaiians were betrayed and despoiled, but they were not innocents.

Demos’s analogies have a deeper target: the American sense of being a beacon to the world, its last best hope. This only leads us astray. We want to shape the world without the world touching us and revealing our own limits and prejudices, but more than that we insist on foreigners being unrealized versions of ourselves. We educate the Sisis and Morsis thinking they will become agents of our desires and in so doing forget that they, like the students at the Heathen School, were never ours to shape.

Richard White, Margaret Byrne Professor of American History at Stanford University, is author, most recently, of Railroaded: The Transcontinentals and the Making of Modern America.