
Archive for the ‘Imperialismo norteamericano’ Category

Lying to Children About the California Missions and the Indians

By Deborah A. Miranda

Huffington Post March 23, 2015

All my life, I have heard only one story about California Indians: godless, dirty, stupid, primitive, ugly, passive, drunken, immoral, lazy, weak-willed people who might make good workers if properly trained and motivated. What kind of story is that to grow up with?

The story of the missionization of California.

In 1769, after missionizing much of Mexico, the Spaniards began to move up the west coast of North America to establish claims to rich resources before other European nations could get a foothold. Together, the Franciscan priests and Spanish soldiers «built» a series of 21 missions along what is now coastal California. (California’s Indigenous peoples, numbering more than 1 million at the time, did most of the actual labor.) These missions, some rehabilitated from melting adobe, others in near-original state, are now among the state’s biggest tourist attractions; in the little town of Carmel, Mission San Carlos Borromeo de Carmelo is the biggest attraction. Elsewhere, so-called Mission décor drenches Southern California, from restaurants to homes, apartment buildings, animal shelters, grocery stores, and post offices. In many neighborhoods, a bastardized Mission style is actually required by cities or neighborhood associations. Along with this visual mythology of adobe and red clay roof tiles comes the cultural storytelling that drains the missions of their brutal and bloody pasts for popular consumption.

In California schools, students come up against the «Mission Unit» in 4th grade, reinforcing the same lies those children have been breathing in most of their lives. Part of California’s history curriculum, the unit is entrenched in the educational system and impossible to avoid, a powerfully authoritative indoctrination in Mission Mythology to which 4th graders have little if any resistance. Intense pressure is put upon students (and their parents) to create a «Mission Project» that glorifies the era and glosses over both Spanish and Mexican exploitation of Indians, as well as enslavement of those same Indians during U.S. rule. In other words, the Mission Unit is all too often a lesson in imperialism, racism, and Manifest Destiny rather than actually educational or a jumping-off point for critical thinking or accurate history.

In Harcourt School Publishers’ California: A Changing State, the pursuit of gold, riches, and settlements through violence by Spanish, English, and Russian explorers is well enunciated throughout Unit 2 and dressed in exciting language, such as on page 113: «In one raid, Drake’s crew took 80 pounds of gold!»

In the four opening pages of Chapter 3, devoted to Father Junípero Serra, the textbook urges students to sympathize with the Spanish colonial mission:

Mile after mile, day after day, week after week, the group traveled across the rugged terrain. As their food ran low, many of the men grew tired and sick. Father Serra himself suffered from a sore on one leg that grew worse each day. And yet he never gave up, calling on his faith in God to keep himself going.

The language jumps between acknowledging the subjugation of Indigenous peoples and suggesting mutually beneficial exchanges. In Lesson 3, «The Mission System» opens: «Indians were forced to build a chain of missions.» Subsequent language emphasizes the alleged benefits to the Indians:

At the missions, the priests worked to create loyal Spanish subjects. . . . They would move the California Indians into the missions, teach them to be Christians, and show them European ways. [Emphasis added.]

Visiting the mission as an adult, a proud, mixed-blood California Indian woman, I found myself unprepared for gift shops well stocked with CDs of pre-researched Mission Projects; photocopied pamphlets of mission terms, facts, and history (one for each mission); coloring books; packaged models of missions («easy assembly in 10 minutes!»); and other project paraphernalia for the discerning 4th grader and his or her worried parents.

The Carmel Mission website maintains a «4th Grade Corner» where daily life for padres and their «Indian friends» who «shared what little food and supplies they had» is blissfully described. Other websites offer «easy,» «quick,» and «guaranteed A+!!!» Mission Projects, targeting those anxious parents, for a price.

Generations of Californians have grown up steeped in a culture and education system that trains them to think of Indians as passive, dumb, and disappeared. In other words, the project is so well established, in such a predictable and well-loved rut, that veering outside of the worn but comfortable mythology is all but impossible.

On my visit to Mission Dolores, I found that out in a particularly visceral way.

It was over winter break, 2008. I was in San Francisco for a conference, and my friend Kimberly and I had hopped on a streetcar to visit Mission Dolores. As we emerged from the mission church via a side door into a small courtyard (featuring one of those giant dioramas behind glass), we inadvertently walked into video range of a mother filming her daughter’s 4th-grade project.

Excusing ourselves, we studiously examined the diorama while the little girl flubbed her lines a few times. She was reading directly from the flyer given to tourists in the gift shop and could say «basilica» but not «archdiocese,» yet she maintained her poise through several takes until she nailed it.

Both mothers ourselves, Kimberly and I paused to exchange a few words of solidarity about school projects with the mother, which gave Mom the chance to brag about how she and Virginia were trying to «do something a little different» by using video instead of making a model.

«That’s great!» I said, giving them both a polite smile. «I’ll bet your teacher will be glad to have something out of the ordinary.»

«Well, it is different actually being right here,» Mom said excitedly. «To think about all those Indians and how they lived all that time ago, that’s kind of impressive.»

I could not resist: «And better yet,» I beamed, «still live! Guess what? I’m a member of the Ohlone/Costanoan-Esselen Nation myself! Some of my ancestors lived in this mission. I’ve found their names in the Book of Baptism.» (I didn’t mention that they are also listed in the Book of Deaths soon afterward.)

The mother was beside herself with pleasure, posed me with her daughter for a still photo, and wrote down my name so she could Google my work. Little Virginia, however, was shocked into silence. Her face drained, her body went stiff, and she stared at me as if I had risen, an Indigenous skeleton clad in decrepit rags, from beneath the clay bricks of the courtyard. Even though her mother and I talked a few more minutes, Virginia the 4th grader—previously a calm, articulate news anchor in training—remained a shy shadow, shooting side glances at me out of the corner of her eyes.

As Kimberly and I walked away, I thought, «That poor kid has never seen a live Indian, much less a ‘Mission Indian’—she thought we were all dead!» Having me suddenly appear in the middle of her video project must have been a lot like turning the corner to find the (dead) person you were talking about suddenly in your face, talking back.

Kimberly, echoing my thoughts, chortled quietly, «Yes, Virginia, there really are live Mission Indians.»

The problem is that, thanks to Mission Mythology, most 4th graders will never know that, and the textbooks do nothing to give visibility to modern California Indians.

Throughout the rest of California: A Changing State, mentions of California Indians are brief, casting them as victims fading into history. On page 242, under the heading of «A Changing Population,» Harcourt states simply, «California Indians were hurt by the gold rush. . . . Many were forced off their lands when the miners found gold there.»

Many pages later, California Indians are mentioned again when the textbook devotes five paragraphs to Indian governments. Although 109 tribes are recognized in California, in the text they are faceless, noted only by red square dots on a map.

It’s time for the Mission Fantasy Fairy Tale to end. This story has done more damage to California Indians than any conquistador, any priest, any soldado de cuera (leather-jacket soldier), any smallpox, measles, or influenza virus. This story has not just killed us; it has also taught us to kill ourselves and kill each other with alcohol, domestic violence, horizontal racism, internalized hatred. We have to put an end to it now.
This article is adapted from Deborah Miranda’s book Bad Indians: A Tribal Memoir and is reprinted here with permission of the author. This article is part of the Zinn Education Project’s If We Knew Our History series.

Deborah A. Miranda is the author of Bad Indians: A Tribal Memoir (Heyday Books, 2012). Miranda is an enrolled member of the Ohlone/Costanoan-Esselen Nation of California, and is also of Chumash and Jewish ancestry. She is the John Lucian Smith Jr. Professor of English at Washington and Lee University, and says reading lists for her students include as many books by «bad Indians» as possible. Visit Deborah Miranda’s blog, BAD NDNS.


Read Full Post »

‘The Age of Acquiescence,’ by Steve Fraser

Naomi Klein

The New York Times  March 16, 2015

Illustration: Nishant Choksi

For two years running, Oxfam International has traveled to the World Economic Forum in Davos, Switzerland, to make a request: Could the superrich kindly cease devouring the world’s wealth? And while they’re at it, could they quit using “their financial might to influence public policies that favor the rich at the expense of everyone else”?

In 2014, when Oxfam arrived in Davos, it came bearing the (then) shocking news that just 85 individuals controlled as much wealth as half of the world’s population combined. This January, that number went down to 80 individuals.

Dropping this news in Davos is a great publicity stunt, but as a political strategy, it’s somewhat baffling. Why would the victors of a class war choose to surrender simply because the news is out that they have well and truly won? Oxfam’s answer is that the rich must battle inequality or they will find themselves in a stagnant economy with no one to buy their products. (Davos thought bubble: “Isn’t that what cheap credit is for?”)

Still, even if some of the elite hand-wringing about inequality is genuine, are reports really the most powerful weapons out there to fight for a more just distribution of wealth? Where are the sit-down strikes? The mass boycotts? The calls for expropriation? Where, in short, are the angry masses?

Oxfam’s Davos guilt trip doesn’t appear in Steve Fraser’s “The Age of Acquiescence: The Life and Death of American Resistance to Organized Wealth and Power,” but these are the questions at the heart of this fascinating if at times meandering book. Fraser, a labor historian, argues that deepening economic hardship for the many, combined with “insatiable lust for excess” for the few, qualifies our era as a second Gilded Age. But while contemporary wealth stratification shares much with the age of the robber barons, the popular response does not.

As Fraser forcefully shows, during the first Gilded Age — which he defines loosely as the years between the end of the Civil War and the market crash of 1929 — American elites were threatened with more than embarrassing statistics. Rather, a “broad and multifaceted resistance” fought for and won substantially higher wages, better workplace conditions, progressive taxation and, ultimately, the modern welfare state (even as they dreamed of much more).

To solve the mystery of why sustained resistance to wealth inequality has gone missing in the United States, Fraser devotes the first half of the book to documenting the cut and thrust of the first Gilded Age: the mass strikes that shut down cities and enjoyed the support of much of the population; the Eight Hour Leagues that dramatically cut the length of the workday, fighting for the universal right to leisure and time “for what we will”; the vision of a “ ‘cooperative commonwealth’ in place of the Hobbesian nightmare that Progress had become.”

He reminds readers that although “class war” is considered un-American today, bracing populist rhetoric was once the lingua franca of the nation. American presidents bashed “moneycrats” and “economic royalists,” and immigrant garment workers demanded not just “bread and roses” but threatened “bread or blood.” Among many such arresting anecdotes is one featuring the railway tycoon George Pullman. When he died in 1897, Fraser writes, “his family was so afraid that his corpse would be desecrated by enraged workers, they had it buried at night . . . in a pit eight feet deep, encased in floors and walls of steel-reinforced concrete in a lead-lined casket covered in layers of asphalt and steel rails.”

Fraser offers several explanations for the boldness of the post-Civil War wave of labor resistance, including, interestingly, the intellectual legacy of the abolition movement. The fight against slavery had loosened the tongues of capitalism’s critics, forging a radical critique of the market’s capacity for barbarism. With bonded labor now illegal, the target pivoted to factory “wage slavery.” This comparison sounds strange to contemporary ears, but as Fraser reminds us, for European peasants and artisans, as well as American homesteaders, the idea of selling one’s labor for money was profoundly alien.

This is key to Fraser’s thesis. What fueled the resistance to the first Gilded Age, he argues, was the fact that many Americans had a recent memory of a different kind of economic system, whether in America or back in Europe. Many at the forefront of the resistance were actively fighting to protect a way of life, whether it was the family farm that was being lost to predatory creditors or small-scale artisanal businesses being wiped out by industrial capitalism. Having known something different from their grim present, they were capable of imagining — and fighting for — a radically better future.

It is this imaginative capacity that is missing from our second Gilded Age, a theme to which Fraser returns again and again in the latter half of the book. The latest inequality chasm has opened up at a time when there is no popular memory — in the United States, at least — of another kind of economic system. Whereas the activists and agitators of the first Gilded Age straddled two worlds, we find ourselves fully within capitalism’s matrix. So while we can demand slight improvements to our current conditions, we have a great deal of trouble believing in something else entirely.

Fraser devotes several chapters to outlining the key “fables” which, he argues, have served as particularly effective resistance-avoidance tools. These range from the billionaire as rebel to the supposedly democratizing impact of mass stock ownership to the idea that contract work is a form of liberation. He also explores various forces that have a “self-policing” impact — from mass indebtedness to mass incarceration; from the fear of having your job deported to the fear of having yourself deported.

With scant use of story or character development, this catalog of disempowerment often feels more like an overlong list than an argument. And after reading hundreds of pages detailing depressing facts, Fraser’s concluding note — that “a new era of rebellion and transformation” might yet be possible — rings distinctly hollow.

This need not have been the case. Fraser spares only a few short paragraphs for those movements that are attempting to overcome the obstacles he documents — student-debt resisters, fast-food and Walmart workers fighting for a living wage, regional campaigns to raise the minimum wage to $15 an hour or the various creative attempts to organize vulnerable immigrant workers. We hear absolutely nothing directly from the leaders of these contemporary movements, all of whom are struggling daily with the questions at the heart of this book.

That’s too bad. Because if hope is to be credible, we need to hear not just from yesterday’s dreamers but from today’s as well.

THE AGE OF ACQUIESCENCE

The Life and Death of American Resistance to Organized Wealth and Power

By Steve Fraser

470 pp. Little, Brown & Company. $28.

Naomi Klein is an award-winning journalist, syndicated columnist, and author. 

Steve Fraser is Visiting Associate Professor of Economic History at New York University.

Read Full Post »

Sharon Quinsaat, Mobilizing Ideas

Fifty years ago, on March 8, 1965, the U.S. Marines landed in Da Nang, marking the beginning of the American ground war in Vietnam. Protests erupted all over the U.S., with the largest anti-war demonstration in the country—the March Against the War, organized by Students for a Democratic Society—taking place on April 17. Radicalism in the 60s has been the subject of social movement theories that set the direction of contemporary scholarship. But scholars in the field were remiss in examining a contentious group in American society: Asian Americans.

While Sid Tarrow was visiting Pittsburgh early this month, we had a conversation about the dearth of studies on Asian American mobilization, especially in the 1960s. In recent years, we have noticed a rise in scholarship on the Asian American movement (AAM). But based on a cursory look at undergraduate and graduate courses in social movements, Asian Americans remain invisible in mainstream…

View the original post (1,165 more words)

Read Full Post »


How Cotton Remade the World

The Civil War cotton shock didn’t just shake the American economy

By Sven Beckert

Politico.com January 30, 2015

The American Civil War is one of the best-researched events in human history. Hundreds of historians have dedicated their professional careers to its study; thousands of articles and books have been published on its battles, politics and its cultural and social impact. Discussions of the war permeate everything from popular films to obscure academic conferences. Would we expect any less for a defining event in our history—an event that can persuasively be described as the second American Revolution? Certainly not.

Yet given all that attention, it is surprising that we have spent considerably less effort on understanding the war’s global implications, especially given how far-reaching they were: The war can easily be seen as one of the great watersheds of 19th-century global history. American cotton, the central raw material for all European economies (and also those of the northern states of the Union), suddenly disappeared from global markets. By the end of the war, even more consequentially, the world’s most important cotton cultivators, the enslaved workers of the American South, had attained their freedom, undermining one of the pillars on which the global economy had rested: slavery. The war thus amounted to a full-fledged crisis of global capitalism—and its resolution pointed to a fundamental reorganization of the world economy.

When we look at capitalism’s history, we usually look at industry, at cities and at wage workers. It is easy for us to forget that much of the change we associate with the emergence of modern capitalism took place in agriculture, in the countryside. With the rise of modern industry after the Industrial Revolution of the 1780s, the pressures on this countryside to supply raw materials, labor and markets increased tremendously. Since modern industry had its origins everywhere in the spinning and weaving of cotton, European and North American manufacturers quite suddenly demanded access to vastly increased quantities of raw cotton.

That cotton came almost exclusively from the slave plantations of the Americas—first from the West Indies and Brazil, then from the United States. When American cotton growers began to enter global markets in the 1790s after the revolution on Saint Domingue—once the world’s most important cotton-growing island—they quickly came to play an important, in fact dominant, role. Already in 1800, 25 percent of cotton landed in Liverpool (the world’s most important cotton port) originated from the American South. Twenty years later that number had increased to 59 percent, and in 1850 a full 72 percent of cotton imported to Britain was grown in the United States. U.S. cotton also accounted for 90 percent of total imports into France, 60 percent of those into the German lands and 92 percent of those shipped to Russia. American cotton captured world markets in a way that few raw material producers had before—or have since.

Planters in the United States dominated production of the world’s most important raw material because they possessed a key combination: plentiful land, recently taken from its native inhabitants; plentiful slave labor, made available by the declining tobacco agriculture of the upper South; and access to European capital. European merchants’ earlier efforts to secure cotton crops from peasant producers in places such as Anatolia, India and Africa had failed, as local producers refused to focus on the mono-cultural production of cotton for export, and European merchants lacked the power to force them. It was for that reason that cotton mills and slave plantations had expanded in lockstep, and it was for that reason that the United States became important to the global economy for the first time.

Slave plantations were fundamentally different sites of production than peasant farms. On plantations, and only on plantations, owners could dominate all aspects of production: Once they had taken the land from its native inhabitants, they could force enslaved African-Americans to do the backbreaking labor of sowing, pruning and harvesting all that cotton. They could control that labor with unusual brutality, and could deploy and redeploy it without any constraints, lowering the costs of production. With the expansion of industrial capitalism, this strange form of capitalism expanded, and European capital in search of cotton flowed to the slave areas of the world in ever-greater quantities. This world was not characterized by contracts, the rule of law, wage labor, property rights or human freedom, but by the opposite: arbitrary rule, massive expropriations, coercion, slavery and unfathomable violence. I call this form of capitalism “war capitalism”; it flourished in parts of the United States and eventually resulted in civil war.

Slavery stood at the center of the most dynamic and far-reaching production complex in human history. Herman Merivale, British colonial bureaucrat, noted as much in 1839 when he observed that “the greater part of our cotton [is] raised by slaves,” and Manchester’s and Liverpool’s “opulence is as really owing to the toil and suffering of the negro, as if his hands had excavated their docks and fabricated their steam-engines.”

As the cotton industry of the world expanded, with spinning and weaving mills cropping up in fast-industrializing areas, the cotton-growing complex migrated ever further into the American West, to Alabama, Mississippi and eventually Texas, drawing on ever more slave labor. By 1830, one in 13 Americans grew cotton, one million people in total, nearly all of them enslaved. In one of the most violent episodes in American history, one million enslaved workers were uprooted and sold from the upper South into cotton growing states such as Mississippi, Alabama and Louisiana, where their labor fueled a vast profit-making machine. This machine enriched not just the plantation owners, but also merchants in New York and Boston and Liverpool, as well as manufacturers in Alsace, Lancashire and New England. Slavery in the United States had become central to the functioning of the global economy, as South Carolina cotton planter Sen. James Henry Hammond observed quite accurately when he argued, “Cotton is king.”

***

When war broke out in April of 1861, this global economic relationship collapsed. At first, the Confederacy hoped to force recognition from European powers by restricting the export of cotton. Once the South understood that this policy was bound to fail because European recognition of the Confederacy was not forthcoming, the Union blockaded southern trade for nearly four years. The “cotton famine,” as it came to be known, was the equivalent of Middle Eastern oil being removed from global markets in the 1970s. It was industrial capitalism’s first global raw materials crisis.

The effects were dramatic: In Europe, hundreds of thousands of workers lost employment, and social misery and social unrest spread through the textile cities of the United Kingdom, France, Germany, Belgium, the Netherlands and Russia. In Alsace, posters went up proclaiming: Du pain ou la mort. Bread or death. Since very little cotton had entered world markets from non-enslaved producers in the first 80 years after the Industrial Revolution, many observers were all but certain that the crisis of slavery, and with it of war capitalism, would lead to a fundamental and long-lasting crisis of industrial capitalism as well. Indeed, when Union Gen. John C. Frémont emancipated the first slaves in Missouri in the fall of 1861, the British journal The Economist worried that such a “fearful measure” might spread to other slaveholding states, “inflict[ing] utter ruin and universal desolation on those fertile territories” and also on the merchants of Boston and New York, “whose prosperity … has always been derived” to a large extent from slave labor.

Yet to the surprise of many, the American Civil War did not result in a permanent crisis of industrial capitalism, but instead in the emergence of a fundamentally new relationship between industry and the global countryside, one in which industry drew on peasant, not slave, produced cotton. Already during the war itself, determined European manufacturers and imperial statesmen opened up new sources for raw cotton in India, Brazil, Egypt and elsewhere. So rapid was the expansion in Egypt, for example, that Egyptian historians consider the American Civil War one of the most important events in their own 19th-century history. New infrastructures, new laws, new capital and new administrative capacities were pushed into the global countryside. Combined with rapidly rising prices for raw cotton, these changes resulted in a world where for the first time ever, peasant producers sold large quantities of raw cotton into world markets, preventing the total collapse of the European industry and connecting the countryside to the cities in ways that had never been seen before.

India provides a good example for these transformations. The British imperial government built railroads into the cotton-growing hinterland. It changed Indian contract law to enable merchants to advance capital to cultivators on the security of their crop and land. European merchants, who had until then played a subordinate role in trading Indian cotton, now moved into cotton-growing regions, advanced capital to growers and built steam-powered cotton gins and cotton presses. The newly invented telegraph enabled price information to travel quickly, and by the 1870s European manufacturers could order cotton from hinterland towns in India and have it delivered to their factories in just six weeks.

Indian cultivators, like those elsewhere, increasingly specialized in the production of cotton for export, moving away from their old domestic industry of cloth production and replacing food crops with cotton. Many of them turned into sharecroppers, highly indebted to local merchants. This model also travelled to the American South in the wake of the Civil War, when freedpeople’s efforts to gain access to land failed just as much as the efforts of landowners to hire them as wage workers. As a result, in Alabama and Georgia, South Carolina and Mississippi, formerly enslaved cotton growers became sharecroppers and tenant farmers. Railroads pushed ever further into the American cotton-growing countryside, bringing with them a new generation of merchants and European and North American capital. So-called “Black codes” and new laws regulating advances to sharecroppers attached freedpeople and, increasingly, white yeoman farmers to the global cotton empire.

Slavery might have been at the center of the European cotton industry for three generations, but by the last third of the 19th century the new strength of European and North American capital and state power (with its vast infrastructural, administrative, military and scientific might) paved the way for other forms of labor mobilization—solving what was, from the perspective of The Economist, one of the core problems the world faced at the end of the American Civil War: “It is clear that the dark races must in some way or other be induced to obey white men willingly.”

So successful was the transition of slave labor into sharecropping and tenant farming during and after the war that cotton production actually expanded dramatically. By 1870, American cotton farmers surpassed their previous harvest high, set in 1860. By 1877, they regained and surpassed their pre-war market share in Great Britain. By 1880 they exported more cotton than they had in 1860. And, by 1891, sharecroppers, family farmers and plantation owners in the United States were growing twice as much cotton as in 1861.

As nation states became more central to the global cotton industry, and as the cotton industry remained important to European economies, European states increasingly also tried to capture and politically control their own cotton-growing territories. With the United States now an important—and eventually the most important—industrial power in the world, Europeans wanted to follow the United States model and control cotton growing territories of their own. Pushed by manufacturers concerned about the security of their cotton supply, European colonial powers embarked upon new cotton-growing projects. No one did so more successfully than Russia, which by 1900 already secured a significant share of its cotton needs from its colonial territories in Central Asia. The Germans followed suit in their western African colony of Togo; the British in Egypt, India and throughout Africa; and the French, Belgians and Portuguese in their respective African colonies. Even the Japanese built a small cotton-growing complex in their colony, Korea.

Along with this expansion of cotton agriculture, a new wave of violence descended upon large swaths of the global countryside, as colonial powers forced peasants to grow cotton for export. As late as the 1970s in Mozambique, a former Portuguese colony, the word cotton still evoked, according to two historians, “an almost automatic response: suffering.” Slavery may have disappeared from the empire of cotton, but violence and coercion continued. Moreover, the post-war reconstruction of the global cotton-growing countryside provided ever increasing quantities of ever cheaper cotton to industry, but at the same time created huge new risks for rural cultivators, as plunging prices and political repression brought extreme poverty. In India, in the late 19th century, millions of cotton growers starved to death because the crops they grew could not pay for the food they needed. The British medical journal The Lancet estimated that 19 million Indians died in the famines of the late 1890s, most of them cotton growers.

The American Civil War thus marked one of the most important turning points in the history of global capitalism. The last politically powerful group of cotton growers—the planters of the American South—were now marginalized in the global economy, a global economy newly dominated by its industrial actors. More importantly, slavery, which had been so central to the first 80 years of the expansion of a mechanized cotton growing industry—and thus to global capitalism—had ended. New ways of mobilizing the labor of rural cotton-growing cultivators—in the United States and elsewhere—had emerged. War capitalism’s core features—the violent appropriation of the labor of African slaves, the violent expropriation of territories in the Americas by frontier settlers and the violent domination of global trade by armed entrepreneurs—had been replaced by a new world in which states structured sharecropping regimes and wage labor, built infrastructures and penetrated new territories administratively, judicially and militarily. This industrial capitalism contained within itself the violent legacy of war capitalism, and was all too frequently characterized by significant degrees of coercion. Still, it was a fundamentally new moment in capitalism’s long history.

And while today the world’s cotton growing countryside has changed once more, it is still often characterized by extreme poverty, political repression and a powerful presence of the state. In many years, huge government subsidies keep American and European producers in business, while a semi-military unit of the Chinese People’s Liberation Army is perhaps the single most important producer of cotton in the world today. Children still are forced to harvest cotton in some parts of the world. Extreme poverty characterizes the cotton growing areas of western Africa. As many as 110 million households are involved in the growing of cotton worldwide, testifying to the continued importance of the countryside and of agriculture to global capitalism.

As this episode from the endlessly fascinating global history of cotton shows, the significance of the American Civil War went well beyond the borders of the United States, and indeed, can only be fully understood from a global vantage point. And the same applies to the history of capitalism. Only a global perspective allows us to understand how this vastly productive and often violent new system of economic activity came into being—and only a global perspective allows us to understand the origins of the modern world we live in.

Read Full Post »

The Real Irish American Story Not Taught in Schools 

By Bill Bigelow

Zinn Education Project  March 16, 2015

The Irish Famine, 1850 by George Frederic Watts. Source: Views of the Famine.

“Wear green on St. Patrick’s Day or get pinched.” That pretty much sums up the Irish-American “curriculum” that I learned when I was in school. Yes, I recall a nod to the so-called Potato Famine, but it was mentioned only in passing.

Sadly, today’s high school textbooks continue to largely ignore the famine, despite the fact that it was responsible for unimaginable suffering and the deaths of more than a million Irish peasants, and that it triggered the greatest wave of Irish immigration in U.S. history. Nor do textbooks make any attempt to help students link famines past and present.

Yet there is no shortage of material that can bring these dramatic events to life in the classroom. In my own high school social studies classes, I begin with Sinead O’Connor’s haunting rendition of “Skibbereen,” which includes the verse:

… Oh it’s well I do remember, that bleak December day,
The landlord and the sheriff came, to drive us all away
They set my roof on fire, with their cursed English spleen
And that’s another reason why I left old Skibbereen.

By contrast, Holt McDougal’s U.S. history textbook The Americans devotes a flat two sentences to “The Great Potato Famine.” Prentice Hall’s America: Pathways to the Present fails to offer a single quote from the time. The text calls the famine a “horrible disaster,” as if it were a natural calamity like an earthquake. And in an awful single paragraph, Houghton Mifflin’s The Enduring Vision: A History of the American People blames the “ravages of famine” simply on “a blight,” and the only contemporaneous quote comes, inappropriately, from a landlord, who describes the surviving tenants as “famished and ghastly skeletons.” Uniformly, social studies textbooks fail to allow the Irish to speak for themselves, to narrate their own horror.

These timid slivers of knowledge not only deprive students of rich lessons in Irish-American history, they exemplify much of what is wrong with today’s curricular reliance on corporate-produced textbooks.

To support the famine relief effort, British tax policy required landlords to pay the local taxes of their poorest tenant farmers, leading many landlords to forcibly evict struggling farmers and destroy their cottages in order to save money. From Hunger on Trial Teaching Activity.

First, does anyone really think that students will remember anything from the books’ dull and lifeless paragraphs? Today’s textbooks contain no stories of actual people. We meet no one, learn nothing of anyone’s life, encounter no injustice, no resistance. This is a curriculum bound for boredom. As someone who spent almost 30 years teaching high school social studies, I can testify that students will be unlikely to seek to learn more about events so emptied of drama, emotion, and humanity.

Nor do these texts raise any critical questions for students to consider. For example, it’s important for students to learn that the crop failure in Ireland affected only the potato—during the worst famine years, other food production was robust. Michael Pollan notes in The Botany of Desire, “Ireland’s was surely the biggest experiment in monoculture ever attempted and surely the most convincing proof of its folly.” But if only this one variety of potato, the Lumper, failed, and other crops thrived, why did people starve?

Paddy’s Lament recounts the famine and the Irish diaspora to America.

Thomas Gallagher points out in Paddy’s Lament that, during the first winter of famine, 1846-47, as perhaps 400,000 Irish peasants starved, landlords exported 17 million pounds sterling worth of grain, cattle, pigs, flour, eggs, and poultry—food that could have prevented those deaths. Throughout the famine, as Gallagher notes, there was an abundance of food produced in Ireland, yet the landlords exported it to markets abroad.

The school curriculum could and should ask students to reflect on the contradiction of starvation amidst plenty, on the ethics of food exports amidst famine. And it should ask why these patterns persist into our own time.

More than a century and a half after the “Great Famine,” we live with similar, perhaps even more glaring contradictions. Raj Patel opens his book, Stuffed and Starved: Markets, Power and the Hidden Battle for the World’s Food System: “Today, when we produce more food than ever before, more than one in ten people on Earth are hungry. The hunger of 800 million happens at the same time as another historical first: that they are outnumbered by the one billion people on this planet who are overweight.”

Patel’s book sets out to account for “the rot at the core of the modern food system.” This is a curricular journey that our students should also be on — reflecting on patterns of poverty, power, and inequality that stretch from 19th-century Ireland to 21st-century Africa, India, Appalachia, and Oakland; that explore what happens when food and land are regarded purely as commodities in a global system of profit.

But today’s corporate textbook-producers are no more interested in feeding student curiosity about this inequality than were British landlords interested in feeding Irish peasants. Take Pearson, the global publishing giant. At its website, the corporation announces (redundantly) that “we measure our progress against three key measures: earnings, cash and return on invested capital.” The Pearson empire had 2011 worldwide sales of more than $9 billion—that’s nine thousand million dollars, as I might tell my students. Multinationals like Pearson have no interest in promoting critical thinking about an economic system whose profit-first premises they embrace with gusto.

Hunger on Trial teaching activity available online.

As mentioned, there is no absence of teaching materials on the Irish famine that can touch head and heart. In a role play, “Hunger on Trial,” that I wrote and taught to my own students in Portland, Oregon—included at the Zinn Education Project website—students investigate who or what was responsible for the famine. The British landlords, who demanded rent from the starving poor and exported other food crops? The British government, which allowed these food exports and offered scant aid to Irish peasants? The Anglican Church, which failed to denounce selfish landlords or to act on behalf of the poor? A system of distribution, which sacrificed Irish peasants to the logic of colonialism and the capitalist market?

These are rich and troubling ethical questions. They are exactly the kind of issues that fire students to life and allow them to see that history is not simply a chronology of dead facts stretching through time.

So go ahead: Have a Guinness, wear a bit of green, and put on the Chieftains. But let’s honor the Irish with our curiosity. Let’s make sure that our schools show some respect, by studying the social forces that starved and uprooted over a million Irish—and that are starving and uprooting people today.

_________________________________________________________________

Bill Bigelow taught high school social studies in Portland, Ore. for almost 30 years. He is the curriculum editor of Rethinking Schools magazine and co-director of the online Zinn Education Project, www.zinnedproject.org. This project, inspired by the work of historian Howard Zinn, offers free materials to teach a fuller “people’s history” than is found in commercial textbooks. Bigelow is author or co-editor of numerous books, including A People’s History for the Classroom and The Line Between Us: Teaching About the Border and Mexican Immigration.

Read Full Post »

Could the South Have Won the War?

By March 1865, it was obvious to all but the most die-hard Confederates that the South was going to lose the war. Whether that loss was inevitable is an unanswerable question, but considering various “what if” scenarios has long been a popular exercise among historians, novelists and Civil War buffs.

To explore that question, historians often use a concept known as contingency: During the war, one action led to a particular outcome, but if a different action had been taken it would have led to a different outcome. The problem with each scenario, though, is that although superficially persuasive, it collapses under the weight of contradictory facts.

Perhaps the most common scenario centers on the actions of Gen. Robert E. Lee. Some modern historians have attributed the Confederate defeat to Lee’s aggressiveness, implying that, if he had adopted a more defensive strategy, or even carried out guerrilla warfare after Appomattox, perhaps Lee could have held the North at bay until it tired of the conflict and sought a negotiated settlement.

But was this really possible considering the expectations of the Confederate people? Southerners were convinced they were superior soldiers and expected their armies to defeat the enemy on the battlefield. Politically, Lee could not have adopted a purely defensive strategy because the people would not have stood for it. Nor was guerrilla warfare an option. Events in Missouri, Tennessee and other areas where guerrillas operated during the war clearly showed how such brutal warfare devastated entire regions and broke down morale. There simply would not have been enough popular support to sustain such a strategy for long.

Some argue that the Confederates could have won if they had held Atlanta, Mobile, Ala., and the Shenandoah Valley beyond the 1864 election. Northern voters, dispirited by the stalemate, would have elected George B. McClellan president, and he would have bowed to the Democratic Party’s peace faction and opened negotiations with the Confederates.

Such speculation, however, is not supported by historical fact. In his letter accepting the Democratic nomination, McClellan clearly rejected the peace plank. There seems little doubt McClellan would have continued to fight if he became president, and the Union would still have eventually won. Also, a defeated Lincoln would have had four months left in office to achieve victory by launching winter campaigns. As it turned out, Gen. Ulysses S. Grant forced Lee to surrender just over one month after the inauguration. If a lame-duck Lincoln had adopted a more aggressive policy, Grant probably would have forced an Appomattox-like surrender before McClellan ever took office.

Gen. Robert E. Lee surrendering to Lt. Gen. Ulysses S. Grant at Appomattox Courthouse, April 9, 1865. Credit: Library of Congress


Confederate defeat has also been blamed on King Cotton diplomacy. If the Confederates had sent as much cotton as possible to Europe before the blockade became effective, instead of hoarding it to create a shortage, they could have established lines of credit to purchase war material. This argument is true, but it misses the point. While the Confederates did suffer severe shortages by mid-war, they never lost a battle because of a lack of guns, ammunition or other supplies. They did lose battles because of a lack of men, and a broken-down railway system made it difficult to move troops and materials to critical points. Cotton diplomacy would not have increased the size of the rebel armies, and an increasingly effective Union blockade would have prevented the importation of railroad iron and other supplies no matter how much credit the Confederates accumulated overseas.

Another diplomatic “what if” concerns European intervention. In the fall of 1862, Britain and France were prepared to extend diplomatic recognition to the Confederacy and offer to mediate a peace, but they backed away when the Union won the Battle of Antietam. In this scenario, if Lee had won the battle, Britain and France would have recognized the Confederacy and secured a peace ensuring Southern independence.

In reality, there is little likelihood the Europeans would have become involved in the war. They had already extended belligerent status to the Confederacy, which allowed it to purchase supplies and use European ports. Diplomatic recognition would have enhanced the Southerners’ prestige — but it would not have materially affected their ability to wage war.

And if the British had offered to mediate a peace, Lincoln certainly would have rebuffed them. Then what? It’s unlikely Britain would have rushed to the Confederates’ aid by breaking the blockade and provoking a war with the Union. By late 1862, emancipation had become a Union goal, and the abolitionist British people would never have supported their government becoming militarily involved to defend slavery. British officials also had not forgotten that American privateers devastated their merchant fleet in the War of 1812. And there was no economic incentive for Britain to become a Confederate ally, because the cotton shortage created by the blockade was soon alleviated by cotton from Egypt and India — and the trade Britain conducted with the Union far outweighed the value of Southern cotton.

Some historians have blamed the Confederate defeat on its strict adherence to states’ rights and a failure to develop a strong sense of nationalism. If the Southern people had been more successful in forming a national identity, Jefferson Davis could have nationalized the railroads and industry, and the governors would have cooperated more with Richmond. A powerful central government and a stronger sense of national identity would also have helped sustain morale when the war began to go badly. Instead, the Southerners’ belief in states’ rights kept the governors at odds with the central government, and the breakdown in civilian morale weakened the army by causing more soldiers to desert.

But that assessment underestimates what the South managed to accomplish. Rather than blaming the Confederates’ defeat on a lack of nationalism, one should marvel that they maintained their government as long as they did. From scratch, Southerners created a functioning constitutional government and a formidable military that included 80 percent of the eligible white males. The Confederates quickly developed a sense of nationalism in the first year of war because they believed they had no choice but either to form a separate nation or to face complete ruin. The string of victories in Virginia in 1861 and 1862 only increased this national pride. Even when the war began to go badly and the enemy occupied large sections of the Confederacy, most Southern whites were determined to fight on because they knew their homes would be the next to feel the invaders’ wrath if they did not.

Slavery and racial views also played an important role in Confederate nationalism. When the Emancipation Proclamation was issued, Southern whites’ resolve strengthened because they realized that if they lost the war, the very cornerstone of their society would be destroyed. The sight of black soldiers deep in the Confederate heartland outraged Southern whites, but in the war’s last year those same Southerners were willing to enlist slaves to fight on their side. Confederate emancipation would have been unthinkable earlier in the conflict, but by 1865 many Southerners supported recruiting slaves as a way to strengthen the army and win European recognition. To achieve independence, they were willing to sacrifice the very thing they went to war to protect.

There are notable examples in history where a weaker people defeated a stronger one. The American Revolution and the Vietnam War immediately come to mind, but the Americans and North Vietnamese had the military backing of the superpowers France and the Soviet Union, respectively. In virtually all cases where a weaker people have prevailed, they had a greater determination to win and were willing to fight for years and suffer horrendous casualties to wear down the enemy.

The Confederacy had no such backing, and a credible argument can be made that its defeat was inevitable from the beginning. What many fail to recognize is that Northerners were just as committed to winning as the Southerners. Some saw it as a war to free the slaves, while others fought to ensure that their republican form of government survived. Northerners believed that America was the world’s last great hope for democracy, and if the South destroyed the Union by force, that light of liberty might be extinguished forever. Lincoln once said the North must prove “that popular government is not an absurdity. We must settle this question now, whether in a free government the minority have the right to break up the government whenever they choose. If we fail it will go far to prove the incapability of the people to govern themselves.”

The South may have been fighting to preserve a way of life and to protect its perceived constitutional rights, but so was the North. If the Southern people kept fighting even after the devastating defeats at Gettysburg, Vicksburg and Chattanooga, why should we not believe the North would have kept on fighting even if the Confederates had won Gettysburg, Vicksburg and Chattanooga? The fact is that both sides were equally brave and equally dedicated to their cause. Commitment and morale being the same, the stronger side prevailed.


Sources: Terry L. Jones, “The American Civil War.”


Terry L. Jones is a professor of history at the University of Louisiana at Monroe and the author of several books on the Civil War.

Read Full Post »


American Hegemony or American Primacy?

Joseph S. Nye

Project Syndicate      March 9, 2015

CAMBRIDGE – No country in modern history has possessed as much global military power as the United States. Yet some analysts now argue that the US is following in the footsteps of the United Kingdom, the last global hegemon to decline. This historical analogy, though increasingly popular, is misleading.

Britain was never as dominant as the US is today. To be sure, it maintained a navy equal in size to the next two fleets combined, and its empire, on which the sun never set, ruled over a quarter of humankind. But there were major differences in the relative power resources of imperial Britain and contemporary America. By the outbreak of World War I, Britain ranked only fourth among the great powers in terms of military personnel, fourth in terms of GDP, and third in military spending.

The British Empire was ruled in large part through reliance on local troops. Of the 8.6 million British forces in WWI, nearly a third came from the overseas empire. That made it increasingly difficult for the government in London to declare war on behalf of the empire when nationalist sentiments began to intensify.

By World War II, protecting the empire had become more of a burden than an asset. The fact that the UK was situated so close to powers like Germany and Russia made matters even more challenging.

For all the loose talk of an “American empire,” the fact is that the US does not have colonies that it must administer, and thus has more freedom to maneuver than the UK did. And, surrounded by unthreatening countries and two oceans, it finds it far easier to protect itself.

That brings us to another problem with the global hegemon analogy: the confusion over what “hegemony” actually means. Some observers conflate the concept with imperialism; but the US is clear evidence that a hegemon does not have to have a formal empire. Others define hegemony as the ability to set the rules of the international system; but precisely how much influence over this process a hegemon must have, relative to other powers, remains unclear.

Still others consider hegemony to be synonymous with control of the most power resources. But, by this definition, nineteenth-century Britain – which at the height of its power in 1870 ranked third (behind the US and Russia) in GDP and third (behind Russia and France) in military expenditures – could not be considered hegemonic, despite its naval dominance.

Similarly, those who speak of American hegemony after 1945 fail to note that the Soviet Union balanced US military power for more than four decades. Though the US had disproportionate economic clout, its room for political and military maneuver was constrained by Soviet power.

Some analysts describe the post-1945 period as a US-led hierarchical order with liberal characteristics, in which the US provided public goods while operating within a loose system of multilateral rules and institutions that gave weaker states a say. They point out that it may be rational for many countries to preserve this institutional framework, even if American power resources decline. In this sense, the US-led international order could outlive America’s primacy in power resources, though many others argue that the emergence of new powers portends this order’s demise.

But, when it comes to the era of supposed US hegemony, there has always been a lot of fiction mixed in with the facts. It was less a global order than a group of like-minded countries, largely in the Americas and Western Europe, which comprised less than half of the world. And its effects on non-members – including significant powers like China, India, Indonesia, and the Soviet bloc – were not always benign. Given this, the US position in the world could more accurately be called a “half-hegemony.”

Of course, America did maintain economic dominance after 1945: the devastation of WWII in so many countries meant that the US produced nearly half of global GDP. That position lasted until 1970, when the US share of global GDP fell to its pre-war level of one-quarter. But, from a political or military standpoint, the world was bipolar, with the Soviet Union balancing America’s power. Indeed, during this period, the US often could not defend its interests: the Soviet Union acquired nuclear weapons; communist takeovers occurred in China, Cuba, and half of Vietnam; the Korean War ended in a stalemate; and revolts in Hungary and Czechoslovakia were repressed.

Against this background, “primacy” seems like a more accurate description of a country’s disproportionate (and measurable) share of all three kinds of power resources: military, economic, and soft. The question now is whether the era of US primacy is coming to an end.

Given the unpredictability of global developments, it is, of course, impossible to answer this question definitively. The rise of transnational forces and non-state actors, not to mention emerging powers like China, suggests that there are big changes on the horizon. But there is still reason to believe that, at least in the first half of this century, the US will retain its primacy in power resources and continue to play the central role in the global balance of power.

In short, while the era of US primacy is not over, it is set to change in important ways. Whether or not these changes will bolster global security and prosperity remains to be seen.


Joseph S. Nye, Jr., a former US assistant secretary of defense and chairman of the US National Intelligence Council, is University Professor at Harvard University and a member of the World Economic Forum Global Agenda Council on the Future of Government. He is the author, most recently, of Is the American Century Over?

Read Full Post »

The Historical Roots of Fraternity Racism

HNN   March 10, 2015

The response of the University of Oklahoma to the tape of its campus Sigma Alpha Epsilon (SAE) members cheerfully singing a vile racist chant has been one of shock and outrage. This has been reflected in public rallies on campus condemning the ugly racism on that tape and in the swift and decisive action of university president David Boren in denouncing and closing down the offending fraternity. But while my initial reaction too was outrage at this racist display, as a historian who has researched segregationist student activism in the 1950s and 1960s South, my next thought was of how eerily familiar the fraternity behavior on that tape seemed to be.

The tape shows cheerful, white, well-dressed frat boys repeatedly singing “You can hang him from a tree, but he’ll never sign with me. There will never be a nigger SAE.” What astonished me was how reminiscent this chant by Oklahoma fraternity members in 2015 was of the chant of segregationist fraternity members at the University of Georgia in January 1961. Though separated by more than a half century, in both cases a lynching reference was combined with the chanting of a pledge to keep the segregationist fraternity tradition intact.

The only difference between the racist chants in 2015 and 1961 that I can discern is that the fraternities today seem more inclined to do their chanting in private. At Oklahoma this semester the chant came in what started out as a private fraternity setting (a bus apparently transporting fraternity members from some fraternity-related event). The privacy was, of course, violated by the leaking of the tape of the chant, but clearly the chant was not designed for public consumption. The Georgia chant, on the other hand, was made in public, at a segregationist rally at the campus’s historic archway entrance in January 1961, at the height of the university’s integration crisis. Some 150-200 Georgia students had just hung a blackface effigy of Hamilton Holmes, who, along with Charlayne Hunter, had that month become one of the first two African American students to attend the historically segregated University of Georgia. The white students first serenaded the effigy with choruses of “Dixie” and then sang “There’ll never be a nigger in the ________ fraternity house,” inserting the names of their various fraternities. Clearly, UGA students in 1961, operating in a historically segregated university and a segregated college town (Athens, Georgia), did not feel the pressure that their 21st-century fraternity counterparts at racially integrated campuses do to keep their racist displays to themselves. But if the venue was different, the racist sentiment and mode of expression were virtually identical.

The coupling of lynching metaphors with the chanting of a segregationist pledge «never» to integrate is not accidental. Lynching symbolizes black powerlessness, while white pledges to sustain segregation permanently evoke the power and endurance of white supremacy. The implication seems to be that even if the university integrates the fraternity will remain an outpost of white supremacy and racial exclusion.

The similarity between the racist fraternity chants in these two centuries raises questions about fraternity history and culture that should be of as much interest to university presidents as to historians. It suggests that the responsibility for the ugly racist chant at the Oklahoma SAE rests not merely with the individuals who sang on that bus but with the larger fraternity culture. How, we ought to ask, is it possible that these racist chants have endured for generations? Who is it that preserves such racist traditions and transmits them to each new college generation? This seems a clear case of cultural preservation, transmission, and reproduction. And it is all the more striking because this hoary racist tradition attracts adherents and admirers well into our century, when modern science and social science have long since refuted white supremacist assumptions. Finally, we need to ask why, even on racially integrated campuses such as Oklahoma, fraternities remain so racially exclusive that such vintage segregationist chants can be sung so shamelessly. The historical roots of this racist fraternity tradition and the political, cultural and demographic props that sustain it must be understood and confronted honestly if the ghost of Jim Crow is ever to be banished from frat row.

Robert Cohen is a professor of Social Studies and History at New York University, co-editor with David J. Snyder of «Rebellion in Black and White: Southern Student Activism in the 1960s» and editor of «The Essential Mario Savio: Speeches and Writings That Changed America.»

Read Full Post »
