
Archive for 2014

Why the Founding Fathers thought banning Torture Foundational to the US Constitution

By Juan Cole

Informed Comment  December 9, 2014

I have argued on many occasions that the language of patriotism and appeal to the Founding Fathers and the Constitution must not be allowed to be appropriated by the political right wing in contemporary America, since for the most part right-wing principles (privileging religion, exaltation of ‘whiteness’ over universal humanity, and preference for property rights over human rights) are diametrically opposed to the Enlightenment and Deist values of most of the framers of the United States.

We will likely hear these false appeals to an imaginary history a great deal with the release of the Senate report on CIA torture. It seems to me self-evident that most of the members of the Constitutional Convention would have voted to release the report and also would have been completely appalled at its contents.

The Bill of Rights of the US Constitution is full of prohibitions on torture, as part of a general 18th century Enlightenment turn against the practice. The French Encyclopedia and its authors had agitated in this direction.

Two types of torture were common during the lifetimes of the Founding Fathers. In France, the judiciary typically had arrestees tortured to make them confess their crime. This way of proceeding rather tilted the scales in the direction of conviction, but against justice. Pre-trial torture was abolished in France in 1780. But torture was still used after the conviction of the accused to make him identify his accomplices.

Thomas Jefferson excitedly wrote back to John Jay from Paris in 1788:

“On the 8th, a bed of justice was held at Versailles, wherein were enregistered the six ordinances which had been passed in Council, on the 1st of May, and which I now send you. . . . By these ordinances, 1, the criminal law is reformed . . . by substitution of an oath, instead of torture on the question préalable , which is used after condemnation, to make the prisoner discover his accomplices; (the torture abolished in 1780, was on the question préparatoire, previous to judgment, in order to make the prisoner accuse himself;) by allowing counsel to the prisoner for this defence; obligating the judges to specify in their judgments the offence for which he is condemned; and respiting execution a month, except in the case of sedition. This reformation is unquestionably good and within the ordinary legislative powers of the crown. That it should remain to be made at this day, proves that the monarch is the last person in his kingdom, who yields to the progress of philanthropy and civilization.”

Jefferson did not approve of torture of either sort.

The torture deployed by the US government in the Bush-Cheney era resembles what the French called the “question préalable”: detainees were being asked to reveal accomplices and any further plots those accomplices might be planning. The French crown would have argued before 1788 that for reasons of public security it was desirable to make the convicted criminal reveal his associates in crime, just as Bush-Cheney argued that the al-Qaeda murderers must be tortured into giving up confederates. But Jefferson was unpersuaded by such an argument. In fact, he felt that the king had gone on making it long past the time when rational persons were persuaded by it.

Bush-Cheney, in fact, look much more like pre-Enlightenment absolute monarchs in their theory of government. Louis XIV may not have said “I am the state,” but his prerogatives were vast, including arbitrary imprisonment and torture. Bush-Cheney, our very own sun kings, connived at creating a class of human beings to whom they could do as they pleased.

When the 5th amendment says of the accused person “nor shall be compelled in any criminal case to be a witness against himself” the word “compelled” is referring to the previous practice of judicial torture of the accused. Accused persons who “take the fifth” are thus exercising a right not to be tortured by the government into confessing to something they may or may not have done.

Likewise, the 8th Amendment (“Excessive bail shall not be required, nor excessive fines imposed, nor cruel and unusual punishments inflicted”) is intended to forbid post-sentencing torture.

The 8th Amendment was pushed for by Patrick Henry and George Mason precisely because they were afraid that the English move away from torture might be reversed by a Federal government that ruled in the manner of continental governments.

Patrick Henry wrote,

“What has distinguished our ancestors?–That they would not admit of tortures, or cruel and barbarous punishment. But Congress may introduce the practice of the civil law, in preference to that of the common law. They may introduce the practice of France, Spain, and Germany.”

It was objected in the debate over the Bill of Rights that it could be ignored. George Mason thought that was a stupid reason not to enact it:

“Mr. Nicholas: . . . But the gentleman says that, by this Constitution, they have power to make laws to define crimes and prescribe punishments; and that, consequently, we are not free from torture. . . . If we had no security against torture but our declaration of rights, we might be tortured to-morrow; for it has been repeatedly infringed and disregarded.

Mr. George Mason replied that the worthy gentleman was mistaken in his assertion that the bill of rights did not prohibit torture; for that one clause expressly provided that no man can give evidence against himself; and that the worthy gentleman must know that, in those countries where torture is used, evidence was extorted from the criminal himself. Another clause of the bill of rights provided that no cruel and unusual punishments shall be inflicted; therefore, torture was included in the prohibition.”

It was the insistence of Founding Fathers such as George Mason and Patrick Henry that resulted in the Bill of Rights being passed to constrain the otherwise absolute power of the Federal government. And one of their primary concerns was to abolish torture.

The 5th and the 8th Amendments thus together forbid torture on the “question préparatoire” (pre-trial confession under duress) and the “question préalable” (post-conviction torture).

That the Founding Fathers were against torture is not in question.

Fascists (that is what they are) who support torture will cavil. Is waterboarding torture? Is threatening to sodomize a man with a broomstick torture? Is menacing a prisoner with a pistol torture?

Patrick Henry’s discourse makes all this clear. He was concerned about the government doing anything to detract from the dignity of the English commoner, who had defied the Norman yoke and gained the right not to be coerced through pain into relinquishing liberties.

Fascists will argue that the Constitution does not apply to captured foreign prisoners of war, or that the prisoners were not even P.O.W.s, having been captured out of uniform.

But focusing on the category of the prisoner is contrary to the spirit of the founding fathers. Their question was, ‘what are the prerogatives of the state?’ And their answer was that the state does not have the prerogative to torture. It may not torture anyone, even a convicted murderer.

The framers of the Geneva Convention (to which the US is signatory) were, moreover, determined that all prisoners fall under some provision of international law. René Värk argues:

“the commentary to Article 45 (3) asserts that ‘a person of enemy nationality who is not entitled to prisoner-of-war status is, in principle, a civilian protected by the Fourth Convention, so that there are no gaps in protection’.*32 But, at the same time, it also observes that things are not always so straightforward in armed conflicts; for example, adversaries can have the same nationality, which renders the application of the Fourth Convention impossible, and there can arise numerous difficulties regarding the application of that convention. Thus, as the Fourth Convention is a safety net to persons who do not qualify for protection under the other three Geneva Conventions, Article 45 (3) serves yet again as a safety net for those who do not benefit from more favourable treatment in accordance with the Fourth Convention.”

Those who wish to create a category of persons who may be treated by the government with impunity are behaving as fascists like Franco did in the 1930s, who also typically created classes of persons to whom legal guarantees did not apply.

But if our discussion focuses on the Founding Fathers, it isn’t even necessary to look so closely at the Geneva Conventions.

Thomas Jefferson wrote in the Declaration of Independence, “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.”

The phrase “all men” means all persons of any nationality.

We know what the Founding Fathers believed. They believed in universal rights. And they believed in basic principles of human dignity. Above all, they did not think the government had the prerogative of behaving as it pleased. It doesn’t have the prerogative to torture.

What General Holtzclaw Saw
By THOM BASSETT

The New York Times   December 15, 2014 

Viewed as a matter of tactics, the assault by the 13th United States Colored Troops against Confederates holding Overton Hill late on Dec. 16, 1864, contributed nothing to the Union victory at the Battle of Nashville. The Southerners easily repulsed the charge in a handful of minutes, leaving the regiment shot to pieces, its dead and wounded scattered across the muddy ground. In one sense, it was just another bloody and fruitless assay against strong defensive works, something that happened thousands of times in the Civil War.

But something else, virtually unheard of in the war, also occurred as a result of this engagement. The African-American soldiers who charged Overton Hill not only earned the awed respect of white Union troops who witnessed their efforts; they also garnered heartfelt praise from an opposing Confederate general in his official report.

The first day of the Battle of Nashville, which commenced the final act of a desperate rebel campaign into Middle Tennessee, ended with Confederate Lt. Gen. John Bell Hood’s Army of Tennessee swept from its positions south of the city by a tide of Union forces. Rather than retreating completely from the field, though, Hood devoted the night of Dec. 15 to re-establishing a shortened defensive line across a range of hills southeast of Nashville. The right of Hood’s new emplacements was manned by Lt. Gen. Stephen D. Lee’s corps, whose line was refused (bent back) at the extreme right of its position atop Overton Hill (also known as Peach Orchard Hill).

The Battle of Nashville
Credit Library of Congress

Overton Hill was a strong anchoring point for Hood. It stood 300 feet high, with steep slopes. Federal troops hoping to drive off the Confederates would have to advance across open ground until they reached the hill, which was thick with underbrush and trees. About halfway up the hill the Confederates had downed trees to create further obstacles. Near the top they’d built breastworks with densely bunched abatis in front of them.

The Union Army failed to move quickly on the morning of Dec. 16 to exploit its victory the day before. By early afternoon, though, Maj. Gen. Thomas J. Wood saw an opportunity as he arrayed his forces to the north and east of Overton Hill. It would be difficult to take the hill, but if it could be achieved the Federals could simultaneously cut off Hood’s only line of retreat and attack him from the rear. “The capture of half of the Rebel army would almost certainly [follow],” Wood wrote later, and so “the prize at stake was worth the hazard.”

By the time Wood was ready to attack, the unseasonably warm day had turned cold, and pelting rain fell on his men as they aligned themselves. Wood’s hastily improvised plan called for several regiments to attack simultaneously from different angles to offset the advantages of the Confederate position. Beginning around 2:45 p.m., Union forces slogged through a plowed field turned by the weather to heavy mud and then struggled up the hill.

The attack immediately became a Union disaster. Double canister shot from the hilltop tore into the advancing lines even before they reached the base of the hill. Some units misaligned as they moved forward and were enfiladed by the Confederates. Once the Northerners reached the lines of downed trees up the hill the advance was slowed even further, exposing them to murderous musket fire. The rebel defenses were so entangling, one Union officer reflected later, that his men were like flies caught in a spider’s web.

Wood’s forces soon retreated down the hill and across the churned field in wild disorder, with units dissolving in the destruction and panic. There were over a thousand Federal casualties, almost a third of the total suffered by the Union Army in the entire two-day battle. “I have seen most of the battlefields of the West,” one Confederate would recall, “but never saw dead men thicker than in front of my [forces].”

At this point, with numerous Union regiments mauled and repulsed, the turn of the 13th U.S.C.T. was about to come. The regiment, which at Nashville consisted of merely 556 men and 20 officers, was created in September 1863. Its ranks were filled mainly with ex-slaves freed when Union forces occupied northern Tennessee in 1862. The 13th had principally guarded the rail lines that were gradually extended from the Nashville area to the south and east across the state. (These lines eventually would form the supply and communication basis Gen. William Tecumseh Sherman relied on to prepare for his assault on Georgia and the Carolinas.) On several occasions the unit defended the lines from rebel marauders. In late 1864 it was combined with two other black regiments to create the Second Colored Brigade and assigned to George H. Thomas’s Army of the Cumberland.

The Battle of Nashville was the 13th’s only full-scale combat of the war. As Confederate defenders were occupied by the first waves of the attack, the 13th moved into a position protected by brush at the base of the hill. After the main Union attack melted away under the blaze of the rebel rifles and cannon, the 13th rose with a thundering roar and, to the amazement of Federal and Confederate witnesses alike, ran for the summit of Overton Hill.

It was, as one commentator has put it, “a charge into hell itself.” The African-Americans, untried in fighting like this, assaulted a strongly defended hill in the face of fire, one Ohioan remembered, “such as veterans dread.” The 13th, unlike the other Union attackers, received no artillery support. Since the rest of the Union forces were broken up and scattered, the 13th was attacking by itself an entire corps placed securely behind breastworks and other defenses. As the former slaves and their white officers somehow came on and on, closer and closer to the top, the entire rebel line concentrated its slaughtering fire on them.

In minutes it was over. The regimental flag was lost to a Confederate trophy hunter after five color bearers had been shot down in succession. They were joined by 220 or so other officers and men — nearly 40 percent of the regiment’s fighting strength — who were killed or wounded in the brief attack.

Those who saw it were in awe of the 13th’s bravery. One Union officer wrote: “I never saw more heroic conduct showed on the field of battle than was exhibited by this body of men so recently slaves.” After the Confederate Army was routed by Union attacks that came from the other side of Hood’s line, a Yankee surgeon went to look where the 13th had attacked. “Don’t tell me negroes won’t fight!” he declared in a letter home. “I know better.”

Someone else who now knew better was the Confederate Brig. Gen. James T. Holtzclaw, whose men formed part of the defense on Overton Hill. In a January 1865 official report the Alabamian bore witness to the qualities displayed by the African-American soldiers. He wrote that they “made a most determined charge” and “gallantly dashed up to the abatis” only to be “killed by hundreds.” Holtzclaw also singled out the 13th’s officers for praise: “I noticed as many as three mounted who fell far in advance of their commands urging them forward.”

That Holtzclaw wrote these lines is remarkable, given the prevailing white Southern attitudes and practices toward African-American Union soldiers. Captured members of the U.S.C.T. were officially regarded by the Confederates as fomenting slave rebellion and faced death as punishment. At Fort Pillow the previous April, they were slaughtered en masse after Gen. Nathan Bedford Forrest’s men captured them. African-American soldiers could also be sold into slavery or forced to work alongside slaves in support of the Confederate war effort.

Oddly, Union soldiers on the morning of Dec. 16 discovered that white comrades slain the day before had been stripped naked overnight by desperate Confederates in need of food, supplies and clothing, while the African-American dead were left untouched. Southern infantrymen, it seems, would rather go barefoot in winter than wear the shoes of a black man and go hungry instead of eating his hardtack.

There’s no evidence that Holtzclaw’s officially expressed admiration for the 13th U.S.C.T. reflected any racial progressivism on his part. The better explanation is this: A Southern general who fought in defense of a society built on slavery nevertheless saw, if only for a moment and dimly through the smoke and chaos of battle, the undeniable humanity of the men who charged his lines, willing to fight and die for freedom.


Sources: Wiley Sword, “The Confederacy’s Last Hurrah: Spring Hill, Franklin, & Nashville”; “The War of the Rebellion: A Compilation of the Official Records of the Union and Confederate Armies,” Vol. 45, Pt. 1.

Thom Bassett lives in Providence, R.I., and teaches at Bryant University. He is at work on a novel.


How Douglass Came Around to Lincoln

Tom Chaffin

The New York Times   December 7, 2014

By December 1864, Frederick Douglass had become an admirer of the man he later called “our friend and liberator,” and he savored President Lincoln’s re-election the previous month. But Douglass’s path to that admiration had been anything but direct.

Four years earlier, he had quietly supported candidate Lincoln in the November 1860 election. But the Republican president-elect soon gave him pause. Lincoln’s silence during the final months of Democrat James Buchanan’s presidency irked Douglass, and he complained of Lincoln’s failure to condemn pro-South actions by Buchanan’s lame-duck administration.

Moreover, during Lincoln’s days as president-elect and his presidency’s first months, Douglass was also disturbed by the new chief executive’s receptiveness to proposed peace deals with the South that would have left its “peculiar institution” — slavery — intact. He was likewise troubled by Lincoln’s continuing advocacy of black emigration schemes as a means of addressing the secession crisis — schemes that would have sent African-Americans, both free and slave, to Africa or the Caribbean. Indeed, in spring 1861, Douglass, though throughout most of his life an opponent of such schemes, grew so wary of President Lincoln that he planned a 10-week trip to Haiti to ponder emigrating there himself (he eventually canceled the trip).

Frederick Douglass
Credit Library of Congress

By late 1862, though, Lincoln had begun to change, and so did Douglass’s estimation of him. In January 1863, Douglass and his fellow abolitionists exulted over Lincoln’s Emancipation Proclamation, freeing slaves in all rebel-controlled areas and authorizing the recruitment of black troops. With the stroke of a pen, Lincoln, acting in his role as commander in chief, had elevated the war effort from a fight to preserve a political nation-state, the Union, into a moral campaign against human bondage.

The proclamation did not abolish American slavery, nor did it free all American slaves. It left in bondage close to a million souls in areas exempted by the edict: the nominally Union and free-soil “border states” of Kentucky, Delaware, Missouri and Maryland, as well as all parts of the Confederacy already occupied by Union forces. But Douglass saw that Lincoln’s edict put the nation on an irreversible course. “For my own part,” he later recalled, “I took the proclamation, first and last, for a little more than it purported, and saw in its spirit a life and power far beyond its letter.” He added:

Its meaning to me was the entire abolition of slavery, wherever the evil could be reached by the Federal arm, and I saw that its moral power would extend much further. It was, in my estimation, an immense gain to have the war for the Union committed to the extinction of slavery, even from a military necessity.

Douglass’s voice was, by then, being heard at the nation’s highest levels; at Douglass’s request, President Lincoln met with him, at the White House, on Aug. 10, 1863. “I was somewhat troubled with the thought of meeting someone so august and high in authority, especially as I had never been in the White House before, and had never spoken to a President of the United States before.” Upon entering the office, however, Douglass was put at ease. He found the tall president seated in a low chair, surrounded by books and papers. “On my approach he slowly drew his feet in from the different parts of the room into which they had strayed, and he began to rise, and continued to rise until he looked down upon me, and extended his hand and gave me a welcome.”

“Reaching out his hand, he said, ‘Mr. Douglass, I know you; I have read about you, and Mr. Seward” — Secretary of State William H. Seward — “has told me about you’; putting me quite at ease at once.” Their ensuing conversation focused on the black regiments then being organized, those from Massachusetts and other Union states, as well as others, made up of former slaves, in South Carolina, Tennessee and other Union-controlled areas of the South. To the president, Douglass made several requests: He asked for an end to pay inequities between black and white soldiers; that black soldiers be promoted, just as white soldiers, for “meritorious” battlefield performance; and that the president enunciate a policy for the Union’s military to retaliate in kind against rebel prisoners of war if the Confederacy made good on threats to execute captured black soldiers. Lincoln, in turn, asked Douglass how the Union Army might more effectively recruit former slaves now in Union-occupied parts of the South.

As their exchange drew to a close, Senator Samuel C. Pomeroy of Kansas, who, at its start, had introduced Douglass to the president, told Lincoln that Secretary of War Edwin M. Stanton intended to commission Douglass adjutant-general to Gen. George H. Thomas. The commission would authorize Douglass to travel down the Mississippi and recruit former slaves into the army. The president, by Douglass’s account of the meeting, seemed pleased with that news: “I will sign any commission that Mr. Stanton will give Mr. Douglass.”

When they met that August, Lincoln was well into his first term and already pondering his re-election campaign the following year. He was facing stiff political headwinds from both Republicans and Democrats: Radical Republicans were demanding that he move more aggressively against the South; and Democrats — motivated by the sort of anti-black sentiment that flared during New York’s Draft Riots — were complaining that, through the Emancipation Proclamation and similar measures, he was pursuing policies injurious to whites, and unduly favorable toward blacks.

The president thus exercised caution in answering Douglass’s requests. Lincoln said that, while he was prepared, eventually, to accede to the equal pay request, he would, in the meantime — for political reasons — be unable to grant the other entreaties. Moreover, Lincoln cautioned that he regarded the very idea of any black enlistments as, for the time being, an “experiment.” Regardless of his own favorable view of the value of recruiting black soldiers into the war effort, the president said that he was also aware that many whites remained skeptical of the change in policy. “He spoke,” Douglass recalled, “of the opposition generally to employing negroes as soldiers at all.”

By the war’s end, Lincoln did remove most pay disparities between black and white soldiers, and after atrocities were inflicted on black soldiers, he also issued a warning to the Confederacy that any similar, future actions against black soldiers would produce commensurate retaliations against rebel prisoners. However, the change in policy, requested by Douglass, concerning promotions for black soldiers, never occurred; for the war’s duration, black enlistees were rarely elevated to higher ranks. As for Douglass’s military commission, that too never came — whether for reasons of bureaucratic error or deliberate policy, he never learned.

Even so, Douglass left the meeting satisfied that, in Lincoln, he had met a trustworthy leader with whom he could work. Speaking to an abolitionist convention the following December, Douglass reflected, “I never met with a man, who, on the first blush, impressed me more entirely with his sincerity, with his devotion to his country, and with his determination to save it at all hazards.”

A year later, on Aug. 19, 1864, Douglass met again with Lincoln at the White House, but this time, at the president’s request. “I need not say I went most gladly,” he recalled. “The main subject on which he wished to confer with me was as to the means most desirable to be employed outside the army to induce the slaves in the rebel States to come within the Federal lines,” where, by terms set forth in the Emancipation Proclamation, the bondsmen would be guaranteed their liberty.

Growing war opposition in the North — much of it fueled by complaints that the Emancipation Proclamation had rendered it an “abolition war,” recalled Douglass — “alarmed Mr. Lincoln.” The president was also “apprehensive that a peace might be forced upon him which would leave still in slavery all who had not come within our lines.”

Fearing such a forced peace, the president told Douglass that he wanted to render the Emancipation Proclamation “as effective as possible” as long as it remained the law of the land. While the order was in effect, he wanted it to liberate as many slaves as possible. More specifically, Lincoln worried that slaves in rebel areas “are not coming so rapidly and so numerously to us as I had hoped.” To increase their numbers, Lincoln made a proposal: He asked if Douglass would be willing to organize “a band of scouts, composed of colored men, whose business should be somewhat after the original plan of John Brown, to go into the rebel States, beyond the lines of our armies, and carry the news of emancipation, and urge the slaves to come within our boundaries.” Union military advances, however, soon rendered Lincoln’s idea unnecessary.

By the fall 1864 presidential campaign — following the Emancipation Proclamation and the two White House meetings — Lincoln had thus earned Douglass’s trust and admiration. Even so, Douglass, again as he had in 1860, remained mostly quiet in his support for the president’s election campaign. This time, Lincoln’s opponent was the former Union general George McClellan, who had expressed a willingness to discuss an armistice with the rebel South that would have left the region’s slavery in place. Explaining his reticence to the journalist Theodore Tilton, Douglass confided, “I am not doing much in this Presidential Canvass for the reason that Republican committees do not wish to expose themselves to the charge of being the ‘Niggar’ party.” In the end, Lincoln handily defeated McClellan — winning by 2,218,000 to 1,813,000 in the popular vote, 212 to 21 in the Electoral College.

Join Disunion on Facebook »


Tom Chaffin

Tom Chaffin’s books include “Pathfinder: John Charles Frémont and the Course of American Empire,” recently reissued with an updated introduction, and the just published “Giant’s Causeway: Frederick Douglass’s Irish Odyssey and the Making of an American Visionary,” from which the above essay is adapted.


Lincoln, God and the Constitution

Disunion

On Dec. 3, 1864, Abraham Lincoln proposed putting God in the Constitution. Preparing to submit his annual address on the state of the union, the president drafted a paragraph suggesting the addition of language to the preamble “recognizing the Deity.” The proposal shocked his cabinet during a read-through. With his re-election secured and the political utility of such a move dubious, the most religiously skeptical president since Thomas Jefferson proposed blowing an irreparable God-size hole through the wall separating church and state. What was Lincoln thinking?

Recalling the meeting in his memoirs, Secretary of the Navy Gideon Welles wrote that the imprudent idea had been put in the president’s head “by certain religionists” – namely, the Covenanters. A tiny sect from Scotland that had resided in America since before the Revolution, they believed the Constitution contained two crippling moral flaws: its protection of slavery, and its failure to acknowledge God’s authority. With the Emancipation Proclamation poised to fix the one sin, they believed, why not correct the other? At their first meeting with Lincoln in late 1862 (it was much easier for citizens to get an audience with the president at the time), a group of influential Covenanters suggested doing just that.

In that first meeting, Abraham Lincoln was quintessentially Abraham Lincoln — by turns respectful, humorous and reflective. He regaled his guests with the rough-hewn ideas that became his second inaugural address. He observed that each side in the war prayed to the same God, read the same Bible and invoked divine favor against the other; perhaps, Lincoln suggested, the war would ultimately decide which nation God chose.

Abraham Lincoln
Credit Library of Congress

The Covenanter ministers left their meeting emboldened. Thereafter they were instrumental in forming a coalition of denominations dedicated to acknowledging Christianity in the Constitution. This group, the National Reform Association, hoped to reference Jesus’ authority just after “We the People” and before “in order to form a more perfect Union.” They visited the president again in 1864 with an official request for action.

Lincoln is often remembered as a religious skeptic, at best, but throughout the war he showed exceptional shrewdness in wielding the political power of religion. As a young man, he openly scoffed at Christianity and once wrote an essay examining all the falsehoods contained in the Bible. Open heresy proved politically perilous; he lost his first bid for a congressional nomination amid accusations that Christians could not vote for him in good conscience.

From that experience, and from his active participation in grass-roots political organizing, Lincoln came to respect the ability of church networks to mobilize voters on moral issues. Thereafter, he used religion to great effect in his political career, casting his campaign platforms in stark moral terms, calling for more thanksgiving and fast days than any previous president and meeting constantly with clergy. Those ministers returned from their audiences at the White House preaching sermons that baptized the Union, the War and the president with religious purpose. Cultivating relationships with religious leaders paid dividends when Lincoln won re-election in a landslide.

Even so, the 16th president paid more than lip service to religion. Raised by a Bible-thumping mother in a hard-drinking culture, he somehow managed to reject and have sympathy for both. He was a moralist, a fatalist and prone to bouncing between soaring hope and sinking melancholy. For all these reasons, the president understood and reverenced religion better than many believers did, both as a political advantage and a safe harbor for troubled souls in the midst of storms. Lincoln’s own storms — the disintegration of the Union, the death of his young son Willie and regular wartime casualty reports — heightened his belief in Providence and shook the skepticism of his youth. A distant, impersonal sense of divinity was replaced by the president’s increasing conviction that God was concerned with the affairs of humanity. More important, Lincoln came to believe what he said in 1862 – that this inscrutable God might actually choose sides.

In contemplating a religious amendment, then, Lincoln brought to bear his own conflicted sense about God and America. The nation was at war both with itself and with what he believed remained its ultimate destiny to be an example of free government to the world. With the Emancipation Proclamation, and soon through the 13th Amendment, Lincoln christened the Civil War with the moral name of abolition. Perhaps, if only briefly, he considered that Reconstruction would need a higher calling as well. Americans, in the rubble of war, would share little in common besides enmity. They might yet find unity in rebuilding a nation that possessed a divine destiny.

But Lincoln the philosopher was also Lincoln the lawyer; such a move would open a Pandora’s box of divisive constitutional issues. The Cabinet’s very loud concerns were joined by some of his own. He struck the paragraph and it was never mentioned again.

By suggesting it at all, though, the president put on display, however briefly, the exceptional power of the Civil War to remake American society. Just 10 years before, most Americans might easily have conceived of a constitutional amendment invoking the name of God. Few would have predicted one eradicating slavery. Then the nation’s greatest crisis proved capable of taking slavery out of the national compact but not of putting God in it. High-profile campaigns to Christianize the Constitution continued well into the 20th century and even made their way into congressional committees. Still, they never came closer to realization than that one paragraph read aloud by Lincoln in a cabinet meeting. Those brief words bespoke the limits of religion and reform in American government at the nation’s most malleable moment.



Sources: “Diary of Gideon Welles, Vol. II”; John Alexander, “History of the National Reform Association”; Richard Carwardine, “Lincoln: A Life of Purpose and Power.”



Joseph S. Moore, an assistant professor of history at Gardner-Webb University, is the author of the forthcoming book “Covenanters and the American Republic.”

Read Full Post »

Credit Shannon Freshwater

NEWBURYPORT, Mass. — WHEN we think of the South, a host of images come to mind: slaves and masters, Klansmen and freedom riders, magnolias and cotton fields.

Americans have fewer enduring impressions of the North. It simply stands as the nation’s default region. Most Northerners behave as though they come from America writ large, rather than from a subsection of it. The North seems unremarkable. It holds no dark mystery, no agonies buried deep within. We forget that many parts of the North have an identity, culture, politics and racial history all their own.

Americans know that we cannot understand Southern history, or our nation’s history more generally, without coming to grips with slavery and Jim Crow. But we fail to apply this lesson to the North. We like to think that the struggle for racial equality is tangential to Northern history. This leads us to distort our perceptions of the North and to misinterpret American history as a whole.

Northern cities and states have long harbored movements for racial democracy, as well as for racial segregation, within the same heart and soul. Progress and regression have existed together. That duality helps to explain the mind of the North. Only a clearer understanding of the North’s mottled past can enable us to better reckon with this painful moment in our racial history, after the death of Eric Garner on Staten Island and a grand jury’s decision not to indict the officer whose chokehold led to his death.

Few have written more eloquently about the North and the South than the historian C. Vann Woodward. In Woodward’s formulation, those who came up in the South shouldered the “burden of Southern history.” The past, defined by slavery and segregation, was something to overcome.

The Northern past admits to no such torment. Tales of the Pilgrims and abolitionists sketch a noble portrait. Northern history looms as a source of aspiration and inspiration. It is something to affirm.

This has been true particularly in the Northeast, which has stood as a place of possibility and a model for the country. To E. B. White, New York was the nation’s “visible symbol of aspiration”; John F. Kennedy saw the democratic institutions of Massachusetts as “beacon lights for other nations as well as our sister states.” These ideals could serve as a spur to action and at some moments, Northeasterners drew upon the region’s mystique in order to propel themselves ahead of the rest of the nation. Yet they could also deploy this mystique as a mask, a way for whites to obscure and excuse their region’s dogged racism and oppression.

The history of the Northeast contains stunning steps toward racial progress as well as vicious episodes of backlash. In 1947, many Brooklyn residents welcomed a black ballplayer and anointed Ebbets Field as the frontier of interracial democracy. At the same time, African-American families from the South were shunted into Brooklyn’s burgeoning ghettos. When Jackie and Rachel Robinson attempted to buy a home in the suburbs of Westchester County, N.Y., and Fairfield County, Conn., they encountered hostile white homeowners who did not want African-Americans as neighbors (although the couple was eventually able to buy a house in Stamford, Conn.).

The story of school segregation is even more insidious. To give one example, the School Committee in Springfield, Mass., pursued redistricting and student-transfer policies that produced virtually all-black schools. African-American parents filed a lawsuit in 1964, and the N.A.A.C.P. took up their case. While on the witness stand, members of the School Committee claimed innocence and ignorance, and denied the very existence of segregation. In 1965, the state of Massachusetts went on to pass a law that outlawed “racial imbalance” — the first such law in the nation. The following year, Massachusetts voters would become the first to popularly elect a black senator, Edward W. Brooke. Just as whites forged a breakthrough in the electoral arena, segregation increased in the schools of Springfield, not to mention Boston.

In 1970, Abraham A. Ribicoff of Connecticut stood on the Senate floor and gave public expression to the region’s open secret. “The North is guilty,” Senator Ribicoff charged, “of monumental hypocrisy” in its treatment of African-Americans. One year later, he proposed a policy that would desegregate every metropolitan school system. The plan was big and bold, and it was to take 12 years. It allowed each locality to determine the specifics. Senator Ribicoff envisioned a combination of strategically located educational malls, magnet schools and redistricting. The N.A.A.C.P. opposed his policy. Black leaders thought that the early 1980s was too long to wait for widespread school integration. Of course, we are still waiting.

Many Americans know New York City’s recent history of racial violence, which includes the killing of Yusuf Hawkins in Bensonhurst and Michael Griffith in Howard Beach. But there were many others who are all but forgotten, like Willie Turks, a black transit worker, who was beaten to death by a group of white teenagers in Gravesend in 1982.

Northeasterners do not think of this history as one that shapes our identity. But if we really grapple with the mind of the North, we will be forced to acknowledge, finally, that our region is not just a land of liberty. We will also confront a racial past that is far messier than we might like. It is neither a triumphant story of progress nor a tale of segregation without relief.

We carry the two warring stories with us still. And now we stand at a crossroads. We can summon our better angels, and act forcefully, or we can continue to live like this. Which heritage will we act on? Which story will win out?

Read Full Post »

Remember the Sand Creek Massacre

Credit Christine Marie Larsen

NEW HAVEN — MANY people think of the Civil War and America’s Indian wars as distinct subjects, one following the other. But those who study the Sand Creek Massacre know different.

On Nov. 29, 1864, as Union armies fought through Virginia and Georgia, Col. John Chivington led some 700 cavalry troops in an unprovoked attack on peaceful Cheyenne and Arapaho villagers at Sand Creek in Colorado. They murdered nearly 200 women, children and older men.

Sand Creek was one of many assaults on American Indians during the war, from Patrick Edward Connor’s massacre of Shoshone villagers along the Idaho-Utah border at Bear River on Jan. 29, 1863, to the forced removal and incarceration of thousands of Navajo people in 1864 known as the Long Walk.

In terms of sheer horror, few events matched Sand Creek. Pregnant women were murdered and scalped, genitalia were paraded as trophies, and scores of wanton acts of violence characterize the accounts of the few Army officers who dared to report them. Among them was Capt. Silas Soule, who had been with Black Kettle and Cheyenne leaders at the September peace negotiations with Gov. John Evans of Colorado, the region’s superintendent of Indian affairs (as well as a founder of both the University of Denver and Northwestern University). Soule publicly exposed Chivington’s actions and, in retribution, was later murdered in Denver.

After news of the massacre spread, Evans and Chivington were forced to resign from their appointments. But neither faced criminal charges, and the government refused to compensate the victims or their families in any way. Indeed, Sand Creek was just one part of a campaign to take the Cheyenne’s once vast land holdings across the region. A territory that had hardly any white communities in 1850 had, by 1870, lost many Indians, who were pushed violently off the Great Plains by white settlers and the federal government.

These and other campaigns amounted to what is today called ethnic cleansing: an attempted eradication and dispossession of an entire indigenous population. Many scholars suggest that such violence conforms to other 20th-century categories of analysis, like settler colonial genocide and crimes against humanity.

Sand Creek, Bear River and the Long Walk remain important parts of the Civil War and of American history. But in our popular narrative, the Civil War obscures such campaigns against American Indians. In fact, the war made such violence possible: The paltry Union Army of 1858, before its wartime expansion, could not have attacked, let alone removed, the fortified Navajo communities in the Four Corners, while Southern secession gave a powerful impetus to expand American territory westward. Territorial leaders like Evans were given more resources and power to negotiate with, and fight against, powerful Western tribes like the Shoshone, Cheyenne, Lakota and Comanche. The violence of this time was fueled partly by the lust for power by civilian and military leaders desperate to obtain glory and wartime recognition.

The United States has yet to fully recognize the violent destruction wrought against indigenous peoples by the Civil War and the Union Army. Connor and Evans have cities, monuments and plaques in their honor, as well as two universities and even Colorado’s Mount Evans, home to the highest paved road in North America.

Saturday’s 150th anniversary will be commemorated in many ways: at the National Park Service’s Sand Creek Massacre National Historic Site, the descendant Cheyenne and Arapaho communities, other Native American community members and their non-Native supporters will mark the massacre. An annual memorial run will trace the route of Chivington’s troops from Sand Creek to Denver, where an evening vigil will be held Dec. 2.

The University of Denver and Northwestern are also reckoning with this legacy, creating committees that have recognized Evans’s culpability. Like many academic institutions, both are deliberating how to expand Native American studies and student service programs. Yet the near-absence of Native American faculty members, administrators and courses reflects their continued failure to take more than partial steps.

While the government has made efforts to recognize individual atrocities, it has a long way to go toward recognizing how deeply the decades-long campaign of eradication ran, let alone recognizing how, in the face of such violence, Native American nations and their cultures have survived. Few Americans know of the violence of this time, let alone the subsequent violation of Indian treaties, of reservation boundaries and of Indian families by government actions, including the half-century of forced removal of Indian children to boarding schools.

One symbolic but necessary first step would be a National Day of Indigenous Remembrance and Survival, perhaps on Nov. 29, the anniversary of Sand Creek. Another would be commemorative memorials, not only in Denver and Evanston but in Washington, too. We commemorate “discovery” and “expansion” with Columbus Day and the Gateway arch, but nowhere is there national recognition of the people who suffered from those “achievements” — and have survived amid continuing cycles of colonialism.

Correction: November 27, 2014
An earlier version of this article incorrectly stated that the American Indian leader Black Kettle was killed in the Sand Creek Massacre. He died at the Battle of Washita in Oklahoma in 1868. 

Read Full Post »

America’s Founding Myths

This Thanksgiving, it’s worth remembering that the narrative we hear about America’s founding is wrong. The country was built on genocide.

Massacre of American-Indian women and children in Idaho.

Under the crust of that portion of Earth called the United States of America — “from California . . . to the Gulf Stream waters” — are interred the bones, villages, fields, and sacred objects of American Indians. They cry out for their stories to be heard through their descendants who carry the memories of how the country was founded and how it came to be as it is today.

It should not have happened that the great civilizations of the Western Hemisphere, the very evidence of the Western Hemisphere, were wantonly destroyed, the gradual progress of humanity interrupted and set upon a path of greed and destruction. Choices were made that forged that path toward destruction of life itself—the moment in which we now live and die as our planet shrivels, overheated. To learn and know this history is both a necessity and a responsibility to the ancestors and descendants of all parties.

US policies and actions related to indigenous peoples, though often termed “racist” or “discriminatory,” are rarely depicted as what they are: classic cases of imperialism and a particular form of colonialism—settler colonialism. As the anthropologist Patrick Wolfe writes, “The question of genocide is never far from discussions of settler colonialism. Land is life — or, at least, land is necessary for life.”

The history of the United States is a history of settler colonialism — the founding of a state based on the ideology of white supremacy, the widespread practice of African slavery, and a policy of genocide and land theft. Those who seek history with an upbeat ending, a history of redemption and reconciliation, may look around and observe that such a conclusion is not visible, not even in utopian dreams of a better society.

That narrative is wrong or deficient, not in its facts, dates, or details but rather in its essence. Inherent in the myth we’ve been taught is an embrace of settler colonialism and genocide. The myth persists, not for a lack of free speech or poverty of information but rather for an absence of motivation to ask questions that challenge the core of the scripted narrative of the origin story.

Woody Guthrie’s “This Land Is Your Land” celebrates that the land belongs to everyone, reflecting the unconscious manifest destiny we live with. But the extension of the United States from sea to shining sea was the intention and design of the country’s founders.

“Free” land was the magnet that attracted European settlers. Many were slave owners who desired limitless land for lucrative cash crops. After the war for independence but before the US Constitution, the Continental Congress produced the Northwest Ordinance. This was the first law of the incipient republic, revealing the motive for those desiring independence. It was the blueprint for gobbling up the British-protected Indian Territory (“Ohio Country”) on the other side of the Appalachians and Alleghenies. Britain had made settlement there illegal with the Proclamation of 1763.

In 1801, President Jefferson aptly described the new settler-state’s intentions for horizontal and vertical continental expansion, stating, “However our present interests may restrain us within our own limits, it is impossible not to look forward to distant times, when our rapid multiplication will expand itself beyond those limits and cover the whole northern, if not the southern continent, with a people speaking the same language, governed in similar form by similar laws.”

Origin narratives form the vital core of a people’s unifying identity and of the values that guide them. In the United States, the founding and development of the Anglo-American settler-state involves a narrative about Puritan settlers who had a covenant with God to take the land. That part of the origin story is supported and reinforced by the Columbus myth and the “Doctrine of Discovery.”

The Columbus myth suggests that from US independence onward, colonial settlers saw themselves as part of a world system of colonization. “Columbia,” the poetic, Latinate name used in reference to the United States from its founding throughout the nineteenth century, was based on the name of Christopher Columbus.

The “Land of Columbus” was—and still is—represented by the image of a woman in sculptures and paintings, by institutions such as Columbia University, and by countless place names, including that of the national capital, the District of Columbia. The 1798 hymn “Hail, Columbia” was the early national anthem and is now used whenever the vice president of the United States makes a public appearance, and Columbus Day is still a federal holiday despite Columbus never having set foot on any territory ever claimed by the United States.

To say that the United States is a colonialist settler-state is not to make an accusation but rather to face historical reality. But indigenous nations, through resistance, have survived and bear witness to this history. The fundamental problem is the absence of the colonial framework.

Settler colonialism, as an institution or system, requires violence or the threat of violence to attain its goals. People do not hand over their land, resources, children, and futures without a fight, and that fight is met with violence. In employing the force necessary to accomplish its expansionist goals, a colonizing regime institutionalizes violence. The notion that settler-indigenous conflict is an inevitable product of cultural differences and misunderstandings, or that violence was committed equally by the colonized and the colonizer, blurs the nature of the historical processes. Euro-American colonialism had from its beginnings a genocidal tendency.

The term “genocide” was coined following the Shoah, or Holocaust, and its prohibition was enshrined in the United Nations convention adopted in 1948: the UN Convention on the Prevention and Punishment of the Crime of Genocide.

The convention is not retroactive but is applicable to US-indigenous relations since 1988, when the US Senate ratified it. The terms of the genocide convention are also useful tools for historical analysis of the effects of colonialism in any era. In the convention, any one of five acts is considered genocide if “committed with intent to destroy, in whole or in part, a national, ethnical, racial or religious group”:

  • killing members of the group;
  • causing serious bodily or mental harm to members of the group;
  • deliberately inflicting on the group conditions of life calculated to bring about its physical destruction in whole or in part;
  • imposing measures intended to prevent births within the group;
  • forcibly transferring children of the group to another group.

Settler colonialism is inherently genocidal in terms of the genocide convention. In the case of the British North American colonies and the United States, not only extermination and removal were practiced but also the disappearing of the prior existence of indigenous peoples—and this continues to be perpetuated in local histories.

Anishinaabe (Ojibwe) historian Jean O’Brien names this practice of writing Indians out of existence “firsting and lasting.” All over the continent, local histories, monuments, and signage narrate the story of first settlement: the founder(s), the first school, first dwelling, first everything, as if there had never been occupants who thrived in those places before Euro-Americans. On the other hand, the national narrative tells of “last” Indians or last tribes, such as “the last of the Mohicans,” “Ishi, the last Indian,” and End of the Trail, as a famous sculpture by James Earle Fraser is titled.

From the Atlantic Ocean to the Mississippi River and south to the Gulf of Mexico lay one of the most fertile agricultural belts in the world, crisscrossed with great rivers. Naturally watered, teeming with plant and animal life, temperate in climate, the region was home to multiple agricultural nations. In the twelfth century, the Mississippi Valley region was marked by one enormous city-state, Cahokia, and several large ones built of earthen, stepped pyramids, much like those in Mexico. Cahokia supported a population of tens of thousands, larger than that of London during the same period.

Other architectural monuments were sculpted in the shape of gigantic birds, lizards, bears, alligators, and even a 1,330-foot-long serpent. These feats of monumental construction testify to the levels of civic and social organization. Called “mound builders” by European settlers, the people of this civilization had dispersed before the European invasion, but their influence had spread throughout the eastern half of the North American continent through cultural influence and trade.

What European colonizers found in the southeastern region of the continent were nations of villages with economies based on agriculture, corn the mainstay. This was the territory of the nations of the Cherokee, Chickasaw, and Choctaw and the Muskogee Creek and Seminole, along with the Natchez Nation to the west, in the Mississippi Valley region.

To the north, a remarkable federal state structure, the Haudenosaunee Confederacy — often referred to as the Six Nations of the Iroquois Confederacy — was made up of the Seneca, Cayuga, Onondaga, Oneida, and Mohawk Nations and, from early in the eighteenth century, the Tuscaroras. This system incorporated six widely dispersed and unique nations of thousands of agricultural villages and hunting grounds from the Great Lakes and the St. Lawrence River to the Atlantic, and as far south as the Carolinas and inland to Pennsylvania.

The Haudenosaunee peoples avoided centralized power by means of a clan-village system of democracy based on collective stewardship of the land. Corn, the staple crop, was stored in granaries and distributed equitably in this matrilineal society by the clan mothers, the oldest women from every extended family. Many other nations flourished in the Great Lakes region where now the US-Canada border cuts through their realms. Among them, the Anishinaabe Nation (called by others Ojibwe and Chippewa) was the largest.

In the beginning, Anglo settlers organized irregular units to brutally attack and destroy unarmed indigenous women, children, and old people using unlimited violence in unrelenting attacks. During nearly two centuries of British colonization, generations of settlers, mostly farmers, gained experience as “Indian fighters” outside any organized military institution.

Anglo-French conflict may appear to have been the dominant factor of European colonization in North America during the eighteenth century, but while large regular armies fought over geopolitical goals in Europe, Anglo settlers in North America waged deadly irregular warfare against the indigenous communities.

The chief characteristic of irregular warfare is extreme violence against civilians, in this case the tendency to seek the utter annihilation of the indigenous population. “In cases where a rough balance of power existed,” observes the historian John Grenier, “and the Indians even appeared dominant—as was the situation in virtually every frontier war until the first decade of the 19th century—[settler] Americans were quick to turn to extravagant violence.”

“Indeed,” Grenier continues, “only after seventeenth- and early-eighteenth-century Americans made the first way of war a key to being a white American could later generations of ‘Indian haters,’ men like Andrew Jackson, turn the Indian wars into race wars.” By then, the indigenous peoples’ villages, farmlands, towns, and entire nations formed the only barrier to the settlers’ total freedom to acquire land and wealth. Settler colonialists again chose their own means of conquest. Such fighters are often viewed as courageous heroes, but killing unarmed women, children, and old people and burning homes and fields involved neither courage nor sacrifice.

US history, as well as inherited indigenous trauma, cannot be understood without dealing with the genocide that the United States committed against indigenous peoples. From the colonial period through the founding of the United States and continuing in the twenty-first century, this has entailed torture, terror, sexual abuse, massacres, systematic military occupations, removals of indigenous peoples from their ancestral territories, and removals of indigenous children to military-like boarding schools.

Once in the hands of settlers, the land itself was no longer sacred, as it had been for the indigenous. Rather, it was private property, a commodity to be acquired and sold. Later, when Anglo-Americans had occupied the continent and urbanized much of it, this quest for land and the sanctity of private property were reduced to a lot with a house on it, and “the land” came to mean the country, the flag, the military, as in “the land of the free” of the national anthem, or Woody Guthrie’s “This Land Is Your Land.”

Those who died fighting in foreign wars were said to have sacrificed their lives to protect “this land” that the old settlers had spilled blood to acquire. The blood spilled was largely indigenous.

Adapted from An Indigenous Peoples’ History of the United States, out now from Beacon Press.

Read Full Post »

This 75th Anniversary’s Been Overlooked. It Shouldn’t Be

Paula Rabinowitz  

HNN   November 22, 2014

Seventy-five years ago, paperback books returned to the United States under the brand name Pocket Books, which began publishing its mass-market paperbacks, sold at a quarter each, with ten titles, among them Frank Buck’s Bring ‘Em Back Alive, Felix Salten’s Bambi, James Hilton’s Lost Horizon and Emily Brontë’s Wuthering Heights. Returned, because nineteenth-century printers often bound books in paper, but the practice had all but disappeared during the early part of the twentieth century. It may seem odd to commemorate the advent of cheap pulpy books instead of the far more significant anniversary: the signing of the Hitler-Stalin Pact on August 23, 1939. But the saga of cheap paperbacks’ arrival on American soil is intimately tied to the Second World War and its aftermath in a number of ways, deriving from and contributing to wartime innovation, necessity, mobility and censorship.

Modern paperbacks were the Depression-era brainchild of the English editor Allen Lane, who founded Penguin Books in 1935 in order to provide high-quality literary works as cheaply as a pack of cigarettes. Publishing on such a massive scale depended on huge supplies of paper, which, once Britain declared war on Germany, were sharply curtailed in the UK. But the US still had an abundance of trees and paper mills, and whether Lane’s assistant Ian Ballantine and others stole the idea, as E.L. Doctorow remembers in Reporting the Universe, or Lane shipped the enterprise overseas with Kurt Enoch and Victor Weybright (as Weybright recalled in his memoir The Making of a Publisher), the new medium, appearing on drugstore racks, in bus stations and at corner candy stores, became a kitschy icon that indelibly altered American tastes and habits during the mid-twentieth century. Within a few months of their initial arrival, paperbacks were everywhere. Despite the ubiquity of radio and the Hollywood banner year of 1939, when Gone with the Wind and The Wizard of Oz swept into movie theaters in lush color, books were the mass media of wartime America. The advent of color assured a renewed love affair with the movies, even as the 1939 World’s Fair in New York marked the introduction of television, the next frontier in mass communications, which would come into its own in the 1950s. But the ability to own a book, one printed by the millions, connected Americans to new ideas in science, economics, art—not to mention new sensations about reading and the self and each other.

These new objects, emblazoned with lurid cover art and risqué tag lines, were priced to sell and, once the US entered the war, were imprinted with an admonition to send the volumes overseas to servicemen. War spurs technological breakthroughs, usually in weaponry or communication; paperbacks were part of this process, a new technology that transformed both the battlefield and the home front. Books, unlike other mass media, such as radio or the movies, were tangible things that could be purchased and, like a salami from Katz’s Delicatessen, sent “to your boy in the army.” Paperback books participated directly in the war effort when publishers and booksellers banded together to produce the Armed Services Editions—millions of books distributed free to the Army and Navy—which left a legacy that influenced a generation of Japanese and Italian scholars to study American literature when they found these handy yellow-covered books, or their companions the Overseas Editions, among their grandfathers’ war surplus (the ASE books could not be brought back to the US and so were dumped overseas). Books are always causing trouble; even this patriotic gesture ran afoul of Congressional attempts, through amendments to the 1944 Soldier’s Voting Act, to limit the use of certain words in publications distributed to troops that might appear to sway their political opinions.

By the 1950s, after paperback publishing had exploded to encompass many imprint houses and had augmented reprints with PBOs (paperback originals), the books’ provocative covers, crucial design elements meant to spur sales but also to bring vernacular modernist visual culture into private life, prompted police departments to impound books and Congress to investigate “Current Pornographic Materials,” including paperbacks, during the 1952 Gathings subcommittee hearings. What had been allowed to proliferate during the Second World War, when millions were in uniform and the social mores superintending men’s and women’s behaviors loosened considerably, needed to be reined in during the Korean War and the Cold War.

Paperback book publishers had long been aware of real and potential censorship efforts mounted in the United States, most notably the 1933 case United States v. One Book Called “Ulysses.” Its 1934 appeal decision, written by Augustus Hand, declared that the book must be “taken as a whole,” so that even patently “obscene” portions “relevant to the purpose of depicting the thoughts of the characters … are introduced to give meaning to the whole.” The decision was aimed at the literary merits of the work and the “sincerity” of its portrayal of characters, but because the law targeted “one book,” the book itself, as a total package from cover to cover, might be considered “as a whole.” Paperback publishers exploited the pulpy aspects of their product, with louche and debauched cover art attracting visual attention; but the covers rigorously conformed to the Ulysses ruling: each depicted a scene found within the book, even if only in a few words. The paperback was a complete work consisting not only of words but of art as well.

This handy package, arriving on American shores in the midst of war’s horrors—offering its owners a “complete and unabridged” work, easily carried in pocket or pocketbook, complete with a visually compelling cover—helped usher readers into new sensations through art, science and literature. As objects that circulated along with their owners during and after WWII, they brought modernism to Main Street.

Paula Rabinowitz is the author of AMERICAN PULP: HOW PAPERBACKS BROUGHT MODERNISM TO MAIN STREET and Editor-in-Chief of the Oxford Encyclopedia of Literature. She is a Professor of English at the University of Minnesota, where she teaches courses on twentieth-century American literature, film and visual cultures, and material culture.

Read Full Post »

 

Slavery in America: Back in the Headlines

Daina Ramey Berry

HNN  November 23, 2014

 

People think they know everything about slavery in the United States, but they don’t. They think the majority of African slaves came to the American colonies, but they didn’t. They talk about 400 years of slavery, but it wasn’t 400 years. They claim all Southerners owned slaves, but they didn’t. Some argue it was a long time ago, but it wasn’t.

Slavery has been in the news a lot lately. Perhaps it’s because of the increase in human trafficking on American soil, or the headlines about income inequality, the mass incarceration of African Americans and discussions about reparations to the descendants of slaves. Several publications have fueled these conversations: Ta-Nehisi Coates’ “The Case for Reparations” in The Atlantic Monthly, French economist Thomas Piketty’s Capital in the Twenty-First Century, historian Edward Baptist’s The Half Has Never Been Told: Slavery and the Making of American Capitalism, and law professor Bryan A. Stevenson’s Just Mercy: A Story of Justice and Redemption.

As a scholar of slavery at the University of Texas at Austin, I welcome the public debates and connections the American people are making with history. However, there are still many misconceptions about slavery.

I’ve spent my career dispelling myths about “the peculiar institution.” The goal in my courses is not to victimize one group and celebrate another. Instead, we trace the history of slavery in all its forms to make sense of the origins of wealth inequality and the roots of discrimination today. The history of slavery provides deep context to contemporary conversations and counters the distorted facts, internet hoaxes and poor scholarship I caution my students against.

Four myths about slavery

Myth One: The majority of African captives came to what became the United States.

Truth: Only about 380,000, or 4 to 6 percent, of African captives came to the United States. The majority of enslaved Africans went to Brazil, followed by the Caribbean. A significant number arrived in the American colonies by way of the Caribbean, where they were “seasoned” and mentored into slave life. They spent months or years recovering from the harsh realities of the Middle Passage. Once they were forcibly accustomed to slave labor, many were then brought to plantations on American soil.

Myth Two: Slavery lasted for 400 years.

Popular culture is rich with references to 400 years of oppression. There seems to be confusion between the Transatlantic Slave Trade (1440-1888) and the institution of slavery, confusion only reinforced by the Bible, Genesis 15:13:

Then the Lord said to him, ‘Know for certain that for four hundred years your descendants will be strangers in a country not their own and that they will be enslaved and mistreated there.’

Listen to Lupe Fiasco – just one Hip Hop artist to refer to the 400 years – in his 2011 imagining of America without slavery, “All Black Everything”:

[Hook]

You would never know

If you could ever be

If you never try

You would never see

Stayed in Africa

We ain’t never leave

So there were no slaves in our history

Were no slave ships, were no misery, call me crazy, or isn’t he

See I fell asleep and I had a dream, it was all black everything

[Verse 1]

Uh, and we ain’t get exploited

White man ain’t feared so he did not destroy it

We ain’t work for free, see they had to employ it

Built it up together so we equally appointed

First 400 years, see we actually enjoyed it


A plantation owner with his slaves. (National Media Museum from UK)

Truth: Slavery was not unique to the United States; it is a part of almost every nation’s history from Greek and Roman civilizations to contemporary forms of human trafficking. The American part of the story lasted fewer than 400 years.

How do we calculate it? Most historians use 1619 as a starting point: 20 Africans referred to as “servants” arrived in Jamestown, VA, on a Dutch ship. It’s important to note, however, that they were not the first Africans on American soil. Africans first arrived in America in the late 16th century, not as slaves but as members of Spanish and Portuguese expeditions. One of the best known of these African “conquistadors” was Estevanico, who traveled throughout the Southeast from present-day Florida to Texas. As for the institution of chattel slavery (the treatment of slaves as property) in the United States, if we use 1619 as the beginning and the Thirteenth Amendment of 1865 as the end, then it lasted 246 years, not 400.

Myth Three: All Southerners owned slaves.

Truth: Roughly 25% of all southerners owned slaves. The fact that one quarter of the Southern population were slaveholders is still shocking to many. This truth brings historical insight to modern conversations about the Occupy Movement, its challenge to the inequality gap and its slogan “we are the 99%.”

Take the case of Texas. When it established statehood, the Lone Star State had a shorter period of Anglo-American chattel slavery than other Southern states, only 1845 to 1865, because Spain and Mexico had occupied the region for almost one half of the 19th century with policies that either abolished or limited slavery. Still, the number of people affected by wealth and income inequality is staggering. By 1860, the enslaved population of Texas was 182,566, and slaveholders represented 27% of the population, controlled 68% of the government positions and held 73% of the wealth. These are shocking figures, but today’s income gap in Texas is arguably starker, with 10% of tax filers taking home 50% of the income.

Myth Four: Slavery was a long time ago.

Truth: African-Americans have been free in this country for less time than they were enslaved. Do the math: Blacks have been free for 149 years, which means that most Americans are only two to three generations removed from slavery. However, former slaveholding families built their legacies on the institution and generated wealth that African-Americans have not been privy to, because enslaved labor was forced, segregation maintained wealth disparities, and overt and covert discrimination limited African-American recovery efforts.
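The arithmetic behind Myths Two and Four fits in a few lines. Here is a quick sketch using the article’s own dates, with 2014 (the year this piece was published) as the reference point:

```python
# Dates used in the article: 1619 (first Africans at Jamestown),
# 1865 (Thirteenth Amendment), 2014 (year of publication).
SLAVERY_BEGAN = 1619
SLAVERY_ENDED = 1865
PUBLICATION_YEAR = 2014

years_enslaved = SLAVERY_ENDED - SLAVERY_BEGAN    # 246 years (Myth Two)
years_free = PUBLICATION_YEAR - SLAVERY_ENDED     # 149 years (Myth Four)

print(years_enslaved, years_free)   # 246 149
print(years_free < years_enslaved)  # True: freedom is still the shorter era
```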

The value of slaves

Economists and historians have examined detailed aspects of the enslaved experience for as long as slavery existed. Recent publications related to slavery and capitalism explore economic aspects of cotton production and offer commentary on the amount of wealth generated from enslaved labor.

My own work enters this conversation looking at the value of individual slaves and the ways enslaved people responded to being treated as a commodity. They were bought and sold just like we sell cars and cattle today. They were gifted, deeded and mortgaged the same way we sell houses today. They were itemized and insured the same way we manage our assets and protect our valuables.


Extensive Sale of Choice Slaves, New Orleans 1859, Girardey, C.E. (Natchez Trace Collection, Broadside Collection, Dolph Briscoe Center for American History)

Enslaved people were valued at every stage of their lives, from before birth until after death. Slaveholders examined women for their fertility and projected the value of their “future increase.” As the enslaved grew up, enslavers assessed their value through a rating system that quantified their work. “A1 Prime hand” was one term for a “first rate” slave who could do the most work in a given day. Values decreased on a quarter scale, from three-fourths hands to one-fourth hands down to a rate of zero, which was typically reserved for elderly or differently abled bondpeople (another term for slaves).

Guy and Andrew, two prime males sold at the largest auction in US history in 1859, commanded different prices. Although similar in “all marketable points in size, age, and skill,” Guy commanded $1,240 while Andrew sold for $1,040 because “he had lost his right eye.” A reporter from the New York Tribune noted that “the market value of the right eye in the Southern country is $240.” Enslaved bodies were reduced to monetary values assessed from year to year, and sometimes from month to month, for their entire lifespan and beyond. By today’s standards, Andrew and Guy would be worth about $33,000-$40,000.

Slavery was an extremely diverse economic institution, one that extracted unpaid labor from people in a variety of settings, from small single-crop farms and plantations to urban universities. This diversity is also reflected in slave prices. Enslaved people understood that they were treated as commodities.

“I was sold away from mammy at three years old,” recalled Harriett Hill of Georgia. “I remembers it! It lack selling a calf from the cow,” she shared in a 1930s interview with the Works Progress Administration. “We are human beings,” she told her interviewer. Those in bondage understood their status. Even though Harriett Hill was too little to remember her price when she was three, she recalled being sold for $1,400 at age 9 or 10: “I never could forget it.”

Slavery in popular culture

Slavery is part and parcel of American popular culture, but for more than 30 years the television mini-series Roots was the primary visual representation of the institution, except for a handful of independent (and not widely known) films such as Haile Gerima’s Sankofa or the Brazilian Quilombo. Today Steve McQueen’s 12 Years a Slave is a box office success, actress Azie Mira Dungey has a popular web series called Ask a Slave, and in Cash Crop sculptor Stephen Hayes compares the slave ships of the 18th century with third-world sweatshops.

From the serious (PBS’s award-winning Many Rivers to Cross and the interactive Slave Dwelling Project, in which school-aged children spend the night in slave cabins) to the comic (Saturday Night Live), slavery today is front and center.

The elephant that sits at the center of our history is coming into focus. American slavery happened — we are still living with its consequences.

Daina Ramey Berry, Ph.D., is an associate professor of history and African and African Diaspora Studies at the University of Texas at Austin. She is also a Public Voices Fellow and the author and award-winning editor of three books, currently at work on a book about slave prices in the United States funded in part by the National Endowment for the Humanities. Follow her on Twitter: @lbofflesh. This article was first published by Not Even Past.

Read Full Post »

What Happened the Last Time Republicans Had a Majority This Huge? They lost it.

Josh Zeitz

Politico.com    November 15, 2014

Since last week, many Republicans have been feeling singularly nostalgic for November 1928, and with good reason. It’s the last time that the party won such commanding majorities in the House of Representatives while also dominating the Senate. And, let’s face it, 1928 was a good time.

America was rich—or so it seemed. Charles Lindbergh was on the cover of Time. Amelia Earhart became the first woman to fly across the Atlantic. Jean Lussier went over Niagara Falls in a rubber ball (thus trumping the previous year’s vogue for flagpole sitting). Mickey Mouse made his first appearance in a talkie (“Steamboat Willie”). Irving Aaronson and His Commanders raised eyebrows with the popular—and, for its time, scandalous—song, “Let’s Misbehave,” and presidential nominee Herbert Hoover gave his Democratic opponent, Al Smith, a shellacking worthy of the history books.

The key takeaway: It’s been a really, really long time since Republicans have owned Capitol Hill as they do now.

But victory can be a fleeting thing. In 1928, Republicans won 270 seats in the House. They were on top of the world. Two years later, they narrowly lost their majority. Two years after that, in 1932, their caucus shrank to 117 members and the number of Republican-held seats in the Senate fell to just 36. To borrow the title of a popular 1929 novel (which had nothing whatsoever to do with American politics): Goodbye to All That.

A surface explanation for the quick rise and fall of the GOP House majority of 1928 is the Great Depression. As the party in power, Republicans owned the economy, and voters punished them for it. In this sense, today’s Republicans have no historical parallel to fear. Voters—at least a working majority of the minority who turned out last week—clearly blame Barack Obama for the lingering aftershocks of the recent economic crash.

But what if the Republicans of 1928 owed their demise to a more fundamental force? What if it was demography, not economics, that truly killed the elephant?

In fact, the Great Depression was just one factor in the GOP’s stunning reversal of fortune, and in the 1930 cycle that saw Republicans lose their commanding House majority it was probably a minor one. To be sure, the Republicans of yesteryear were victims of historical contingency (the Great Depression), but they also failed to appreciate and prepare for a long-building trend: the rise of a new urban majority composed of more than 14 million immigrants and many millions more of their children. Democrats did see the trend, and they built a majority that lasted half a century.

The lesson for President Obama and the Democrats is to go big—very, very big—on immigration reform. Like the New Dealers, today’s Democrats have a unique opportunity to build a majority coalition that dominates American politics well into the century.

***

For the 1928 GOP House majority, victory was unusually short-lived. About one in five GOP House members elected in the Hoover landslide served little more than a year and a half before losing their seats in November 1930.

On a surface level, the Great Depression was to blame.

The stock market crash of October 1929 destroyed untold wealth. Shares in Eastman Kodak plunged from a high of $264.75 to $150. General Electric, $403 to $168.13. General Motors, $91.75 to $33.50. In the following months, millions of men and women were thrown out of work. Tens of thousands of businesses shut their doors and never reopened.

But in the 1920s, before the rise of pensions and 401(k)s, college savings accounts and retail investment vehicles, very few Americans were directly invested in the market. Moreover, in the context of their recent experience, the sudden downturn of 1929-1930 was jarring but not altogether unusual. Hoover later recalled that “for some time after the crash,” most businessmen simply did not perceive “that the danger was any more than that of run-of-the-mill, temporary slumps such as had occurred at three-to-seven year intervals in the past.”

By April 1930, stocks had recouped 20 percent of lost value and seemed on a steady course to recovery. Bank failures, though vexing, were occurring at no greater a clip than the decade’s norm. Yes, gross national product fell 12.6 percent in just one year, and roughly 8.9 percent of able-bodied Americans were out of work. But events were not nearly as dire as in 1921, when a recession sent GNP plunging 24 percent and 11.9 percent of workers were unemployed.

In fact, Americans in the Jazz Age were accustomed to a great deal of economic volatility and risk exposure. It was the age of Scott and Zelda, Babe Ruth, the Charleston, Clara Bow and Colleen Moore—the Ford Model T and the radio set. But it was also an era of massive wealth and income inequality. In these days before the emergence of the safety net—before unemployment and disability insurance—most industrial workers expected to be without work for several months of each year. For farm workers, the entire decade was particularly unforgiving, as a combination of domestic over-production and foreign competition drove down crop prices precipitously.

In hindsight, we know that voters in November 1930 were standing on the edge of a deep canyon. But in the moment, hard times struck many Americans as a normal, cyclical part of their existence.

Unsurprisingly, then, many House and local races in 1930 hinged more on cultural issues—especially on Prohibition, which in many districts set “wet” Democrats against “dry” Republicans—than economic ones.

If the Depression was not a singular determinant in the 1930 elections, neither had Herbert Hoover yet acquired an undeserved reputation for callous indifference to human suffering. Today, we think of Hoover as the laissez-faire foil to Franklin Roosevelt’s brand of muscular liberalism. But in 1930, Hoover was still widely regarded as a progressive Republican who, in his capacity as U.S. relief coordinator, saved Europe from starvation during World War I. When he was elected president, recalled a prominent journalist, we “were in a mood for magic … We had summoned a great engineer to solve our problems for us; now we sat back comfortably and confidently to watch problems being solved.”

In 1929 and 1930, Hoover acted swiftly to address what was still a seemingly routine economic emergency. He jawboned business leaders into maintaining job rolls and wages. He cajoled the Federal Reserve System into easing credit. He requested increased appropriations for public works and grew the federal budget to its largest-ever peacetime levels. In most contemporary press accounts, he had not yet acquired the stigma of a loser.

Still, in 1930 Hoover’s party took a beating. Republicans lost eight seats in the Senate and 52 seats in the House. By the time the new House was seated in December 1931, several deaths and vacancies resulted in a razor-thin Democratic majority.

If the election was not exclusively or even necessarily about economics, the same cannot be said of FDR’s historic landslide two years later. As Europe plunged headlong into the Depression in 1931 and 1932, the American banking and financial system all but collapsed. With well over 1,000 banks failing each year, millions of depositors lost their life savings. By the eve of the election, more than 50 percent of American workers were unemployed or under-employed.

In response to the crisis, Hoover broke with decades of Republican economic orthodoxy. He stepped up work on the Boulder Dam and Grand Coulee Dam (popular lore notwithstanding, these were not first conceived as New Deal projects). He signed legislation outlawing anti-union (“yellow dog”) clauses in work agreements. And he chartered the Reconstruction Finance Corporation, a government-sponsored entity that loaned money directly to financial institutions, railroads and agricultural stabilization agencies, thereby helping them maintain liquidity. The RFC was in many ways the first New Deal agency, though Herbert Hoover pioneered it. Even the editors of the New Republic, among the president’s sharpest liberal critics, admitted at the time, “There has been nothing quite like it.”

Read Full Post »
