
The Cuba Embargo Has Actually Worked Like a Charm

HNN June 21, 2015


First item on the anti-embargo teleprompter: “But the embargo hasn’t worked. After half a century the Castro regime still stands. So why should we continue this failed policy?”

But who–besides the Castro lobby–ever claimed “regime-change” was the embargo’s rationale? To wit:

On January 21, 1962, at Punta del Este, Uruguay, U.S. Secretary of State Dean Rusk gave a speech to the Organization of American States recommending that the members join the U.S. in voting for an economic embargo of Cuba. In this speech there is not a single word–or even an inference–that regime-change was the embargo’s goal. “The United States objects to Cuba’s activities and policies in the international arena, not its internal system or arrangements.” Indeed, Secretary Rusk went out of his way to stress that regime-change was NOT the embargo’s goal.

The much-ballyhooed (mostly by KGB-trained Cuban DGI Colonel Fabian Escalante, author of “634 Ways to Kill Castro”) Operation Mongoose also appears less serious and concerted under close scrutiny, and further supports Rusk’s public statements at the OAS meeting.

In fact, many of the actual participants in Operation Mongoose–both American and Cuban exile–finally became convinced they were risking their lives mostly on intelligence-gathering missions rather than on attempts to decapitate the Castro regime.

“I will never abandon Cuba to Communism!” declared JFK while addressing the recently ransomed Bay of Pigs freedom fighters and their families in Miami’s Orange Bowl Dec. 29, 1962. “I promise to deliver this Brigade banner to you in a free Havana!”

“That was the first time it snowed in the Orange Bowl,” later wrote a CIA man named Grayston Lynch, who was in attendance. Lynch helped train many of the Bay of Pigs freedom fighters and landed on the beach with them, firing the first shots of the invasion. Then–from 1961 to 1964–he led several dozen commando raids into Cuba as part of Operation Mongoose. Probably nobody had as much “hands-on” experience with the actual nuts and bolts of Mongoose as Grayston Lynch. The “snow” comment appears in Lynch’s book Decision for Disaster, published in 1998 after years of analyzing the entire Mongoose matter, and obviously refers to the “snow job” JFK was pulling by claiming he’d help overthrow Cuban communism.

The famous hearings in 1975 by the Frank Church Committee into the CIA’s deviltry as alleged by Castro’s intelligence Colonel Fabian Escalante lend further credence to Lynch’s claims:

“In August 1975, Fidel Castro gave Senator George McGovern a list of twenty-four alleged attempts to assassinate him in which Castro claimed the CIA had been involved…The Committee has found no evidence that the CIA was involved in the attempts on Castro’s life enumerated in the allegations that Castro gave to Senator McGovern.”

In brief, the U.S. was mostly trying to contain Soviet-Cuban-sponsored international terrorism. And on very sound grounds: every terror group from the Weathermen to Puerto Rico’s Macheteros, from Argentina’s Montoneros to Colombia’s FARC, from the Black Panthers to the PLO to the IRA received training and funding from Castro.

Granted, while most were not immediately defeated, they were certainly contained. Then for three decades the Soviet Union was forced to pump the equivalent of almost ten Marshall Plans into Cuba. This cannot have helped the Soviet Union’s precarious solvency or lengthened her life span.

Second item on the anti-embargo teleprompter: “But the Cold War’s over, for heaven’s sake! Why then continue this relic of that era?”

Because international terrorism still rages, with slightly different sponsors, and Castro’s Cuba is still prominent among them. A DEA report attributes half of the world’s cocaine supply to Colombia’s FARC (Fuerzas Armadas Revolucionarias de Colombia), the largest, oldest, and most murderous terrorist group in our hemisphere, whose murder toll dwarfs that of Al Qaeda and ISIS combined and includes some murdered U.S. citizens. This same FARC thanks Fidel Castro for its immense fame and fortune. “Thanks to Fidel Castro,” boasted late FARC commander Tiro-Fijo in a 2002 interview, “we are now a powerful army, not a hit-and-run band.”

But let’s fast-forward a bit even from there: Just last month Cuba (practically) got caught red-handed supplying Chinese-made arms to the Western Hemisphere’s oldest, biggest, and most murderous terror group, Colombia’s FARC. The FARC’s terror-death toll exceeds 200,000 and includes more U.S. citizens than have been murdered by ISIS.

So maybe it was a mere coincidence that, the very week Obama planned to remove Cuba from the list of terror sponsors, the mainstream media blacked out any mention of this blatant terror-sponsorship by Cuba in our own backyard.

Back in February, you see, Colombian authorities found 99 missile heads, 100 tons of gunpowder, 2.6 million detonators, and over 3,000 artillery shells hidden under rice sacks in a ship bound from Red China to Cuba that docked in the port of Cartagena, Colombia.

Most Cuba-watchers immediately guessed what was up. And just last month Colombian reporters (actually worthy of the name, unlike so many of ours) exposed the scheme. In brief:

*The arms were from a Chinese manufacturer named Norinco, and the recipient was a Cuban company named Tecnoimport.

*But the ship stopped in the Colombian ports of Cartagena and Barranquilla (where the FARC is based, remember).

*Colombia’s crackerjack newspaper El Espectador also reports that many Norinco-manufactured arms have been captured from FARC guerrillas over the past ten years. This proliferation of Cuba-smuggled Chinese arms to the terrorist FARC got so bad that in 2007-08 the Colombian authorities even sent a diplomatic protest note to the Chinese.

This awkward information at this awkward time, needless to say, might have hampered Obama’s plan to cleanse Castro of any taint of terror-sponsorship—assuming, that is, that many people would have switched off the Kardashians to learn of it. Hence you’re only reading about it here at HNN.

Furthermore, last summer Cuba was caught trying to smuggle military contraband through the Panama Canal to North Korea, in what the UN Security Council itself denounced as the worst violation to date of the arms embargo against North Korea, an embargo the United Nations itself imposed in 2006.

Third item on the anti-embargo teleprompter: “But the embargo mainly punishes the Cuban people, and gives Castro an excuse for his economic failures and human rights violations.”

Well, why not ask the Cuban people themselves how they feel about it? Granted, polls are difficult to conduct in a Stalinist nation, but every atom of observable evidence proves that the Cuban people actually want the embargo tightened.

In 2007, for instance, Spanish pollsters conducted a clandestine poll in Cuba and found that fewer than a third of Cubans blame the U.S. “blockade” for their economic plight. In addition, Cuban dissidents almost en masse condemn Obama’s loop-holing of the sanctions against the KGB-trained Stalinists who oppress them. “And now, the U.S. — our ally,” lamented Cuban dissident Guillermo Fariñas last month, “turns its back on us and prefers to sit with our killers.”

Cubans themselves have seen and felt it during the past few years: record foreign investment and record tourism to Cuba = enrichment of the Cuban regime and increased repression. The “libertarian” pipe-dream was blown to smithereens years ago. Alas, these dogmatists never bothered to poke their noses out of their books on economic theory and look at the real world.

Fourth item on the anti-embargo teleprompter: “But we trade with China, for crying out loud! So why not with Cuba?”

China’s (admittedly despicable) regime allows a genuine private sector, pays its bills, and has lots of goods Americans want—even need. An American can do business with a Chinese businessman not directly affiliated with the Chinese government. Cuba’s constitution, by contrast, outlaws private property. Every business transaction and tourist expenditure in Cuba enriches the communist regime. As mentioned, the proof and verdict on this item have been in for years, for anyone who bothers to look.

Fifth item on the anti-embargo teleprompter: “But why not try something new? At least President Obama is attempting a new policy.”

In fact, every U.S. President since 1960–especially the Republicans–attempted an “opening” to Cuba. But most realized that no advantages whatsoever would accrue to U.S. interests or to those of the Cuban people. Ronald “Evil Empire” Reagan probably went furthest in this regard, sending Alexander Haig to meet personally in Mexico City with Cuba’s “Vice President” Carlos Rafael Rodríguez to feel him out. Then he sent diplomatic troubleshooter General Vernon Walters to Havana for a meeting with the Maximum Leader himself.

Castro, as usual, turned on the charm, but Walters returned telling President Reagan that it would be Castro’s way or no way.

With our current President this Castroite attitude proved no impediment whatsoever, as some of Castro’s terror affiliates have already grasped; they are gloating, snickering, and rubbing their hands. Take Hezbollah leader Ammar Moussawi, for instance: “The firmness of Cuba’s positions and the steadfastness and patience of the Cuban people has pushed the hand of US administration … the achievements of Cuba, which was firm on its principles, is a lesson for all people of the world who are suffering from American hegemony.”

Now that’s comforting.

Humberto Fontova holds an M.A. in Latin American Studies from Tulane University and is the author of five books, including Fidel: Hollywood’s Favorite Tyrant, Exposing the Real Che Guevara, and The Longest Romance: The U.S. Media and Fidel Castro. “The terms scoundrel and traitor should precede every mention of Humberto Fontova!” declares the Castro regime’s official newspaper Cubadebate. For more info visit www.hfontova.com.

Reconstructing the American Tradition of Domestic Terrorism


African American men, women, and children outside of church, 1899. Compiled by W.E.B. Du Bois (Photo: Library of Congress)

Yesterday’s horrific murder of nine people worshipping at Charleston’s Emanuel African Methodist Episcopal Church replayed a central theme in American history. It is the question, fought for centuries with both words and weapons: to whom does this country belong?

The alleged gunman, a twenty-one-year-old white man named Dylann Roof, killed six women and three men, including the pastor, Clementa Pinckney, who was also a South Carolina state senator. A witness to the shooting reported that the killer said: “I have to do it. You rape our women and you’re taking over our country. And you have to go.”

That a white terrorist murdered an African American politician and African American bystanders in a black church, using language straight out of Reconstruction, is not an accident. It reflects the vital intersection of American politics, race, and religion since 1866.

In the wake of the Civil War, white southern Democrats initially refused to face the reality that they would have to share any sort of economic, political, or social power with their former slaves. With the encouragement of President Andrew Johnson, who had taken over from the slain President Lincoln during Congress’s long summer recess, white legislatures in the South ratified the Thirteenth Amendment abolishing slavery, but then promptly set about recreating the conditions of servitude. In most states, black people could not congregate, had to sign year-long work contracts, and could be arrested on charges of “vagrancy,” fined, and then bound to whoever paid their fine. Nowhere could a black person testify in court against a white person, so nowhere could a black American claim the protection of the law against theft, rape, or murder.

When Congress reconvened in December 1865, congressmen refused to return their black wartime allies to quasi-slavery under the very men who had spent four years trying to destroy the Union. They put forward the Fourteenth Amendment, giving black men a civic identity that carried legal rights, as a condition for the readmission of the southern states to the Union. When southern whites retorted that they would rather remain under military rule than submit to black equality, northern congressmen passed the Military Reconstruction Act of 1867, which called for new state constitutional conventions to write constitutions providing for black civic rights before the southern states could be readmitted to the Union. Crucially, the Military Reconstruction Act permitted African American men to vote.

White southern Democrats recoiled at the idea of sharing political rights with black men. But African Americans and white southern Republicans, who had supported the Union during the war, recognized the power of their position. Republicans across the South began to organize black voters. One of their most common venues for political organization was among the very powerful black churches, especially the African Methodist Episcopal Church, and many of the early leading black politicians were clergymen.

At first, white Democrats stood against the political awakening of southern African Americans by simply refusing to enroll voters. This prompted Congress to put the military in charge of voter registration. When both white and black Republicans registered to vote and elected moderate constitutional conventions, white Democrats organized a new force to stop their political opponents from taking over their states: the Ku Klux Klan. Before the 1868 elections, members of the Ku Klux Klan murdered at least a thousand African Americans and their white allies. In South Carolina, they killed African American clergyman and state legislator B. F. Randolph at a train depot in broad daylight.

Congress stood against Klan terrorism with an 1871 law making their political intimidation a federal offense, a distinction that enabled President Grant to stop the depredations of the Ku Klux Klan by imposing martial law in parts of the South and by having federal courts, rather than local courts, try offenders. For the next twenty years, white southerners controlled black political voices by finding ways either to work with black voters or to silence them. This was imperative, they insisted, for black voters were only interested in social welfare legislation that would cost tax dollars and thus “corrupt” the American government.

In 1889, the threat of a new Republican administration to mount a federal defense of black voting brought a new construction to the idea of the corruption of government. A new generation of white Democrats worried far less about political than about social issues. They insisted that black men must not vote because if they voted, they would take local political offices. This would give them patronage power, for in the nineteenth century, local positions depended on the goodwill of local politicians. Black men would, for example, become school principals. There, they would use their power to hire teachers to force young innocent white girls to have sex with them in exchange for jobs. This political exchange very quickly turned to the idea that black political power meant widespread rape. By the early twentieth century, lynching black men was almost a civic duty for white citizens: only by purging the government of black voices could the nation be made safe.

When Roof said: “I have to do it. You rape our women and you’re taking over our country. And you have to go,” he was echoing the fear of black political power laid down in the aftermath of the Civil War, when white American men had to face the reality that this nation is, in fact, made up of far more women and people of color than it is of white men. That fact inspired terror – and terrorism – among white men in the late nineteenth century. It did so again after 1954, when Brown v. Board warned white Americans that they would again have to share their country with African Americans. Then, as in the late nineteenth century, white Americans turned to terrorism against black political voices as, for example, when four Ku Klux Klan members bombed the 16th Street Baptist Church in Birmingham, Alabama, and murdered four little girls.

Yesterday, it seems, our history echoed again.

About the Author

Heather Cox Richardson

Historian. Author. Professor. Budding Curmudgeon. Heather Cox Richardson studies the contrast between image and reality in America, especially in politics.

[Re-posted with permission from Who Makes Cents?]

Today’s guest discusses the history of aviation and how it provides a lens for interpreting the history of capitalism and U.S. foreign relations across the twentieth century. Among other topics, Jenifer Van Vleck tells us how the airline industry helped solve various political and logistical challenges for the U.S. government during World War II and how the airlines relied on the government and vice versa.

Jenifer Van Vleck is Assistant Professor of History and American Studies at Yale University. She is author of Empire of the Air: Aviation and the American Ascendancy (Harvard University Press, 2013).

The Hypocritical and Shameful History of the Democratic Party Before the Civil War 

HNN June 15, 2015

Hoosier William Kennedy was apoplectic in July 1854. “We may expect no better of such as Douglas and Pettit and all the Northern Doughfaces that Followed in their wake,” Kennedy wrote furiously to his congressman. “They have made themselves far beneath Judas he got thirty pieces of silver for betraying his lord and master but they have betrayed him in his ministry.” Clearly, Kennedy felt deceived. But what could have made him so angry? The answer is plain: His Congressional representatives disregarded the desires of their constituents when they voted in favor of the Kansas-Nebraska Act. Many Northern Democrats were stunned when their congressmen supported a bill that violated Northern free soil sentiment by permitting the expansion of slavery into the western territories.

Kennedy understood in the summer of 1854 what most Northern Democrats would not realize until the winter of 1857 – that a significant portion of Northern Democratic office-holders held anti-democratic principles; that they favored minority rule over the majority will, and had no qualms about ignoring their own constituents in order to implement such policies and further their own careers. These few Democrats, unfortunately, held the balance of power both within the party and in the federal government, casting the crucial Northern votes for pro-slavery candidates and legislation, causing, in the end, the fragmentation of their party and the near destruction of the Union. Kennedy and Northern Democrats would find the 1850s a troubling time as their Congressmen seemed to be serving Southern slaveowners rather than Northern free men. Southern minority domination was what they sought, but civil war was what they wrought.1

Democracy, and its kin concept of egalitarianism, was at the heart of the political and social rhetoric of the antebellum Democratic Party. In fact, the party was commonly known as simply “the Democracy,” primarily by its adherents. Democratic Party icons Thomas Jefferson and Andrew Jackson employed this rhetoric with astonishing success, claiming that the party was the sole defender of individual liberties and the laboring masses against the evils of special privilege and concentrated wealth, while they, themselves, were wealthy, powerful, and privileged. The rhetoric masked a deeper, darker agenda that was, ironically, largely contradictory to the party’s professed principles. From its inception under Jefferson in the 1790s, the Democratic Party was controlled by Southern slave-owners, and aggressively pursued a pro-Southern, pro-slavery program, often at the expense of its own Northern wing. Nevertheless, the rhetoric was potent enough to hold the loyalty of a generation of Northerners, even as many of them recognized the uneven nature of the coalition. The banner may have read freedom, equality, and democracy, but the reality was an organization dedicated to slavery, concentrated wealth in the form of land and slaves, and anti-democratic, minority rule.

Southern power and slavery expansion were the fundamental principles of the antebellum Democracy. The party structure itself was designed to protect and further these objectives and was a glaring example of minority, anti-democratic rule. From the beginning, Southerners used parliamentary procedure to preserve control of the party apparatus, despite the North’s fast-increasing population. In the 1810s and 1820s, they employed the secret caucus, then, after widespread backlash against that practice, the “two-thirds rule,” which required the consent of two-thirds of convention delegates to achieve passage of resolutions, platforms, and nominations. The two-thirds rule successfully prevented any Northerner who did not endorse slavery from gaining the presidential nomination and kept any anti-slavery notions from sneaking into resolutions. (Indeed, it was enough to thwart former President Martin Van Buren’s bid for the nomination in 1844, despite the fact that a majority of the convention supported him. The Southern-controlled meeting instead turned to the slave-owning expansionist James Polk of Tennessee.) While a two-thirds rule might seem to ensure the will of the majority, it was, in practice, a form of minority domination. Southerners, though a numeric minority, could, with the aid of a few willing Northerners, dictate policy and candidates.

Surprisingly, by the 1850s Democrats no longer bothered to hide their anti-democratic values, especially Northern Democrats who were well aware that their actions violated the will of their constituents. Their votes on pro-slavery legislation were clear enough, but in the halls of Congress, in partisan presses, and in personal correspondence, they bragged about their disregard for the populace. Furthermore, they wove warped arguments about the dangers of majority rule and the necessity for unfettered elected officials. When former Democratic Senator Daniel Dickinson of New York was asked in 1853 why he had repeatedly turned a deaf ear to both his constituents and the New York state legislature, he struck a cavalier attitude: “I should best discharge my duty to the constitution and the Union by disregarding such instructions altogether; and although they were often afterwards repeated, and popular indignities threatened, I disregarded them accordingly.”2

In both word and deed, Democrats advocated for anti-democratic minority rule. The most potent example is the Lecompton Constitution of Kansas. Created by a small minority of pro-slavery militants in the summer of 1857, the Lecompton Constitution was designed to force slavery on the free-state majority of that territory. In the fall of that year, the document was sent to Congress for approval. If Congress accepted it, majority rule would be discarded and minority, pro-slavery rule would be enthroned against the will of the people. By employing parliamentary tricks, presidential influence, and outright bribery, Democrats were able to push the bill through Congress (though the people of Kansas rejected it regardless).

In addition to their votes, Democrats gave voice to a virulent strain of anti-democratic, minority rule theory that has heretofore been ignored by scholars. Though Southern Democrats had been vocal about their fear of the majority for generations, such words were shocking from the mouths of Northern Democrats. The first, and most prominent argument in favor of minority rule was that the crisis in Kansas had created a national calamity that needed to be brought to a swift end. A conclusion to the crisis, Northern Democrats asserted, could only be achieved through a speedy acceptance of the flawed Lecompton Constitution, regardless of the will of the territorial majority.

A second argument reasoned that it made no difference whatsoever that Lecompton violated the will of the majority, only that the process of its creation appeared to be legal. Others fashioned an alternative view of history in order to rationalize forcing Lecompton onto Kansas, emphasizing the dangers of the mob and claiming that the federal government had been designed by the Founding Fathers to prevent majority rule.

Still others justified their actions in the service of the Slave Power by claiming that Congressmen, once elected, were free to follow their own judgment, regardless of the wishes of their constituents. And finally, a fifth rationale maintained that the public could not always be trusted to vote on legislation and constitutions, and therefore the sentiments of majorities – either in Kansas or the nation as a whole – were sometimes irrelevant. These themes, though distinct, often operated simultaneously in the reasoning of Northern Democrats as they struggled to defend their service to the South.

From President Buchanan’s message on the first day of the first session of the Thirty-Fifth Congress, to closing remarks and final votes on Lecompton in April 1858, determined Democrats labored to convince their colleagues and the nation that pro-slavery, minority rule in Kansas was both desirable and necessary. The first argument, that of expediency, was the most common. Following Buchanan’s lead, Democrats asserted that “Bleeding Kansas” and the crisis over slavery in the territories could only be brought to a conclusion through a speedy, unceremonious acceptance of Lecompton. It mattered not, they argued, that Lecompton was unrepresentative, fraudulent, and enormously unpopular. Rather, it was the only bill before Congress that would bring the territory into the Union immediately; Lecompton and Kansas were just waiting for admission, and all Congress had to do was vote “aye.”

Like the plea for expediency, the second rationale – the veneer of legality (provided by administration recognition) was enough to accept Lecompton, despite its many flaws – appeared in the arguments of several Democrats. President Buchanan’s message to Congress on December 8, 1857 provides an example of this line of thought. Downplaying election frauds in Kansas, Buchanan declared that since the elections that produced the constitutional convention in Lecompton appeared legal, the results must be binding, regardless of the will of the majority. The free-state majority that boycotted the elections, he reasoned, had been given every opportunity to exercise their voting rights and chose not to do so, thus forfeiting its right to oppose the outcome. “A large portion of the citizens of Kansas,” he explained, “did not think proper to register their names and to vote at the election for delegates; but an opportunity to do this having been fairly afforded, their refusal to avail themselves of their right could in no manner affect the legality of the convention.” Or, as Senator Graham Fitch of Indiana later stated, “That many, and perhaps a majority of the citizens of Kansas did not vote either at the election of representatives to the Territorial Legislature, or delegates to the convention, may be true. Where is your remedy? You cannot compel men to vote. They can only be permitted and invited to do so.” This rationale stunned Northern voters who saw clearly that the Kansas free-state majority had boycotted the elections because of widespread electoral fraud and violence.3

The third argument against majority will – that majorities are dangerous – can be seen in Senator Fitch’s comments in late December. The Hoosier senator argued that citizens should not have control over their own constitutions, and that popular approval of constitutions was undesirable. “The recognition of popular sovereignty by the repeal of the Missouri line,” he claimed, “consisted in the fact that it placed the question of slavery where all others previously were. It did not provide, nor did it contemplate, nor did its supporters imagine, nor did its author intimate, that it contemplated the submission of every bank proposition, every internal improvement project, every school system, every election qualification in a new constitution, to the people, before the people by and for whom it was formed should be admitted to the Union.” Fitch then launched an attack on majority rule in general. “Our Government is one of checks and balances; and some of its checks apply even to the people themselves. Among the objects of our government, one is to protect the legal rights of the minority against an illegal assumption or a denial of those rights by a majority . . . If a majority resolve itself into a mob, and will neither vote nor observe law or order, the minority who are law-abiding, who form and obey government, cannot be deprived of the benefits and protection of that government by such majority. Is mobocracy to be substituted for democracy?” By equating majority rule with “mobocracy,” Fitch dismissed any opposition to Lecompton as catering to the ignorant masses and violating the intent of the Founding Fathers.4

In addition to Fitch’s surprising attack on majority rule, still another argument was raised in defense of the Slave Power. Senator Jesse Bright, also from Indiana, took the floor in March 1858. His lecture on republican government presents a fourth theme in the fight for Lecompton and minority rule. He argued that congressmen, once elected, were no longer bound to follow the wishes of their constituents. Their election, he maintained, constituted a moral and political blank check, regardless of the manner of election. “Nothing . . . can be clearer to my mind than the proposition that the act of delegates legally elected, and acting within the scope of the powers conferred upon them, is the act of the people themselves. According to the genius and theory of American constitutions, it is entirely immaterial by what majority such delegates are elected, or what number of voters appeared at the polls.” It did not matter, reasoned Bright, if the election was fraudulent or unrepresentative, only that it occurred; the will of the majority was irrelevant.5

Bright’s lecture led him to a fifth argument in favor of ignoring the will of the majority – that the American people could not be trusted to make decisions. The practice of submitting constitutions to a popular vote, or subjecting legislation to the majority will, he asserted, was destructive to American government. “So strong . . . is my conviction of the viciousness of the principle of submitting to a direct vote of the people the propriety of the enactment or rejection of laws, that for one I am prepared to extend the same objection to the submission of entire constitutions to the same tribunal.” The people, either the majority in Kansas or the nation as a whole, should have no voice in government, except at elections, and then only as long as the will of the people did not run counter to the interests of the Slave Power. Though Bright explained it best, many other Democrats used this rationale to defend their own actions in the service of slavery. These five arguments in favor of minority rule rationalized their disregard for their constituents and their support for a highly unpopular and blatantly undemocratic policy in Kansas.6

It is clear, then, that the antebellum Democratic Party was democratic in name only. By practicing minority rule within the party, trying to force slavery on an unwilling populace, denying people the right to vote, and arguing that majority rule was dangerous and disagreeable, Democrats violated their own professed principles.

1 James Shields to Charles Lanphier, Oct 25, 1854, Charles Lanphier Papers, Abraham Lincoln Presidential Library; William Kennedy to John G. Davis, July 1, 1854, John G. Davis Papers, Indiana Historical Society.

2 Daniel Dickinson to Henry E. Orr, Sept 13, 1853, in Daniel S. Dickinson, Speeches, Correspondence, Etc., of the Late Daniel S. Dickinson, of New York, ed. John R. Dickinson (New York: G.P. Putnam & Son, 1867), 476-481.

3 CG, 35C-1S, 4-5, 138; Appendix to the CG, 35C-1S, 1-5.

4 CG, 35C-1S, 137-138.

5 Appendix to the CG, 35C-1S, 163-166.

6 Appendix to the CG, 35C-1S, 163-166.

Michael Todd Landis, an Assistant Professor of History at Tarleton State University, is the author of Northern Men with Southern Loyalties: The Democratic Party and the Sectional Crisis.

 

Theodore Roosevelt: An Old West Sheriff in the White House 

HNN June 14, 2015

President Obama’s recent announcement that he will sign an executive order to prevent the nation’s police from using combat equipment that “militarizes” their function grates on the ear of law-and-order conservatives, who believe that maintaining an orderly society means that our elected leaders must sometimes take extreme measures to achieve that end. Students of history, they remember that George Washington used an overwhelming force of 13,000 militiamen to smash the so-called “Whiskey Rebellion” (a ragtag uprising of backwoods distillers who refused to pay the nation’s new tax on spirits) that flared up in western Pennsylvania in 1791. To hurl this many troops against unorganized malcontents who had burned the government tax collector’s home was massive overkill (500 trained soldiers could have easily put down the Lilliputian revolt), but not when measured by the larger goals the nation’s first president wanted to achieve in decisive, unequivocal fashion.

Washington’s bold action was critically important to the development of the country, establishing the power of the federal government during the nation’s fragile infancy and creating a beneficial precedent that violent disruptions to the civil order would not be tolerated in the new democratic republic. The Obama of his generation, Thomas Jefferson, disapproved of President Washington’s police action, viewing it as a heavy-handed over-reaction to the reasonable complaints of those adversely affected by Alexander Hamilton’s new tax on spirits (Jefferson opposed the tax). Just as Obama sympathized with the criminal element that disturbed the peace of Ferguson, Missouri, and Baltimore, Maryland, by caving in to complaints that “militarized” police had provoked violence in the streets, Jefferson sided with lawbreakers who claimed laws enacted by the nation’s duly elected government were tyrannical edicts that justified violent civil unrest.

In striking contrast to Obama and Jefferson stands Theodore Roosevelt, who was arguably the greatest law-and-order president in American history. In recent years, he has been vilified by many right wing pundits as a statist “progressive” (code for bleeding-heart “liberal”) hell-bent on achieving “social justice” (a phrase he popularized) for the less well-off in the population. In truth, TR was a no-nonsense conservative cut from the mold of Washington—a hardheaded realist who had no naiveté about the dangers posed to society by the base passions of mankind. Like Washington, he looked with discomfort on the barbaric “tar and feather” tactics Sam Adams used to trigger the American Revolution and was sickened by the ferocious bloodletting perpetrated by the French Revolutionists. Fully embracing the Social Darwinism that was so popular during his own time, he saw society as a fierce “survival of the fittest” competition that would devolve into destructive anarchy if the restraints of civilization were removed.

From his denunciation of the Governor of Illinois, John Altgeld, for pardoning the anarchist bombers blamed for the infamous Haymarket riot of 1886, to his enthusiastic support for President Grover Cleveland’s use of the U.S. Army to put down the Pullman Strike in 1894, TR consistently supported aggressive means to stamp out and prevent civil unrest. As President of the United States, he believed it was his duty to use whatever means necessary to maintain societal order. There can be no doubt that he would have used the military to quell domestic disturbances during his presidential administration if the need had arisen, as plainly shown by his order to General Schofield in 1902 to use the U.S. Army to end the Anthracite Coal Strike if a peaceful settlement could not be reached in the labor dispute.

Not surprisingly, “police” was one of TR’s favorite words. He used it most famously in 1904 when he announced that the United States would henceforth become the “policeman” of the Western hemisphere, that it would “spank” disorderly Latin American nations that “misbehaved.” The news of this extraordinary “Roosevelt Corollary” to the Monroe Doctrine produced howls of outrage among the nation’s liberals that the paternalistic decree was insensitive to the feelings of the people who lived south of the border. TR brushed aside the criticism. He feared that if the United States did not exercise hegemonic control over Latin America that European powers (especially Germany) would fill the power void and begin to carve colonies out of South America just as they had carved up Africa during the previous generation.

The “Roosevelt Corollary” proved to be one of TR’s greatest mistakes during his presidency, giving birth to a spirit of hostility in Latin America toward the United States that lingers to this day (Franklin D. Roosevelt was wise to repudiate his predecessor’s corollary when he announced his mild “Good Neighbor” policy in the 1930s). Misguided and abrasive, TR’s decree is nevertheless a useful lens that lets us view the real man. He believed that the maintenance and spread of civilization required that the great nations of the world exercise hegemonic control over their respective “spheres of influence”, “policing” weaker nations that fell within their region of power. Thus his strident advocacy of the Monroe Doctrine and his implicit belief that Britain, Russia, Germany and Japan had similar, albeit unstated, doctrines that gave them license to act as regional hegemons.

The enthusiasm TR showed during his presidency for “policing” those who “misbehaved” was evident early in his career. During the mid-1880s when he was not yet 30 years old, he appointed himself “Deputy Sheriff” of the territory around his cattle ranch in the Dakota Badlands and quickly showed that it was much more than a paper title, tracking down and bringing to justice horse thieves in a dramatic incident that he made sure the nation’s newspapers noticed. Possessing a genius for self-promotion rivaled in our own day only by Donald Trump, he often used flamboyant “police” actions like this to shine the spotlight on himself so that the American people would see him as a heroic opponent of criminality in all its forms.

TR saw himself as a Sheriff of the Old West—a throwback to the rough-and-ready lawmen who had administered frontier justice with a cool head and a loaded gun before civilization spread itself over the continent. Once he reached the White House, he went out of his way to give Bat Masterson (the Sheriff of Dodge City), Pat Garrett (the lawman who killed Billy the Kid) and Seth Bullock (the Sheriff who cleaned up Deadwood) government jobs, declaring that they “correspond to those Vikings, like Hastings and Rollo, who finally served the cause of civilization.” He understood that his Old West heroes had often violated the letter of the law in order to keep the peace, but he was anything but a legalistic Pharisee obsessed with narrow definitions. He forgave their transgressions just as he forgave his own in the same regard throughout his political career because they were, he believed, just like him—righteous men who could be trusted to bend the rules put in place to restrain lesser men.

During TR’s time as Police Commissioner of New York City (1895-1897) he acted in the spirit of these grim lawmen of the Old West, cracking down on crime and vice in unprecedented fashion. He started with his own police force, taking to the city’s streets after midnight to catch police officers sleeping on the job. Next, he enforced the so-called “Raines Law,” which prohibited saloons from selling alcohol on Sundays. The extraordinary action infuriated the city’s large alcohol consuming population, especially German-Americans linked to the brewing industry and “Tammany Hall” Democrats, who used their control of the police to operate an extortion racket that extracted financial kickbacks from saloon owners, who were allowed to open for business on Sunday if they paid off the local Tammany machine boss.

Enforcing laws that others ignored was one of the principal drivers of TR’s career, helping him make newspaper headlines and bolster his image as a corruption fighter. He was in many respects the Eliot Ness of his day—incorruptible and indefatigable in his determination to take down the bad guys. As a U.S. Civil Service Commissioner between 1889 and 1893 he even defied his boss, President Benjamin Harrison, by insisting that the Pendleton Act (which Congress had enacted to curtail the “spoils system”) must be enforced. When Harrison refused to fully enforce the law (he needed the “spoils system” fully operational to win re-election), TR took his case directly to the American people, engaging in a nasty public feud with the chief “spoils-man” of the Harrison administration, Postmaster General John Wanamaker.

Of course, the best example of TR’s enforcing moribund laws came in 1902 when he directed his Attorney General to bring suit against J.P. Morgan’s Northern Securities railroad combination. Up until then the Sherman Anti-Trust Act of 1890 had been ignored by both Republican (Benjamin Harrison and William McKinley) and Democratic presidential administrations (Grover Cleveland). Blowing the dust off of this long neglected statute, he used it to catapult himself into the nation’s consciousness as a “Trust Buster” engaged in a bruising struggle with sinister “malefactors of great wealth.” As he told his friend Henry Cabot Lodge: “I am a great believer in practical politics, but when my duty is to enforce a law, that law is surely going to be enforced, without fear or favor.”

As much as TR relished enforcing the law, there was one glaring instance when vigorous “policing” was needed in which he sat on his hands and did nothing. This occurred during his presidency when he refused to do anything at all to enforce the 14th and 15th Amendments to the Constitution, which promised all Americans—including blacks in the South—equal protection under the law and the right to vote. On its face his failure in this regard makes him seem hypocritical and racist (he had sworn an oath to uphold the Constitution), but in his defense it should be noted that every president in the century that passed between the end of the Civil War and the Civil Rights movement of the 1960s was guilty of the same dereliction of duty.

For TR, enforcing the law was never an end in itself, but rather the means to a larger end, namely, the orderly functioning of society. In his conservative value system, order came before justice as a priority for statesmen to pursue because he understood that without the former, the latter was not possible. He wanted very much to heal the racial divisions of his time, but knew this was an impossible task given the ingrained attitudes of his generation regarding race. He ignored the 14th and 15th Amendments because he felt that maintaining stability in the segregated South and cementing the region back into the nation as a whole after the destructive whirlwind of Civil War and Reconstruction was more important than beginning what in his words was a pointless “Peter the Hermit crusade” against an intractable problem that he could never solve.

TR’s acceptance of the status quo regarding the nation’s terrible racial problems was a reasonable approach when seen in the context of the century of segregation that followed the Civil War, but it undoubtedly led to one of his greatest mistakes as president—his infamous decision in 1906 to discharge “without honor” a battalion of black soldiers in the U.S. Army accused of “shooting up” the town of Brownsville, Texas. The reason for his unprecedented and unjust action (the accused did not receive a public trial) remains mysterious, but probably was heavily influenced by a horrific race riot that occurred around the same time in Atlanta, in which a white mob killed a dozen blacks to avenge the alleged rape of a white woman. In arbitrarily dismissing the black soldiers, TR appears to have once again been more concerned with not provoking white Southerners than with the pursuit of justice (after the Brownsville case was reopened in 1973, the U.S. government officially reversed TR’s decision and President Nixon signed an order restoring the pensions of the dismissed men).

In strenuously enforcing the law when it was in his power to do so and looking the other way and allowing it to be broken with impunity when he felt a policy of inaction served the greater good, Theodore Roosevelt always acted in what he believed were the best interests of the United States as a whole. Given his stellar law enforcement credentials, we can be sure that were he president today he would not waste time wringing his hands about what equipment the police used to deal with riotous miscreants. If anything, he would likely choose to use an overwhelming show of force just as Washington did to put down the “Whiskey Rebellion” in order to send a loud message that violent unrest would not be tolerated on his watch.

As much as TR would favor using a firm hand, he would not offer knee-jerk approval of the police. Hardheaded realist that he was, he understood that all men—even those entrusted with enforcing the law—could succumb to criminality (his record as Police Commissioner of New York City, when he demonstrated zero tolerance for misconduct within the force he led, proves this conclusively). This said, his sympathy would naturally gravitate toward the police and, as long as their conduct was lawful and professional, he would vigorously support them.

After all, TR was one of their fraternity—a Wyatt Earp in spirit who saw the world as an unruly and dangerous Old West town, a Tombstone that needed to be cleaned up and kept peaceful under the gaze of a steely-eyed lawman like himself who was willing to draw his gun if needed to maintain civilization. Fittingly, he carried a concealed revolver on his person during his presidency, making him the last Commander-in-Chief to arm himself with a firearm while in the White House.

Daniel Ruddy is the author of “Theodore the Great: Conservative Crusader,” which defends TR’s historical reputation against a flurry of attacks on his character and policies over the last decade. The book will be published by Regnery in October, 2015.


New Republic June 8, 2015

American cinema has never been particularly kind to black folks. As blacks moved north into cities with burgeoning movie palaces and industrial jobs and laws designed to keep them impoverished and on the margins, filmic representation of black life—limited for most of the time between World War I and Vietnam to “Toms, Coons, Mulattoes, Mammies and Bucks,” as historian Donald Bogle put it in 1973—was the lens through which American society viewed blacks.1

Anyone who took silver screen representations of blacks at face value would come away with the notion that black folks were, by and large, stupid, cowardly, lazy and worthy of subjugation, censure and plunder. That this is just how America has largely treated its African American population both before and since is no accident. When the blustery but occasionally insightful critic Armond White said “you have a culture of criticism that simply doesn’t want Black people to have any kind of power, any kind of spiritual understanding or artistic understanding of themselves,” he wasn’t wrong.

“A Road Three Hundred Years Long: Cinema and the Great Migration,” a series at New York’s Museum of Modern Art that began last Monday and runs through June 12th, provides a much-needed corrective, a narrative of black resistance to the dominant mode of slanderous Hollywood storytelling—both now and, somewhat miraculously, in the days of Jim Crow. Curated by MoMA’s Joshua Siegel and independent curator Thomas Beard of Light Industry, the country’s leading micro-cinema for experimental film, the exhibition is pegged to the Museum’s show of Jacob Lawrence’s Great Migration paintings. “A Road Three Hundred Years Long” includes many films that catered to black audiences in the era of their flight from terror, showcasing a significant amount of pre-war black filmmaking as well as movies by contemporary black filmmakers that explore the mysterious legacy of the Great Migration.

Blacks were unable to go to segregated movie theaters in the South except under special circumstances—“midnight rambles,” for example, which were late-night screenings of popular films and black-themed movies for black patrons, who were barred during the day. But some black-owned movie theaters persisted, and some of the work shown there, both home movies and narrative features, found appreciative pre-war audiences. MoMA’s program includes many of these films, as well as WPA-style proletarian newsreels and avant-garde non-fiction films, to showcase images of black life that—despite the specter of prejudice—highlight unadorned, provincially black, middle-class normality in a way that is painfully rare. These spaces have largely escaped Hollywood visions of black American life, obsessed as they are with broad stereotype (see this summer’s Dope) or elegantly rendered historical struggle (Selma, 12 Years a Slave).

But for black experimentalists (like Kevin Jerome Everson, who directed Company Line) or great, sadly underemployed black narrative directors—like Charles Burnett, who made To Sleep with Anger, and Julie Dash, who did Daughters of the Dust—the forgotten spaces of black American life provide wonderful fodder for exploring the mysterious resilience of black middle class life.

The program’s most interesting discoveries are rooted further in the past. Oscar Micheaux and Spencer Williams were among the few African Americans to make narrative feature films in the years before the Civil Rights Movement, and they are very much at the center of “A Road Three Hundred Years Long.” Several works by Micheaux, the most prolific of the black pre-war filmmakers, are on display, including his 1920 film The Symbol of the Unconquered. A rejoinder of sorts to The Birth of a Nation, it remains one of the watershed moments in black film history.

Meanwhile, Williams’s work is the subject of the documentary Juke: Passages from the Films of Spencer Williams, a series-opening compilation film by noted found-footage documentarian Thom Andersen (Los Angeles Plays Itself). Williams is known more as the star of the popular early ’50s sitcom Amos ’n’ Andy than as a groundbreaking independent filmmaker. Financed by a Jewish Texan named Alfred Sack, Williams’ moralizing work is often compared to Tyler Perry’s, but given the era he worked in he couldn’t hope to control the means of production on his own (as Perry famously has).

Williams’ race movies made great use of magic realist flourishes: in The Blood of Jesus, undeniably his masterwork, the devil drives a flatbed truck with sinners perched on the back, and angels—given etherealness through that most basic cinematic trickery, the double exposure—hover over the bed of a woman who lies mortally wounded. Andersen’s interest, however, lies mainly in recontextualizing clips from Williams’ films to highlight the documentary-like aspects they contain. Because so few motion picture images of black middle-class life were taken in Dallas suburbs or Harlem night clubs, in cab stands and in juke joints, in sitting rooms and in neighborhood streets, Williams’ scenes have a representative quality that transcends their narratives of jazz age sin and Christian redemption, one that Andersen’s deft cutting compiles in a lucid, if decidedly non-narrative, way. (Next winter, arthouse film distributor Kino Lorber will release a box set of Micheaux and Williams movies.)

This has been a watershed year for black repertory programming in the city. Between the Film Society of Lincoln Center’s “Telling It Like It Is: Black Independents in New York 1968-1986,” the Brooklyn Academy of Music’s “Space is the Place: Afrofuturism in Film,” and “A Road Three Hundred Years Long,” movies about black life have been given significant platforms in some of New York’s most august cultural institutions. The takeaway from these wonderfully curated programs, though, is grimmer: looking through the filmographies of the artists involved, very little of their work has come with the support of our national film culture’s most enduring brands, the Hollywood studios. One quickly gets the sense that thoughtful, nuanced stories about the African American experience aren’t welcome out there.

Today African Americans are underrepresented in the movie industry’s workforce, both in front of and behind the camera. In the boardroom and in the postproduction house, blackness is scarce. This holds true for every stratum of the industry, from the dying vestiges of New York’s indiewood to the hallways of power at Los Angeles’ major studios. I imagine there may be a black person within the studio apparatus who holds the power to green light movies about the parochial anxieties, dreams and predilections of American negroes without fear of quizzical glances from his peers and superiors, but I’ve yet to meet or hear of this person. I’m not holding my breath.

This year and last year are the only consecutive years in cinematic history in which films directed by people of African descent were nominated for the Oscar for best picture. But look for African Americans when you walk through the offices of a major film production or distribution outfit—you’ll still only find them at the guard’s desk or near the janitor’s closet. That narrative has to change.

Brandon Harris

A visiting assistant professor of film at SUNY Purchase and a contributing editor of Filmmaker Magazine, Brandon Harris has published criticism and journalism in The New Yorker, Vice, The Daily Beast, and n+1. His film Redlegs is a New York Times Critic’s Pick.

June 7, 2015   HNN

Eugene Victor «Gene» Debs

The recent announcement by U.S. Senator Bernie Sanders, an avowed “democratic socialist,” that he is running for the Democratic nomination for President raises the question of whether Americans will vote for a candidate with that political orientation.

During the first two decades of the twentieth century, the idea of democratic socialism — democratic control of the economy — had substantial popularity in the United States. At the time, the Socialist Party of America was a thriving, rapidly growing political organization, much like its democratic socialist counterparts abroad — the British Labour Party, the French Socialist Party, the German Social Democratic Party, the Australian Labor Party, and numerous other rising, working class-based political entities around the world. In 1912, when the United States had a much smaller population than today, the Socialist Party had 118,000 dues-paying members and drew nearly a million votes for its candidate, Eugene V. Debs, the great labor leader, for President. (The victor that year was the Democratic candidate, Woodrow Wilson, who drew six million votes.) Furthermore, the party held 1,200 public offices in 340 cities, including 79 mayors in 24 states. Socialist administrations were elected in Minneapolis, Minnesota; Butte, Montana; Flint, Michigan; Schenectady, New York; and cities all across the country. In 1912, the Socialist Party claimed 323 English and foreign language publications with a total circulation in excess of two million.

Of course, this socialist surge didn’t last. The Democratic and Republican parties, faced with this threat to their political future, turned to supporting progressive agendas — breaking up or regulating giant corporations, curbing corporate abuses, and championing a graduated income tax — that stole the socialists’ thunder. In addition, after U.S. entry into World War I, an action opposed by the socialists, the federal and state governments moved to crush the Socialist Party — arresting and imprisoning its leaders (including Debs), purging its elected officials, and closing down its publications. Moreover, one portion of the party, excited by the Bolsheviks’ success in seizing power in Russia and establishing the Soviet Union, broke with the Socialist Party and established Communist rivals. Co-opted by the mainstream parties, repressed by government, and abandoned by would-be revolutionaries, the Socialist Party never recovered.

Even so, democratic socialism retained a lingering influence in American life. When a new wave of reform occurred during the New Deal of the 1930s, it included numerous measures advocated and popularized by the Socialist Party: Social Security; public jobs programs like the WPA; minimum wage laws; maximum hour laws; and a steep tax on the wealthy. Here and there, although rarely, socialists even secured public office, and Milwaukee voters regularly elected socialist mayors until 1948. Starting in 1928 and running through the early post-World War II era, Norman Thomas became the attractive, articulate leader of the Socialist Party, and was widely respected among many American liberals and union leaders.

What nearly eliminated the Socialist Party was a combination of New Deal measures (which drew labor and other key constituencies into the Democratic Party) and the public’s identification of Socialism with Communism. Although, in fact, the American Socialist and Communist parties were bitter rivals — the former championing democratic socialism on the British model and the latter authoritarian socialism on the Soviet model — many Americans, influenced by dire conservative warnings, confused the two. Particularly during the Cold War, this further undermined the Socialist Party.

In the early 1970s, with the party barely surviving, most democratic socialists decided it was time to reassess their strategy. They asked: Did the collapse of the Socialist Party mean that, in the United States, democratic socialism was unpopular, or did it mean that third party voting was unpopular? After all, large numbers of Americans supported democratic socialist programs, ranging from national healthcare to public education, from public transportation to taxing the rich, from preserving the environment to defending workers’ rights. What would happen if democratic socialists worked for their programs within the Democratic Party, where the typical constituencies of the world’s democratic socialist parties — unions, racial minorities, women’s rights activists, and environmentalists — were already located? Led by the party’s titular leader, Michael Harrington, whose book The Other America sparked the War on Poverty of the 1960s, they organized Democratic Socialists of America (DSA) and plunged into major social movements and into the Democratic Party.

Although, in the ensuing decades, DSA made little progress toward rebuilding a mass, high-profile democratic socialist organization, it did manage to pull thousands of union, racial justice, women’s rights, and environmental activists into its orbit. DSA also discovered a significant number of left-wing Democratic and, sometimes, independent candidates for office who welcomed its support and occasionally joined it. Bernie Sanders — an independent who was elected, in turn, mayor of Burlington, Vermont’s only Congressman, and a U.S. Senator from Vermont — is certainly one of the most successful of these politicians. Indeed, in 2012 he won re-election to the Senate with 71 percent of the vote.

But will Americans actually support a democratic socialist in the Democratic Presidential primaries? Sanders himself has conceded that the odds are heavily against him. Even so, although a Quinnipiac poll of American voters in late May of this year found him far behind the much better known and better funded Hillary Clinton, his 15 percent of the vote placed him well ahead of all other potential Democratic candidates. Also, there’s great potential for broadening his support. The latest poll on Americans’ attitudes toward “socialism,” taken in December 2011, found that 31 percent of respondents had a positive reaction to it. And what if Americans had been asked about their attitude toward “democratic socialism”?

Consequently, even if Hillary Clinton emerges as the Democratic nominee, as seems likely, a good showing by Sanders could strengthen the democratic socialist current in American life.

Dr. Lawrence Wittner (http://lawrenceswittner.com) is Professor of History emeritus at SUNY/Albany. His latest book is a satirical novel about university corporatization and rebellion, What’s Going On at UAardvark?

MICHELLE NICKERSON  Mothers of Conservatism: Women and the Postwar Right

PRINCETON UNIVERSITY PRESS, 2012
by CHRISTINE LAMBERS
New Books in History Network, March 18, 2015

Michelle Nickerson

Recently, historians have shown that the modern conservative movement is older and more complex than has often been assumed by either liberals or historians. Michelle Nickerson’s book, Mothers of Conservatism: Women and the Postwar Right (Princeton University Press, 2012) expands that literature even further, demonstrating not only the longer roots of conservative interest in family issues, such as education, but also the important role women played in shaping the early movement. Mothers of Conservatism does this by examining the role of women in the rise of grassroots conservatism during the 1950s. Nickerson explains how women in Southern California became politicized during the height of the Cold War, coming to see communist threats in numerous, mostly local, battles. These women, who were primarily homemakers, argued that they had a special political role as mothers and wives, translating their domestic identities into political activism. Nickerson traces their activism in battles over education and mental health issues among others. She further explains the ideology behind their activism and demonstrates how important these women were to shaping the coming conservative movement and in the long-term, the Republican Party.

Mothers of Conservatism draws on rich archival material as well as on oral history interviews conducted by the author. With these archival sources and interviews, Nickerson brings the activists’ stories, politics, and humanity to life. In this interview, we discuss the ideology, activism, and legacy of the women as well as Nickerson’s experience interviewing her sources.

How the Civil War Changed the Constitution

The most obvious constitutional result of the Civil War was the adoption of three landmark constitutional amendments. The 13th ended slavery forever in the United States, while the 14th made all persons born in the United States (including the former slaves) citizens of the nation and prohibited the states from denying anyone the privileges and immunities of American citizenship, due process of law, or equal protection of the law. Finally, the 15th Amendment, ratified in 1870, prohibited the states from denying the franchise to anyone based on “race, color, or previous condition of servitude.”

These amendments, however, have their roots in the war itself, and in some ways can be seen as formal acknowledgments of the way the war altered the Constitution. Other changes came about without any amendments. A review of some of them underscores how the Union that President Lincoln preserved was fundamentally different — and better — than the Union he inherited when he became president.

Slavery
The first and most obvious change involves slavery. The 13th Amendment was possible (as were the other two Civil War amendments) only because the war broke slavery’s stranglehold over politics and constitutional development. The Constitution of 1787 protected slavery at every turn. Although the framers did not use the word “slavery” in the document, everyone at the Constitutional Convention understood the ways in which the new form of government protected slavery. Indeed, the word “slavery” was not used at the request of the Connecticut delegation and some other Northerners, who feared that their constituents would not ratify the Constitution if the word was in the document — not because the delegates objected to the word itself.

It would take many pages to review all the proslavery features of the Constitution, but here are some of the most significant ones. The three-fifths clause gave the South extra members of the House of Representatives, based on the number of slaves in each state. Without these representatives, created entirely by slavery, proslavery legislation like the Missouri Compromise of 1820 and the Fugitive Slave Law of 1850 could never have been passed.

Equally important, votes in the Electoral College were based on the number of representatives in the House, and so slavery gave the South a bonus in electing the president. Without the electors created by slavery, the slaveholding Thomas Jefferson would have lost the election of 1800 to the non-slaveholding John Adams.

The “domestic insurrections clause” guaranteed that federal troops would be used to suppress slave rebellions, as they were in the Nat Turner Rebellion in 1831 and John Brown’s attempt to start a slave rebellion in 1859.

An image from Harper’s Weekly showing the House of Representatives during the passage of the 13th Amendment, Jan. 31, 1865. Credit: Library of Congress

Finally, it took two-thirds of Congress to send a constitutional amendment to the states, and three-fourths of the states to ratify it. Had the 15 slave states all remained in the Union, it would be impossible even today, in 2015, to end slavery by constitutional amendment, since in a 50-state union it takes just 13 states to block an amendment.

The political power of the slave states meant that the nation was always forced to protect slavery. Thus the South in effect controlled politics from 1788 until 1861. Slave owners held the presidency for all but 12 years between 1788 and 1850. All of the two-term presidents were slave owners. Three Northerners held the office from 1850 to 1860 — Fillmore, Pierce and Buchanan — but all were proslavery and they bent over backward to placate the South.

It took the Civil War to break slavery’s stranglehold on politics and fundamentally alter the nature of constitutional law and constitutional change.

The demise of slavery began with slaves running away and the army freeing them. But the key moment was the Emancipation Proclamation, which was the first important executive order in American history. In order to destroy slavery — and save the Union — Lincoln found new power for his office.

Secession and Nullification
Since the beginning of the nation, claims that states could nullify federal law or even secede had destabilized American politics and constitutional law. Sometimes Northerners made these claims, such as the disgruntled New Englanders who organized the Hartford Convention to oppose the War of 1812. But most claims of nullification came from the slave South. In 1798 Jefferson secretly wrote the “Kentucky Resolutions,” while his friend James Madison wrote the “Virginia Resolutions”; both asserted the right of the states to nullify federal law.

From the earliest debates over the Union, in the Second Continental Congress, until the eve of the Civil War, numerous Southern politicians publicly advocated secession if they did not get their way on slavery and other issues. In 1832-33 South Carolina asserted the right to nullify the federal tariff, and then officially (although mostly symbolically) passed an ordinance to nullify the Force Law, which authorized the president to use appropriate military or civil power to enforce federal laws. At this time Georgia also brazenly declared it did not have to abide by a federal treaty with the Cherokees. In 1850 Southerners held two secession conventions, which went nowhere. In the debates over what became the Compromise of 1850, Senator John C. Calhoun of South Carolina asserted the right of the South to block federal law.

Some Northern opponents of slavery — most notably William Lloyd Garrison — argued for Northern secession because they rightly understood that slavery dominated the American government. But Garrison had few followers, and even many of those never accepted his slogan of “No Union With Slaveholders.” In the mid-1850s the Wisconsin Supreme Court declared the Fugitive Slave Law unconstitutional, but when the Supreme Court upheld the law the Wisconsin court backed off.

In short, nullification and secession were not new ideas in 1861, when 11 states left the Union, but had been part of the warp and weft of constitutional debate since the founding. But the Civil War ended the discussion. The question of the constitutionality of nullification or secession was permanently settled by the “legal case” of Lee v. Grant, decided at Appomattox Court House in April 1865. Grant had successfully defended the Constitution and the idea of a perpetual Union. Secession lost, and the United States won. The Supreme Court would weigh in on this in Texas v. White (1869), holding that secession had never been legal and that the state governments in the Confederacy lacked any legal authority.

Money and National Power
From the beginning of the nation there had been debates over whether the United States government could issue currency. Indeed, before the Civil War there was no national currency, only “bank notes” issued by private banks or state banks. For two periods (1791-1811 and 1816-1836) the federally chartered Bank of the United States circulated bank notes that functioned as a national currency. But Andrew Jackson vetoed the bank’s recharter on the grounds that it was unconstitutional, and for the next 25 years the nation’s economy was hampered by the lack of a stable, national currency.

The war changed this, too. In order to finance the war, Secretary of the Treasury Salmon P. Chase developed a policy that led to the issuing of “greenbacks,” and suddenly the constitutional issue was settled — not in court, but by the exigency of the conflict. The Supreme Court was perplexed by this new policy and after the war the court briefly declared that issuing greenbacks was unconstitutional, but then quickly changed its mind. Since then, the dollar has emerged as the most important currency in the world. Although no longer backed by gold or silver, American currency remains “the gold standard” for international transactions.

Military Law and Civilians
The war also created a new set of rules — laws that are still with us — for when and how military tribunals or martial law can apply to civilians. For example, when the war began there were no federal laws prohibiting acts of sabotage or preventing civilians from forming armies to make war on the United States. Nor was there any national police force. Thus, President Lincoln suspended habeas corpus along the railroad route from Philadelphia to Washington and used the Army to arrest pro-Confederate terrorists, like John Merryman, who was tearing up railroads leading to Washington, D.C., and trying to organize a Confederate army in Maryland.

Again, this was a matter of necessity, not ideology: Congress was not in session, and so Lincoln acted on his own authority. Indeed, if Merryman had been successful, members of Congress would have been unable to reach Washington to meet. Congress later approved Lincoln’s actions and authorized even more-massive suspensions of habeas corpus. Thus, the constitutional rule from the Civil War is that in a dire emergency the government may act to restrain people to preserve public safety.

But what happens when the immediate and pressing emergency is over? May the military still be used to arrest and try civilians? The answer from the Civil War is an emphatic no. During the war military officials in Indiana arrested Lamdin P. Milligan for trying to organize a Confederate army in that state. There was no combat in Indiana at the time, civil society was smoothly functioning, and even Milligan’s allies were not blowing up bridges or destroying railroads as Merryman had been doing. Nevertheless, the Army tried Milligan and sentenced him to death. In 1866, in Ex parte Milligan, the Supreme Court ruled that the trial was unconstitutional. The military might arrest Milligan because of the emergency of the war (just as it had arrested Merryman), but the court ruled that if the civilian courts were open, as they were in Indiana, it was unconstitutional to try a civilian in a military court.

This has generally been the law of the land ever since. In the aftermath of 9/11 the Supreme Court upheld the rule that civilians (even terrorists in the United States) could not be tried by military tribunals, but could only be tried by civilian courts. The Justices relied on Milligan.

Racial Change and the Movement Toward Racial Equality
When the war began, federal law denied African-Americans virtually all constitutional rights. In Dred Scott v. Sandford, decided in 1857, Chief Justice Roger B. Taney ruled that blacks could never be citizens of the United States, even if they were treated as citizens in the states where they lived. This led to the oddity that blacks could vote for members of Congress and presidential electors in six states, and could hold office in those states and some others, but they were not citizens of the nation. Federal law nevertheless supported Taney’s rulings. For example, before the war blacks could not be members of state militias, serve in the national army, receive passports from the State Department, or be letter carriers for the post office.

During the war all this began to change. In 1862 Congress authorized the recruitment of blacks in the national army and in state militias. While most black soldiers were enlisted men, some served as noncommissioned officers, and a few served as officers. Martin Delany held the rank of major. Just as striking, Ely Parker, a member of the Seneca nation, served on Ulysses S. Grant’s personal staff as a lieutenant colonel and was promoted to brevet brigadier general at the very end of the war.

The war also broke down racial and ethnic/religious taboos and attitudes. Abraham Lincoln became the first president to meet with blacks, and in the case of Frederick Douglass, seek out their advice. In 1864 and 1865 Congress gave charters to street railway companies that required that there be no discrimination in seating. Congress also changed the law that limited military chaplains to ministers of the gospel, thus allowing rabbis and Roman Catholic priests to become chaplains. During the war Congress created the office of recorder of the deeds for the city of Washington. The first office holder was Simon Wolf, a Jewish immigrant, but after that, the office was held by African-Americans for the rest of the century, including Frederick Douglass, Blanche Bruce, a former senator, and Henry P. Cheatham, a former congressman. In his last public speech Lincoln called for enfranchising black veterans and other members of their race. Five years later the Constitution would reflect that goal in the 14th and 15th amendments.

Today we rightly look back at these two amendments, and the 13th, as the most important lasting constitutional legacies of the Civil War. And that they are. But it is also important that we look at how America’s understanding of the Constitution, especially as it related to racial and ethnic equality, changed during the course of the war, and not simply as a consequence of it. Put differently: The Civil War amendments changed the Constitution. But even if, somehow, they had never happened, the war itself would have altered the way Americans saw one another, and their government.

Paul Finkelman is a senior fellow in the Penn Program on Democracy, Citizenship and Constitutionalism at the University of Pennsylvania and a scholar-in-residence at the National Constitution Center.

Boston Review   June 9, 2015

G.I. Bill student with wife and children walking in front of the Old Capitol, The University of Iowa, 1948. Image: Iowa Digital Library.

One of the deep, long-term changes in American lives has been what social historians call the “standardization” of the life course. From the nineteenth into the twentieth century, more and more young Americans were able to follow a common sequence: get educated, get a job, leave parents’ home, get married, have children, and become financially secure (to be followed by empty nest, retirement, and “golden years”)—the American Dream in one widely shared package.

In recent decades, however, Americans’ life courses have become less standardized, less shared. A new study, by Jeremy Pais and D. Matthew Ray, shows how much this historical reversal is connected to economic fortunes. The less affluent, who were late to standardization in the twentieth century, are in the twenty-first increasingly leading “non-standard” lives.

Standardization

A century and more ago, many a young American’s expectations of life were overturned by the mishaps of life, such as the early death of a parent or spouse, debilitating illness or accident, and farm- or job-devastating weather or depression. Over time, life became more straightforward and predictable as death and illness retreated while affluence and security grew. Life-planning way into the future became more sensible.

Over the same period, American society developed institutions that increasingly structured the life course and made it more shared. Schooling became required of all, with specific starting, grade, and graduation ages for everyone. New laws stipulated minimum ages for when one could marry, could work, and should stop working.

Americans also increasingly chose to pursue common life courses. One important indication is the number of children. Around 1900, Americans often had either many children or no children at all, but by 2000 most American parents had converged to having two children, give or take another one. Rather than generating more diversity, Americans’ greater freedom from circumstance allowed them to follow shared norms.

Historian David Stevens reported in 1990 that the correlation between how old Americans were and when they took critical life steps strengthened from 1900 to 1970. That is, Americans increasingly took these steps at the same age.

De-Standardization

After about 1970, diversity in the sequencing of life transitions grew: children before marriage, full-time employment while still living with parents, being unmarried late in life, long delays in attaining financial security, and so on. In that same 1990 study, Stevens found that the age-transition correlation (that is, standardization) began reversing after 1970. Later studies showed the trend accelerating. Indeed, the seemingly new disarray and unpredictability of life for twenty-somethings (actually, a throwback to a century ago) gained an academic label, “emerging adulthood,” and an accompanying academic journal.

This historical reversal of standardization might be attributed to cultural shifts since the 1960s: acceptance of divorce, of cohabitation, of women breaking old restraints, and so on. The extended education that at least middle-class youth pursued also contributed. But the most likely or most important factor seems to have been the economic setbacks—or, more precisely, the unequal economic setbacks—of the last few decades.

In their new study, Pais and Ray describe what they label the “Adult [Male] Attainment Project,” another way of viewing the standard life course. The American ideal, they write, is that by about age 40 men should have attained, and sustained, these five statuses:

  • working (or studying);
  • marriage;
  • living independently with their wives;
  • settled parenthood—i.e, living with their children; and
  • owning a home.

These attainments did, indeed, generally come as a package; adult men who had one tended to have the others. Pais and Ray found that, from 1980 through 2010, almost half of American men aged 35 to 45 had the whole package, the American Dream.

However, the proportion of American men who occupied these statuses declined from 1980 to 2010, and much more so among the less affluent. In 1980, men who were in the top two-fifths of family income were four times as likely as men in the bottom two-fifths to have completed the “adult attainment project”; in 2010, they were eight times as likely. The class gap in being married and in residing with children widened especially.

The regularization and predictability of American life that had developed over a few generations and culminated in the post-war era seem to have started unraveling, at least for the less advantaged. The era of standardization may have been a passing phase; unpredictability may be the long-term norm. Perhaps, but if so, given that early death and similar misfortunes remain at historic lows, today’s non-standard patterns must be the result of other factors, largely, it appears, economic inequality.

Claude S. Fischer is Professor of Sociology at the University of California, Berkeley and author of Made in America. In his bimonthly BR column, Fischer explores controversial social and cultural issues using tools of sociology and history.