
Maryland Man May Have Found Two Lost or Forgotten Photos of Lincoln’s Funeral Procession

Michael E. Ruane

Washington Post   March 19, 2014

In the first photograph, the crowd outside the church seems to be waiting for something to come down the street. Children stand up front so they can see. Women, in the garb of the mid-1800s, shield themselves from the sun with umbrellas. White-gloved soldiers mill around. And a few people have climbed a tree for a better view.

(Mathew Brady/The National Archives)

In this second shot, some heads are bowed. Men have taken off their hats. And the blur of a large black object is disappearing along the street to the left of the frame. What the scene depicts, why it was photographed, or where, has been a mystery for decades, experts at the National Archives say. But a Maryland man has now offered the theory that the two photos are rare, long-forgotten images of Abraham Lincoln’s funeral procession in New York City.

(Mathew Brady/The National Archives)

Paul Taylor, 60, of Columbia, a retired federal government accountant, believes the scene is on Broadway, outside New York’s historic Grace Church.

The day is Tuesday, April 25, 1865, 11 days after Lincoln was shot at Ford’s Theatre in Washington.

And the crowd is waiting for, and then seems to be paying homage before, a horse-drawn hearse, whose motion makes it appear as a black blur as it passes by in the second picture.

If Taylor is right, scholars say he has identified rare photos of Lincoln’s marathon funeral rites, as well as images that show mourners honoring the slain chief executive.

Plus, it appears that the photographs were taken from an upper window of the studio of famed Civil War photographer Mathew Brady, which was across the street from the church.

“It’s a big deal,” said Richard Sloan, an expert on the Lincoln funeral ceremonies in New York. “What makes it even a bigger deal is to be able to study the people. Even though you can’t see faces that well, just studying the people tells a story.”

Sloan added, “It’s as if you’re there, and you can see the mood.”

Many people, including children, are in their Sunday best. A few look up at the camera. Flowers are in bloom. But there is no levity.

Sloan said he is convinced that the pictures show the funeral scenes: “There’s no doubt about it.”

But experts at the Archives caution that although the theory sounds good, there could be other explanations, and no way to prove it conclusively.

The digital photographs were made from some of the thousands of Brady images acquired by the federal government in the 1870s and handed down to the National Archives in the 1940s, according to Nick Natanson, an archivist in the Archives’ still-picture unit.

Next year is the 150th anniversary of Lincoln’s assassination.

The two photos in question, both captioned “scene in front of church,” apparently had gone unnoticed for decades.

“We’ve had many inquiries about many images in the Brady file,” Natanson said. “I can’t remember . . . any inquiries about these two particular images. I don’t think I ever noticed them before.”

But something about them intrigued Taylor when he saw them among the hundreds of Brady photographs posted on an Archives Flickr photo-sharing site in January.

Both were unusual four-image pictures — four shots of the same scene grouped together.

“I was just struck by the scene,” Taylor said. “That is not your normal scene in front of church. There’s just people everywhere: the streets, the sidewalks, the roof. They’re in the trees. This is not your normal Sunday.”

In the second picture, “I saw this black streak,” he said. “When I looked at it closer, I saw what it was. It was a funeral vehicle. . . . I knew it was Lincoln. It had to be. It couldn’t be anybody else.”

Natanson, of the Archives, was skeptical. “It still strikes me as odd that . . . there wouldn’t have been some mention or some hint [in the caption] of the monumental nature of the event,” he said.

There could have been other events, “maybe even other processions, maybe even other funerals” during that time period, he said. “I don’t think it’s possible to establish this without any doubt.”

But if Taylor is right, it could be an important discovery, Natanson said: “It isn’t as if there are dozens of images of the funeral procession anywhere.”

The funeral observances for Lincoln, who was assassinated by actor John Wilkes Booth on April 14, 1865, went on for more than two weeks. During that time, the president’s body was moved by train on a 13-day, 1,600-mile journey from Washington to Springfield, Ill., where he was buried May 4.

Along the way, the train stopped in over a dozen major cities, and his coffin was removed for numerous processions and elaborate tributes.

Washington historian James L. Swanson has called the funeral journey a “death pageant” that was viewed by millions of people and that helped create the image of Lincoln the martyred president.

New York was the fourth major stop on the journey, after Baltimore, Harrisburg, and Philadelphia.

The president’s coffin, with the lid unfortunately open, was placed on view in New York’s City Hall on April 24, according to Swanson’s account. Lincoln had been dead for 10 days, and his face was “not a pleasant sight,” the New York Times reported.

The next day, with the lid closed, the coffin was borne through jammed streets aboard a black hearse decorated with flags and black plumes and drawn by a team of 16 horses shrouded in black.

A half-million people lined the route, much of which was along Broadway.

“Thousands and thousands of these lookers on were too young . . . and were doubtless brought in order that in old age they might say they saw the funeral procession of Abraham Lincoln,” the Times wrote the next day.

Taylor said his investigation of the photos began Jan. 4, when he first noticed them. The captions didn’t give him much to go on. The problem was that the original glass negatives probably didn’t have captions on them, said Brady biographer Robert Wilson. And by the time the government acquired the negatives, any caption information that went with them was probably lost.

Taylor turned to the Internet for images of historic churches, to see whether he could find the one in the Brady images. He looked up historic churches in Baltimore. No luck. Then he tried historic churches in New York.

That search brought up Grace Episcopal Church, the 168-year-old Gothic edifice on Broadway at Tenth Street.

“I’m looking at it, and that was it,” he said. “I had it.”

He e-mailed his findings to the Archives on March 3.

Taylor, who said he has long been fascinated by historic photographs, said he does not think the images have ever been published before.

Bob Zeller, president of the Center for Civil War Photography, agreed, but he wrote in an e-mail: “There is always a slim chance that somebody somewhere has recognized and printed [them] in some obscure . . . publication.”

“Either way, it’s incredibly historic, (a) totally fresh piece of our American photo history,” he wrote. “Even if someone materializes, that still means 99.9 percent of us, enthusiasts and historians, have never seen it.”

How LBJ Saved the Civil Rights Act

Fifty years later, new accounts of its fraught passage reveal the era’s real hero—and it isn’t the Supreme Court.

Michael O’Donnell

The Atlantic  March 19, 2014

President Johnson confronts Senator Richard Russell, the leader of the filibuster against the civil-rights bill. (Yoichi Okamoto/National Archives)

In the winter of 1963, as the Civil Rights Act worked its way through Congress, Justice William Brennan decided to play for time. The Supreme Court had recently heard arguments in the appeal of 12 African American protesters arrested at a segregated Baltimore restaurant. The justices had caucused, and a conservative majority had voted to decide Bell v. Maryland by reiterating that the Fourteenth Amendment’s equal-protection clause did not apply to private businesses like restaurants and lunch counters—only to “state actors.” The Court had used this doctrine to limit the reach of the Fourteenth Amendment since 1883. Brennan—the Warren Court’s liberal deal maker and master strategist—knew that such a decision could destroy the civil-rights bill’s chances in Congress. After all, the bill’s key provision outlawed segregation in public accommodations. Taxing his opponents’ patience, he sought a delay in order to request the government’s views on the case. He all but winked and told the solicitor general not to hurry.

And then the conservatives on the Court lost their fifth vote. Justice Tom Clark changed his mind and circulated a draft opinion granting the appeal. In a revolutionary constitutional change, lunch counters and restaurants would suddenly be liable if they violated the equal-protection clause. But Brennan foresaw a new difficulty. By now it was June 1964, and a coalition of northern Democratic and Republican senators looked set to break a southern filibuster and pass a strong civil-rights bill. Would a favorable Supreme Court ruling actually give wavering senators an excuse to vote no? They might say there was no need for legislation because the Court had already solved the problem. So Brennan, ever nimble, engineered a tactical retreat by assembling a majority that avoided the merits of the case altogether. It was an alley-oop to the political branches. They grabbed the ball and dunked it. Ten days after the Court’s decision, Congress passed the Civil Rights Act and the president signed it into law.

In the popular imagination, the Supreme Court is the governmental hero of the civil-rights era. The period conjures images of strong white pillars, Earl Warren’s horn-rims, and the almost holy words Brown v. Board of Education. But in Bell, the Court vindicated civil rights by stepping aside. As Bruce Ackerman observes in The Civil Rights Revolution, Brennan realized that a law passed by democratically elected officials would bear greater legitimacy in the South than a Supreme Court decision. He also doubtless anticipated that the act would be challenged in court, and that he would eventually have his say. The moment demonstrated not merely cooperation among the three branches of government, but a confluence of personalities: Brennan slowing down the Court, President Johnson leaning on Congress to hurry up, and the grandstanders and speechmakers of the Senate making their deals, Everett Dirksen and Hubert Humphrey foremost among them. In this age of obstruction and delay, it is heartening to recall that when the government decides to act, it can be a mighty force.

But three equal branches rarely means three equal burdens, and the civil-rights era was no exception. Although the Court-centered narrative undervalues the two political branches, of those two branches it was the executive that provided decisive leadership in the 1960s. Just as the intragovernmental cooperation of 1964 is striking in light of today’s partisan gridlock, the presidential initiative displayed during the mid-’60s is worth considering in light of Barack Obama’s perceived hands-off approach to lawmaking. Of course, no discussion of civil-rights leadership is complete without including Martin Luther King Jr., who provided moral and spiritual focus, infusing the movement with resolution and dignity. But the times also called for a leader who could subdue the vast political and administrative forces arrayed against change—for someone with the strategic and tactical instincts to overcome the most-entrenched opponents, and the courage to decide instantly, in a moment of great uncertainty and doubt, to throw his full weight behind progress. The civil-rights movement had the extraordinary figure of Lyndon Johnson.

The Civil Rights Act turns 50 this year, and a wave of fine books accompanies the semicentennial. Ackerman’s is the most ambitious; it is the third volume in an ongoing series on American constitutional history called We the People. A professor of law and political science at Yale, Ackerman likens the act to a constitutional amendment in its significance to the country’s legal development. He acknowledges the Supreme Court’s leadership during the 1950s, when President Eisenhower showed little enthusiasm for civil rights, and when Congress passed the largely toothless Civil Rights Act of 1957. During those same years, the Court spoke with a loud, clear voice, unanimously deciding Brown, which ordered the desegregation of schools, and Cooper v. Aaron, which held that state segregation laws conflicting with the Constitution could not stand. But the Supreme Court does not command the National Guard or control the budget. Someone needed to enforce those decisions in the defiant South. That is why, Ackerman writes, “the mantle of leadership passed to the president and Congress,” beginning with the 1964 law.

But the political branches ventured into the fray only in the last weeks of 1963. President Kennedy had introduced the bill in June of that year with much ambivalence. As Todd S. Purdum, a senior writer at Politico, recounts in An Idea Whose Time Has Come, Kennedy had led a sheltered life in matters of race. While generally sympathetic to civil-rights ideals, he “believed that strong civil rights legislation would be difficult if not impossible to pass, and that it could well jeopardize the rest of his legislative program.” He had tried to attack literacy tests and other barriers to voting with legislation but had twice been defeated in the Senate, where the old bulls of the South wielded the filibuster with practiced skill. (Roy Wilkins of the NAACP observed, “Kennedy was not naïve, but as a legislator he was very green.”) He regarded Martin Luther King Jr. warily, and with each new southern crisis saw his agenda slipping away. But events finally forced Kennedy to act. The Freedom Riders in Montgomery, the dogs and water cannons in Birmingham, and the sit-in in Jackson all made further equivocation on civil rights impossible by the spring of 1963. Four hours after Kennedy’s speech calling for legislation, an assassin murdered the NAACP organizer Medgar Evers in his own driveway. Five months after that, the bill was stuck in the House Rules Committee—“the turnstile at the entry to the House of Representatives,” in Purdum’s phrase—and the country had a new president.

In 1963, the Reverend Joseph Carter (far left) was the first African American in his Louisiana parish to register to vote. He was jeered as he walked down the courthouse steps. (Bob Adelman/Corbis)

Purdum, whose book is an astute, well-paced, and highly readable play-by-play of the bill’s journey to become a law, describes the immense challenges facing Lyndon Johnson after Kennedy’s assassination. “When it came to civil rights, much of America was paralyzed in 1963,” he writes. That certainly included Congress. The civil-rights bill, which had been languishing in the House since June, had no hope of coming to a full vote in the near future, and faced even bleaker prospects in the Senate. In fact, Kennedy’s entire legislative program was at a standstill, with a stalled tax-cut bill, eight stranded appropriations measures, and motionless education proposals. And Congress was not Johnson’s only problem. He also had to ensure the continuity of government, reassure the United States’ allies, and investigate Kennedy’s assassination. Purdum’s version of this story is excellent, but he cannot surpass the masterful Robert A. Caro, who offers a peerless and truly mesmerizing account of Johnson’s assumption of the presidency in The Passage of Power.

Days after Kennedy’s murder, Johnson displayed the type of leadership on civil rights that his predecessor lacked and that the other branches could not possibly match. He made the bold and exceedingly risky decision to champion the stalled civil-rights bill. It was a pivotal moment: without Johnson, a strong bill would not have passed. Caro writes that during a searching late-night conversation that lasted into the morning of November 27, when somebody tried to persuade Johnson not to waste his time or capital on the lost cause of civil rights, the president replied, “Well, what the hell’s the presidency for?” He grasped the unique possibilities of the moment and saw how to leverage the nation’s grief by tying Kennedy’s legacy to the fight against inequality. Addressing Congress later that day, Johnson showed that he would replace his predecessor’s eloquence with concrete action. He resolutely announced: “We have talked long enough in this country about equal rights. We have talked for 100 years or more. It is time now to write the next chapter, and to write it in the books of law.”

President Johnson talks with civil-rights leaders in the Oval Office in January 1964. From left: Martin Luther King Jr., LBJ, Whitney Young, and James Farmer. (Yoichi Okamoto/AP)

The New York Times journalist Clay Risen contends in The Bill of the Century that Johnson’s contribution to the Civil Rights Act’s success was “largely symbolic.” One might say the same thing about Neil Armstrong’s walk on the moon. Sometimes symbolism is substance—especially where the presidency is concerned. The head of the executive branch firmly seized the initiative, taking up a moribund bill addressing the nation’s most agonizing problem. Here was Johnson, president for only five days, working out of the Executive Office Building because the White House was still occupied by Kennedy’s family and staff, with an election already looming less than a year away. Instead of proceeding tentatively, as most anyone in those circumstances would have done, he radiated decisiveness, betting everything he had right after he got it. As Caro shows so persuasively, from that moment, Johnson’s urgency and purpose infused every stage of the bill’s progress. And in the days and weeks that followed, the stagnant cloud that had settled over Kennedy’s agenda began to lift.

Symbolism was the least of it. Johnson took off his jacket and tore into the legislative process intimately and tirelessly. As the former Senate majority leader, he knew his way around Capitol Hill like few other presidents before him—and none since. The best hope of moving the civil-rights bill from the House Rules Committee—whose segregationist chairman, Howard Smith of Virginia, had no intention of relinquishing it—was a procedure called a “discharge petition.” If a majority of House members sign a discharge petition, a bill is taken from the committee, to the chagrin of its chairman. Johnson made the petition his own personal crusade. Even Risen credits his zeal, noting that after receiving a list of 22 House members vulnerable to pressure on the petition, the president immediately ordered the White House switchboard to get them on the phone, wherever they could be found. Johnson engaged an army of lieutenants—businessmen, civil-rights leaders, labor officials, journalists, and allies on the Hill—to go out and find votes for the discharge petition. He cut a deal that secured half a dozen votes from the Texas delegation. He showed Martin Luther King Jr. a list of uncommitted Republicans and, as Caro writes, “told King to work on them.” He directed one labor leader to “talk to every human you could,” saying, “if we fail on this, then we fail in everything.”

The pressure worked. On December 4—not two weeks into Johnson’s presidency—the implacable Chairman Smith began to give way. Rather than have the bill taken from his committee, he privately agreed to begin hearings that would conclude before the end of January, and then release the bill. Smith looked set to renege on his agreement in the new year, but reluctantly kept his word, allowing the bill to be sent to the full House on January 30, 1964. Risen credits others with this development, suggesting that it was Representative Clarence Brown of Ohio, a Republican member of the Rules Committee, among others, who got Smith to move. Risen is particularly sharp on the evolution of the Republicans during these tumultuous years, but here he accords them too much clout. Brown had to answer to House Republican Leader Charles Halleck of Indiana, whose support Johnson likely bought by proposing, and then personally securing, a NASA research facility at Purdue University, in Halleck’s district. And the entire Republican caucus in the House was wilting under Johnson’s relentless and very public campaign to portray “the party of Lincoln” as obstructing civil rights by opposing the discharge petition.

Johnson kept the bill moving in the Senate by dislodging President Kennedy’s tax-cut bill from the Finance Committee. As vice president, Johnson had advised Kennedy not to introduce civil-rights legislation until the tax cut had cleared Congress. Kennedy didn’t listen, and now both bills were stuck. (Like House Rules, Senate Finance had a wily segregationist for a chairman: Harry Byrd of Virginia.) Risen minimizes the significance of this problem, writing that the tax bill “presented no procedural obstacle to the civil rights bill, only a political one.” (And when does politics ever derail legislation?!) As Caro explains, the tax bill was a hostage. By holding it in committee, the South pressured the administration to give up on civil-rights legislation, with the implication that the withdrawal of the latter might produce movement on the former. But Johnson and Byrd were old friends, and during an elaborate White House lunch they came to an understanding: if Johnson submitted a budget below $100 billion, Byrd would release the tax bill. Johnson then personally bullied department heads to reduce their appropriations requests, and delivered a budget of $97.9 billion. The Finance Committee passed the tax bill on January 23, 1964, with Byrd casting the deciding vote to allow a vote, then weighing in against the measure itself. The Senate passed the tax bill on February 7, mere days before the civil-rights bill cleared the House.

Finally, Johnson helped usher the bill to passage in the Senate by working to break the southern filibuster, which was led by his political patron, the formidable Richard Russell of Georgia. In light of the Senate’s fiercely guarded independence, the president could not operate in the open; he had to use proxies like Humphrey, who was his protégé and future vice president, as well as the bill’s floor manager. Johnson impressed upon Humphrey that the vain and flamboyant Senate Republican Leader Everett Dirksen of Illinois was the key to delivering the Republican votes needed for cloture:

“You and I are going to get Ev. It’s going to take time. We’re going to get him. You make up your mind now that you’ve got to spend time with Ev Dirksen. You’ve got to let him have a piece of the action. He’s got to look good all the time. Don’t let those [liberal] bomb throwers, now, talk you out of seeing Dirksen. You get in there to see Dirksen. You drink with Dirksen! You talk with Dirksen! You listen to Dirksen!”

Johnson demanded constant updates from Humphrey and Majority Leader Mike Mansfield, and always urged more-aggressive tactics. (“The president grabbed me by my shoulder and damn near broke my arm,” said Humphrey.) Even though Senate Democrats did not deploy all those tactics, Johnson’s intensity nevertheless set the tone and supplied its own momentum. He kept up a steady stream of speeches and public appearances demanding Senate passage of the strong House bill, undiluted by horse-trading. And he personally lobbied senators to vote for cloture and end the filibuster. Risen contends that Johnson “persuaded exactly one senator” to change his vote on cloture. Given that it is of course impossible to know what motivated each senator’s final decision, this lowball figure is expressed with too much certitude. Evidence presented by Purdum and Caro suggests that Johnson’s importuning, bribing, and threatening may have made an impact on closer to a dozen. The Senate invoked cloture on June 10, breaking the longest filibuster in the institution’s history. The full Senate soon passed the bill. Johnson signed it into law on July 2, 1964, and immediately turned his energies to what would become another landmark statute: the Voting Rights Act of 1965.

Risen’s attempt to minimize Johnson’s significance in the passage of the Civil Rights Act—“he was at most a supporting actor”; “he was just one of a cast of dozens”; “the Civil Rights Act was not his bill by any stretch”—is perplexing. In an otherwise strong book, his revisionist view is less a question of facts than of emphasis: after all, Purdum too notes that Johnson “strategically limit[ed] his own role” at key moments (careful, for example, not to upstage Dirksen). But Risen seems bent on denying Johnson his due, drawing nearly every inference against him and repeatedly overstating the anti-Johnson case. On the one hand, Risen is right to take a fresh look at the evidence and tell the story from a new perspective, focusing on unsung heroes such as Dirksen, Humphrey, Representative William McCulloch, and Nicholas Katzenbach of the Justice Department. He makes a fair point in questioning the way history awards presidents the credit for measures that by necessity cross many desks. On the other hand, Risen is simply wrong to portray Johnson as some hapless operator for trying multiple tactics and targets, some of them unsuccessfully. Johnson’s very comprehensiveness is what jarred the sluggish and paralyzed Capitol into action and ultimately moved the bill.

President Johnson signs the Civil Rights Act into law on July 2, 1964. (Cecil Stoughton/White House Press Office)

If the president led and Congress followed, where did that leave the Supreme Court? Three months after Johnson signed the Civil Rights Act, the Court heard arguments in a pair of cases challenging the constitutionality of its most contentious provision—Title II, which outlawed segregation in public accommodations. In December 1964 the Court decided Katzenbach v. McClung and Heart of Atlanta Motel v. United States, upholding Title II as a valid exercise of Congress’s commerce power. In the years since, the act has been a remarkable success. Its acceptance in the South was surprisingly quick and widespread. In a stroke, the act demolished the rickety but persistent foundation for segregation and Jim Crow. Title II reached far into the daily lives of southerners, creating an unprecedented level of personal mingling between the races and making integration a fact of daily life. Title VII, meanwhile, has vastly reduced workplace discrimination, through the efforts of the Equal Employment Opportunity Commission. Although years of toil, struggle, and bloodshed still lay ahead, the 1964 law dealt a major blow to the system of segregation. The past 50 years of American history are almost unimaginable without it.

And yet the anniversary prompts an ominous reconsideration of the Supreme Court’s role in civil rights. In 1954, the Court launched the federal government’s assault on segregation, with Brown. In 1964, it got out of the way of the political branches, then quickly ratified their work. Today when it comes to racial civil rights, the Roberts Court is an aggressively hostile force. Recall Ackerman’s contention that the 1964 act has taken on the weight of a constitutional amendment. At a literal level, this is of course untrue: the act was not ratified by three-quarters of the states and is not part of the written Constitution. This means that a constitutional amendment is not needed to overturn the Civil Rights Act, which is vulnerable to a subsequent act of Congress or, more to the point, a decision by the Supreme Court.

Ten years ago, even mentioning this possibility would have seemed outrageous. But last June, the Court decided Shelby County v. Holder, striking down Section 4(b) of the Voting Rights Act of 1965 as unconstitutional. Section 4(b) listed the states with a history of voting discrimination that were required to seek preclearance from the Justice Department or the courts before amending their voting laws. The 5–4 decision by Chief Justice John Roberts is nothing short of appalling: as unpersuasive as it is misguided, it is, in Ackerman’s words, “a shattering judicial betrayal” of the civil-rights era. It is also the Roberts Court’s most brazenly activist decision: Congress has reauthorized the Voting Rights Act four times, most recently in 2006, with votes of 390–33 in the House and 98–0 in the Senate. In her brilliant dissent, Justice Ruth Bader Ginsburg summed up the decision’s obtuseness: “Throwing out pre-clearance when it has worked and is continuing to work to stop discriminatory changes is like throwing away your umbrella in a rainstorm because you are not getting wet.”

Shelby County may be so unique that it portends no harm for the Civil Rights Act. After all, the preclearance regime was extraordinarily invasive. Ackerman calls it the biggest federal intrusion into the prerogatives of the southern states since Reconstruction. But Title II of the Civil Rights Act is also strong medicine, reaching beyond state actors to tell private businesses whom they must serve. It was by far the act’s most controversial provision—and it remains controversial among some conservatives. In 2010, Senator Rand Paul caused a sensation by arguing that the provision in the Civil Rights Act dealing with “private business owners” (ostensibly Title II) is unconstitutional. He quickly walked back his comments, but his father, Ron Paul, proudly continues to make the same argument, and the Tea Party is listening. The Heritage Foundation’s Web site files the McClung decision upholding Title II on its “Judicial Activism” page, tagged to the terms Abusing Precedent and Contorting Text. The Voting Rights Act decision can only embolden Title II’s opponents.

And they just might get a hearing. Three trends in the Roberts Court’s jurisprudence suggest that the justices would be more receptive to a challenge to Title II than any prior Court. First is its disregard for precedent. The Roberts Court has repeatedly ignored prior decisions when doing so enabled a conservative victory—most notoriously in the areas of gun regulation (District of Columbia v. Heller) and campaign finance (Citizens United v. Federal Election Commission). Hence it is little comfort that the Court upheld Title II in 1964. It had also previously upheld the Voting Rights Act and its reauthorizations. Second is the Roberts Court’s impatience with open-ended civil-rights measures, which some justices believe are no longer necessary. “The tests and devices that blocked access to the ballot have been forbidden nationwide for over 40 years,” the Court wrote in Shelby County, dismissing the need for ongoing vigilance against voting discrimination. And third is the Court’s continued disdain for the commerce clause. Remember when Roberts’s decision upholding the Affordable Care Act made the point that the act was not a valid exercise of Congress’s commerce power? He was singling out the section of the Constitution that supports the Civil Rights Act.

The 1964 law is not in imminent danger from the Supreme Court. But it is worth considering how a hostile Court changes the equation from 1964, when the judiciary acted in concert with the political branches. The new paradigm places a premium on presidential leadership, at the very least in nominating judges and justices who are in sympathy with the great statutes of the 1960s. But the battle over the Civil Rights Act shows that presidents who are serious about concrete social progress must do even more.

Lyndon Johnsons, of course, do not come along every four or every 40 years. Even if they did, Johnson brought plenty of darkness (election stealing, a credibility gap, Vietnam) along with the light (Civil Rights Act, Voting Rights Act, Great Society). Moreover, not every president needs to be a legislative genius in order to pass laws. Obama, after all, gambled big on the Affordable Care Act, investing the same type of capital in health care that Johnson invested in civil rights. It is now the law of the land. But the energy and purpose that Johnson brought to the Civil Rights Act struggle remain inspiring, and are a model for all presidents. As Richard Russell, the South’s leader in the Senate during the 1960s, put it to a friend a few days after Kennedy’s assassination: “You know, we could have beaten John Kennedy on civil rights, but not Lyndon Johnson.”

Michael O’Donnell is a lawyer in Chicago. His writing has appeared in The Wall Street Journal, The Nation, and The Washington Monthly.

When Cigarettes Were Good for Women

by Blain Roberts

HNN  March 17, 2014

A recent advertisement in the Sports Illustrated swimsuit edition for blu eCigs, a popular brand of electronic cigarettes, hit what one public health expert has called “a new high in terms of chutzpah.” It is audacious, though a more literal description might be that the ad hit a new low: it’s a crotch shot, showing a woman’s body cropped from just above her pierced belly button to her mid-thighs. A minuscule black bikini bottom, adorned with the company’s logo, barely covers what’s underneath. Posed provocatively around the bikini, the woman’s hands appear ready to remove the item of clothing, if you can call it that. The caption reads, “Slim. Charged. Ready to Go.”

Doctors and public health advocates worry about ads like these, which associate e-cigarettes with female sexuality in a bid to attract male consumers, especially teenage boys, who may be tempted to take up vaping and thus put themselves at risk for nicotine addiction.

Beyond the health consequences of such marketing tactics, anyone who cares about the effects of exploiting and sexualizing women’s bodies has obvious reason for concern, too. After all, the blu eCig model seems as much the commodity as the e-cigarette. She is objectified by the ad’s producers, as she will be, presumably, by its consumers as well.

Tobacco and women’s bodies have a long history, to which e-cigarettes (technically tobacco-less) are indebted. Yet this ad belies the complexity of this past. Surprisingly, the sexual sell in the tobacco market—and tobacco use itself—provided modern American women a way to lay claim to their desires, sexual and otherwise.

For years, Americans frowned upon both female tobacco use and female sexuality. Throughout the nineteenth century, the Victorian understanding of separate spheres, which deemed women morally superior and sexually passive, proscribed a variety of activities, like sex (outside of procreation), drinking, business, and politics. These pursuits and pleasures were for men, as was enjoying tobacco—whether it was by chewing it or smoking it in a pipe or cigar, all sensual activities that bordered on the sexual. Tobacco use was simply off-limits to respectable, middle-class women, white and black. Only prostitutes, actresses, and bohemians indulged in the tobacco habit, which sealed its association with a lack of womanly virtue.

Change came fitfully.  In the 1880s and ‘90s, the American Tobacco Company, the Durham, North Carolina, manufacturer that pioneered the selling of cigarettes, bucked traditional standards. It wasn’t that the company targeted potential female smokers; rather, it introduced salacious trade cards into cigarette packs to appeal to men. The cards featured pictures of women scantily dressed, at least by the conventions of the day. Uncovered arms and legs were in abundance, as were stockings, ribbons, and fringe. The cards were brazen acknowledgements of women’s sexuality. Respectable Americans were not ready, and critics pounced.

Image via Duke University Library.

Yet by the 1910s and ‘20s, a full-blown challenge to Victorianism was underway, with young women leading the charge. They demanded the right to bob their hair, wear cosmetics and short skirts, and, like their male peers, dance, drink alcohol, have sex, and, of course, smoke cigarettes. As Zelda Sayre Fitzgerald, a precocious teenage smoker and quintessential Jazz Age figure put it, flappers altered everything about their behavior and appearance and “went into the battle.”  The battle to break free from restrictive norms and assert their individuality was waged, and largely won, in cities and on college campuses, in cars and in nightclubs, and in tobacco advertising campaigns, which increasingly supported women’s new desires. Liggett and Myers, maker of Chesterfields, released a magazine ad in 1926 with the tag line “Blow Some My Way.” The illustration featured a woman gazing longingly at her cigarette-smoking companion.

Several years later, a woman in a Chesterfield spot, shown lighting her partner’s cigarette, said coyly, “Somehow, I just like to give you a light.” The Chesterfield slogan, “They Satisfy,” drove home the message: female sexuality and tobacco use were now celebrated.

By the 1930s and ‘40s, the use of female sexuality to promote tobacco had even migrated to the tobacco farms of the Southeast. This region grew much of the tobacco sold in the United States, and during the lean years of the Depression, it needed to pump up demand. Trade boards sponsored beauty pageants for rural women (all white, given Jim Crow customs), who vied for the title of tobacco queen and sometimes competed in skimpy two-piece outfits made out of dried tobacco leaves. Far from being asked to shun the product, these contestants embodied it: women and tobacco were one and the same.

This fact alone made the photographs taken of the beauties arresting, but the images were also suffused with sexual innuendo and phallic images. These photos show contestants and queens putting themselves up for evaluation and auction, like tobacco brought to market for sale. They leisurely puff on foot-long cigarettes and smoke corncob pipes as men, standing in intimate proximity, look on with rapt attention.  What these men were thinking was an open question. In one photograph from the mid-1940s, a North Carolina tobacco queen held a tobacco leaf over her breasts. It was obvious that with one movement she could have been topless.

Image via North Carolina Department of Archives.

Used for marketing purposes, these images were intended for tobacco consumers everywhere, but it’s worth emphasizing that this unusual iconography was intended, in part, to chip away at deep-rooted objections to female smoking in rural areas, where only 8 percent of women smoked, compared to about 40 percent in cities. The photographs glamorized the sensuous pleasures of tobacco use, suggesting to farm women that smoking, and freer expressions of sexuality, were theirs to claim. Women in more conservative parts of America who subsequently picked up the tobacco habit thus redefined what it meant to be female in their communities. In the South especially, where a Gordian knot of patriarchy and white supremacy depended upon the sexual subordination of women, this was not an inconsequential development.

All of this culminated in the famous Virginia Slims campaign, launched by the Richmond, Virginia-based Philip Morris Company in 1968, to promote the new, slimmer cigarette made just for women. Capitalizing on the modern women’s movement, Philip Morris embraced the language of feminism to demonstrate, as the tag line proclaimed, “You’ve Come a Long Way, Baby.” Ads contrasted the contemporary, sexually liberated woman, Virginia Slims in hand, with her oppressed female forebears. In one magazine spot, images of a turn-of-the-century housewife suffering from the drudgery of household chores—like churning butter!—were paired with the tongue-in-cheek rhyme, “I want a girl, just like the girl that married Dear Old Dad. She’ll wash the floors, polish up the doors, and never make me mad. She won’t smoke or be a suffragette, she will always be my loving pet.” Underneath, the Virginia Slims smoker smiled knowingly at the reader.

The modern woman had come a long way, and tobacco, as this history demonstrates, had helped get her there. Still, there were clearly pitfalls in this strategy of advancing women’s emancipation. Lung disease and death seem a poor trade-off for not having to wash the floors.

Moreover, the line between sexual empowerment and sexual objectification was a thin one, easily transgressed. Sometimes it was difficult to determine who controlled the sexuality on display. The recent blu eCigs advertisement highlights this problem in a striking way: it’s hard to argue that the bikini-clad woman is empowered when you can’t even see her face. This ad, in short, provides a cautionary reminder. When it comes to fighting for women’s liberation, we must be careful in selecting our weapons.

Blain Roberts is associate professor of history at California State University, Fresno. She is the author of Pageants, Parlors, and Pretty Women: Race and Beauty in the Twentieth-Century South.

The Black Press During the Civil War

Kevin McGruder

The New York Times   March 13, 2014

Although the Civil War began as a conflict over secession, from the start most blacks saw it as an opportunity to free the enslaved with a Union victory – a theme reflected in the robust black press that prospered across the North.

In New York City, the war was closely chronicled by two newspapers, The Anglo-African and The Christian Recorder. Established in 1859 by the editor Robert Hamilton and his brother Thomas, The Anglo-African reported extensively on the Civil War and the emancipation efforts. But Anglo-African articles also covered the breadth of African-American life, with a focus on political issues relevant to black Americans, presented by black writers and activists like Frances Ellen Watkins Harper, the Rev. James W.C. Pennington and Martin Delany.

The Christian Recorder, founded in 1848, was a national weekly newspaper published by the African Methodist Episcopal Church, based in Philadelphia, but with correspondents across the country. The New York area was served by correspondents in Manhattan and Brooklyn, who, along with The Recorder’s editor, provided an unvarnished critique of the war and frequently of New York’s black community.

Black New Yorkers were uniquely positioned to participate in debates regarding the war and emancipation. In the 1860s New York City and New York State were centers of free black advocacy. The abolitionist Frederick Douglass lived in Rochester. Many of the “colored men’s conventions” that met periodically from 1830 until 1864 met in New York State. New York City was a center of philanthropy, abolitionist activism and publishing. The city’s 1860 black population of 12,000, out of a total population of approximately 800,000, made its free black community second in size only to Philadelphia’s.

Black newspapers weren’t just sources of information, but of activism. As the country hurtled toward war in February 1861, The Christian Recorder spread word of a meeting held to plan for a “day of humiliation, fasting and prayer that God would avert the judgments about to fall upon this guilty nation.” They were also a center for debate: As soon as the war began in April 1861, even though black troops had not yet been accepted by the Union Army, there was heated discussion in the black community about the duties of blacks in regard to the war. Some voices in the black press, like The Christian Recorder, questioned the logic of black soldiers’ risking their liberty (captured black soldiers could be enslaved) or their lives for a country whose Supreme Court had held that black people, whether enslaved or free, were not citizens.

The Anglo-African, though, actively promoted the use of black troops in an editorial titled “The Reserve Guard” that August:

Colored men whose fingers tingle to pull the trigger, or clutch the knife aimed at the slaveholders in arms, will not have to wait much longer. Whether the fools attack Washington and succeed or whether they attempt Maryland and fail, there is equal need for calling out the nation’s ‘Reserve Guard.’

The newspapers were more than just hortatory – they also provided historical and comparative analysis of the issues surrounding emancipation. On Jan. 4, 1862, The Christian Recorder reinforced calls for emancipation with a persuasive and prophetic editorial that asked, “What would be the effect of the emancipation of the slaves?” Using data from the British Caribbean, where slavery had been abolished in the 1830s, the editorial confronted two major arguments against emancipation: that the formerly enslaved would “overrun the entire North as the frogs did the Egyptians in the days of Moses,” and that if emancipated “they will refuse to work, and will engage in robbery and murder.” The editorial noted that neither point had been borne out in the Caribbean, that there were already many formerly enslaved people in the South who chose to remain in the South, and that many of these people were cultivating small farmsteads that were key to the independent lives they desired. The writer concluded that for the United States, it was in “our interest to emancipate the slaves of both the rebel and loyal citizens, for it will not only crush rebellion, but increase our prosperity, decrease crime in our midst, and prevent insurrections with their fearful horrors.”

Reading these papers offers a surprising view into the nuanced ways that blacks responded to early signs of emancipation. They greeted Lincoln’s Preliminary Emancipation Proclamation of September 1862, for example, with great anticipation but also some anxiety. Because the effective date for a permanent Emancipation Proclamation was three months away, on Jan. 1, 1863, the fear was that something might occur to change course during the intervening period. In response, in an October editorial, The Christian Recorder swept aside doubts and framed the Proclamation as an answer to prayers:

Now, let the North if they are in favor of the Union, not stop and tremble at the proclamation, but say, like all honest and good men will say, that it is the Lord’s doings, and who shall hinder it? Yes, God has looked down upon this great national sin, and is now frowning upon it, and declares His judgment upon it. He has heard the groans of His people, and has come down to deliver them.

The Emancipation Proclamation did become effective on Jan. 1, 1863, and the Jan. 10 issue of The Anglo-African contained over a page of accounts of Emancipation celebrations in New York, St. Louis and Boston.

In addition to emancipating the enslaved in the states then in rebellion, the Proclamation also included a provision for recruiting black soldiers. While this order had national implications, the states that had remained in the Union had the final say on admitting black troops, since militias were organized by the states – a fact highlighted in the black press. Massachusetts and Rhode Island organized some of the first black regiments, and New York City’s black press played an important role in advocating for the recruitment of black troops.

That March Congress passed the Conscription Act, authorizing the first military draft. When the actual draft process began in New York City in July 1863, mobs of white workingmen, resentful of being asked to put their lives at risk for black people who, they had been told, would flood Northern cities and take their jobs, destroyed the Manhattan draft office and then roamed the city over four days in the largest assault on the black community in New York’s history. Union troops arrived on the fourth day of the rioting and put an end to the violence. In the aftermath, The Christian Recorder recounted defense efforts: “In Weeksville and Flatbush, the colored men who had manhood in them armed themselves, and threw out their pickets every day and night, determined to die defending their homes.”

But the paper also criticized other black New Yorkers: “To see strong, hearty, double-fisted men, fleeing like sheep before the whoop of a dozen half-grown Irish lads, leaving their wives behind to take care of themselves, was indeed humiliating.”

While black New Yorkers recovered from the riots, the black press redoubled its advocacy of black troop recruitment. In its final issue of 1863, The Anglo-African announced:

The War Department having at last done justice to colored men, and authorized the raising of a colored regiment in this State, to be known as the Twentieth Regiment United States Colored Troops, meetings have been called in several wards, as will be seen by reference to our advertising columns, for the purpose of discussing plans to promote enlistments and providing for the families of those who may enlist.

The recruiting was so successful that a second regiment, the 26th, was authorized. When the regiments left for battle in March of 1864, New York’s black press shifted its focus to advocacy for equal pay for black soldiers. At the same time The Anglo-African and The Christian Recorder chronicled battlefield efforts, and with a shift in wartime momentum toward the Union in 1864, began to focus on issues, such as black voting, that would need to be attended to in peacetime. The Anglo-African continued publication until December 1865. The Christian Recorder continues to appear today, as a monthly publication.

Follow Disunion at twitter.com/NYTcivilwar or join us on Facebook.


Sources: The Anglo-African; The Christian Recorder; Sandy Dwayne Martin, “Black Churches and the Civil War: Theological and Ecclesiastical Significance of Black Methodist Involvement, 1861-1865”; Paul Finkelman, “Encyclopedia of African American History, 1619-1895, From the Colonial Period to the Age of Frederick Douglass”; Iver Bernstein, “The New York City Draft Riots: Their Significance for American Society and Politics in the Age of the Civil War”, Rhoda Golden Freeman, “The Free Negro in New York City in the Era Before the Civil War”; William Seraile, “New York’s Black Regiments During the Civil War.”


Kevin McGruder is an assistant professor of history at Antioch College. He is the author of “A Fair and Open Field: The Responses of Black New Yorkers to the New York City Draft Riots” and the co-author, with Velma Maia Thomas, of Emancipation Proclamation, Forever Free.

(Image: Library of Congress)

Obama: Ike Redivivus?

by Victor Davis Hanson

National Review Online March 11, 2014
As a critique of the George W. Bush administration, and in praise of the perceived foreign-policy restraint of Obama’s first five years in the White House, a persistent myth has arisen that Obama is reminiscent of Eisenhower — in the sense of being a president who kept America out of other nations’ affairs and did not waste blood and treasure chasing imaginary enemies.

Doris Kearns Goodwin, Andrew Bacevich, Fareed Zakaria (“Why Barack Is like Ike”), and a host of others have made such romantic, but quite misleading, arguments about the good old days under the man they consider the last good Republican president.

Ike was no doubt a superb president. Yet while he could be sober and judicious in deploying American forces abroad, he was hardly the non-interventionist of our present fantasies, who is so frequently used and abused to score partisan political points.

There is a strange disconnect between Eisenhower’s supposed policy of restraint, especially in reference to the Middle East, and his liberal use of the CIA in covert operations. While romanticizing Ike, we often deplore the 1953 coup in Iran and the role of the CIA, but seem to forget that it was Ike who ordered the CIA intervention that helped to lead to the ouster of Mossadegh and to bring the Shah to absolute power. Ike thought that he saw threats to Western oil supplies, believed that Mossadegh was both unstable and a closet Communist, sensed the covert hand of the Soviet Union at work, was won over by the arguments of British oil politics, and therefore simply decided Mossadegh should go — and he did.

Ike likewise ordered the CIA-orchestrated removal of the leaders of Guatemala and the Congo. He bequeathed to JFK the plans for the Bay of Pigs invasion, which had been born on the former’s watch. His bare-faced lie that a U-2 spy plane had not been shot down in Russia did terrible damage to U.S. credibility at the time.

The Eisenhower administration formulated the domino theory, and Ike was quite logically the first U.S. president to insert American advisers into Southeast Asia, a move followed by a formal SEATO defense treaty to protect most of Southeast Asia from Communist aggression — one of the most interventionist commitments of the entire Cold War, which ended with over 58,000 Americans dead in Vietnam and helicopters fleeing from the rooftop of the U.S. embassy in Saigon.

Eisenhower’s “New Look” foreign policy of placing greater reliance on threats to use nuclear weapons, unleashing the CIA, and crafting new entangling alliances may have fulfilled its short-term aims of curbing the politically unpopular and costly use of conventional American troops overseas. Its long-term ramifications, however, became all too clear in the 1960s and 1970s. Mostly, Ike turned to reliance on nuke-rattling because of campaign promises to curb spending and balance the budget by cutting conventional defense forces — which earned him the fury of Generals Omar Bradley, Douglas MacArthur, and Matthew Ridgway.

In many ways, Eisenhower’s Mideast policy lapsed into incoherency, notably in the loud condemnation of the 1956 British-French operations in Suez (after Nasser had nationalized the Suez Canal), which otherwise might have weakened or toppled Nasser. This stance of Eisenhower’s (who was up for reelection) may have also contradicted prior tacit assurances to the British that the U.S. would in fact look the other way.

The unexpected American opposition eroded transatlantic relations for years as well as helped to topple the Eden government in Britain. Somehow all at once the U.S. found itself humiliating its two closest allies, empowering Nasser, and throwing its lot in with the Soviet Union and the oil blackmailers of Saudi Arabia — with ramifications for the ensuing decades.

Yet just two years later, Ike ordered 15,000 troops into Lebanon to prevent a coup and the establishment of an anti-Western government — precisely those anti-American forces that had been emboldened by the recent Suez victory of the pan-Arabist Nasser. We forget that Ike was nominated not just in opposition to the non-interventionist policies of Robert Taft, but also as an antidote to the purportedly milquetoast Truman administration, which had supposedly failed to confront global Communism and thereby “lost” much of Asia.

Eisenhower gave wonderful speeches about the need to curtail costly conventional forces and to avoid overseas commitments, but much of his defense strategy was predicated on a certain inflexible and dangerous reliance on nuclear brinksmanship. In 1952 he ran to the right of the departing Harry Truman on the Korean War, and unleashed Nixon to make the argument of Democratic neo-appeasement in failing to get China out of Korea. Yet when he assumed office, Eisenhower soon learned that hinting at the use of nuclear weapons did not change the deadlock near the 38th Parallel. Over 3,400 casualties (including perhaps over 800 dead) were incurred during the Eisenhower administration’s first six months. Yet the July 1953 ceasefire ended the war with roughly the same battlefield positions as when Ike entered office. Pork Chop Hill — long before John Kerry’s baleful notion about the last man to die in Vietnam — became emblematic of a futile battle on the eve of a negotiated stalemate.

Ike’s occasional opportunism certainly turned off more gifted field generals like Matthew Ridgway, who found it ironic that candidate Ike had cited a lack of American resolve to finish the Korean War with an American victory, only to institutionalize Ridgway’s much-criticized but understandable restraint after his near-miraculous restoration of South Korea. In addition, Ridgway deplored the dangerous false economy of believing that postwar conventional forces could be pruned while the U.S. could rely instead on threatening the use of nuclear weapons. He almost alone foresaw rightly that an emerging concept of mutually assured destruction would make the conventional Army and Marines as essential as ever.

As a footnote, Eisenhower helped to marginalize the career of Ridgway, the most gifted U.S. battlefield commander of his era. Ike bore grudges and was petty enough to write, quite untruthfully, that General James Van Fleet, not Ridgway, had recaptured Seoul — even though Van Fleet had not yet arrived in the Korean theater. That unnecessary snub was reminiscent of another to his former patron George Marshall during the campaign of 1952. Ridgway, remember, would later talk Eisenhower out of putting more advisers into Vietnam.

The problem with the Obama administration is not that it does or does not intervene, given the differing contours of each crisis, but rather that it persists in giving loud sermons that bear no relationship to the actions that do or do not follow: red lines in Syria followed by Hamlet-like deliberations and acceptance of Putin’s bogus WMD removal plan; flip-flop-flip in Egypt; in Libya, lead from behind followed by Benghazi and chaos; deadlines and sanctions to no deadlines and no sanctions with Iran; reset reset with Russia; constant public scapegoating of his predecessors, especially Bush; missile defense and then no missile defense in Eastern Europe; Guantanamo, renditions, drones, and preventive detentions all bad in 2008 and apparently essential in 2009; civilian trials for terrorists and then not; and Orwellian new terms like overseas contingency operations, workplace violence, man-caused disasters, a secular Muslim Brotherhood, jihad as a personal journey, and a chief NASA mission being outreach to Muslims. We forget that the non-interventionist policies of Jimmy Carter abruptly ended with his bellicose “Carter Doctrine” — birthed after the Soviets invaded Afghanistan, American hostages were taken in Tehran and Khomeinists had taken power, China went into Vietnam, and Communist insurgencies swept Central America.

As for Dwight Eisenhower, of course he was an admirable and successful president who squared the circle of trying to contain expansionary Soviet and Chinese Communism at a time when the postwar American public was rightly tired of war, while balancing three budgets, building infrastructure, attempting to deal with civil rights, and promoting economic growth. Yet the Republican Ike continued for six months the identical Korean War policies of his unpopular Democratic predecessor Harry Truman, and helped to lay the foundation for the Vietnam interventions of his successors, Democrats John F. Kennedy and Lyndon Johnson. That the initial blow-ups in Korea and Vietnam bookended his own administration may have been a matter of luck, given his own similar interventionist Cold War policies.

Bush was probably no Ike (few are), and certainly Obama is not either. But to score contemporary political points against one and for the other by reinventing Eisenhower into a model non-interventionist is a complete distortion of history. So should we laugh or cry at the fantasies offered by Andrew Bacevich? He writes: “Remember the disorder that followed the Korean War? It was called the Eisenhower era, when budgets balanced, jobs were plentiful and no American soldiers died in needless wars.”

In fact, the post–Korean War “Eisenhower era” was characterized by only three balanced budgets (in at least one case with some budget gimmickry) out of the remaining seven Eisenhower years. In 1958 the unemployment rate spiked at over 7 percent for a steady six months. Bacevich’s simplistic notion that “jobs were plentiful” best applies to the first six months of 1953, when Ike entered office and, for the only time during his entire tenure, the jobless rate was below 3 percent — coinciding roughly with the last six months of fighting the Korean War. This was an age, remember, when we had not yet seen the West German, South Korean, and Japanese democratic and economic miracles (all eventually due to U.S. interventions and occupations), China and Russia were in ruins, Western Europe was still recovering from the war, Britain had gone on a nationalizing binge, and for a brief time the U.S. was largely resupplying the world, and mostly alone — almost entirely with its own oil, gas, and coal. Eisenhower’s term was characterized by intervention in Lebanon, fighting for stalemate in Korea, CIA-led coups and assassinations, the insertion of military advisers into Vietnam, new anti-Communist treaty entanglements to protect Southeast Asian countries, a complete falling out with our European allies, abject lies about spy flights over the Soviet Union, serial nuclear saber-rattling, and Curtis LeMay’s nuclear-armed overflights of the Soviet Union — in other words, the not-so-abnormal stuff of a Cold War presidency.

And the idea that, to quote from Doris Kearns Goodwin, Eisenhower “could then take enormous pride in the fact that not a single soldier had died in combat during his time” is, well, unhinged.

National Review Online contributor Victor Davis Hanson is a senior fellow at the Hoover Institution and the author, most recently, of The Savior Generals.


American Finance Grew on the Back of Slaves

By Edward E. Baptist and Louis Hyman 

Chicago Sun-Times.com March 7, 2014 

Last weekend we watched the Oscars and, like most people, were pleased that “Twelve Years a Slave” won Best Picture. No previous film has so accurately captured the reality of enslaved people’s lives. Yet though “Twelve Years” shows us the labor of slavery, it omits the financial system — asset securitization — that made slavery possible.

Most people can see how slave labor, like the cotton-picking in “Twelve Years A Slave,” was pure exploitation. Few recognize that a financial system nearly as sophisticated as ours today helped Solomon Northup’s enslavers steal him, buy him, and market the cotton he made. The key patterns of that financial history continue to repeat themselves. Again and again, African-American individuals and families have worked hard to produce wealth, but American finance, whether in the antebellum period or today, has snatched black wealth through bonds backed by asset securitization.

Recently, the assets behind these bonds were houses. In the antebellum period, the assets were slaves themselves.

Every year or two, somebody discovers that a famous bank on Wall Street profited from slavery. This discovery is always treated as if the relationship between slavery and the American financial system were some kind of odd accident, disconnected from the present. But it was not an accident. The cotton and slave trades were the biggest businesses in antebellum America, and then as now, American finance developed its most innovative products to finance the biggest businesses.

In the 1830s, powerful Southern slaveowners wanted to import capital into their states so they could buy more slaves. They came up with a new, two-part idea: mortgaging slaves; and then turning the mortgages into bonds that could be marketed all over the world.

First, American planters organized new banks, usually in new states like Mississippi and Louisiana. Drawing up lists of slaves for collateral, the planters then mortgaged them to the banks they had created, enabling themselves to buy additional slaves to expand cotton production. To provide capital for those loans, the banks sold bonds to investors from around the globe — London, New York, Amsterdam, Paris. The bond buyers, many of whom lived in countries where slavery was illegal, didn’t own individual slaves — just bonds backed by their value. Planters’ mortgage payments covered the interest and principal on these bonds. Enslaved human beings had been, in modern financial lingo, “securitized.”

As slave-backed mortgages became paper bonds, everybody profited — except, obviously, enslaved African Americans whose forced labor repaid owners’ mortgages. But investors owned a piece of slave-earned income. Older slave states such as Maryland and Virginia sold slaves to the new cotton states, at securitization-inflated prices, resulting in a slave asset bubble. Cotton factor firms like the now-defunct Lehman Brothers — founded in Alabama — became wildly successful. Lehman moved to Wall Street, and for all these firms, every transaction in slave-earned money flowing in and out of the U.S. earned Wall Street firms a fee.

The infant American financial industry nourished itself on profits taken from financing slave traders and cotton brokers and from underwriting slave-backed bonds. But though slavery ended in 1865, in the years after the Civil War, black entrepreneurs would find themselves excluded from a financial system originally built on their bodies. As we remind our students in our new online course American Capitalism: A History, African-Americans — unable to borrow either to buy property or start businesses — lived in a capitalist economy that allowed them to work, but not to benefit.

More recently, history repeated itself — or more accurately, continued. The antebellum world eerily prefigured the recent financial crisis, in which Wall Street securitization once again stepped in to strip black families of their wealth.

In the 1990s red-lining began to end and black homeownership rates began to rise, increasing the typical family’s wealth to $12,100 by 2005 — or one-twelfth that of white households. In those years, African-American family incomes were also rising about as rapidly as white family incomes. And yet, African-American buyers, playing catch-up after centuries of exclusion from the benefits of credit, still typically had lower net worth and credit ratings. They paid higher interest rates and fees to join the housing bubble, and so securitizing their mortgages brought enormous profits to lenders and investors.

Then the crash of 2008 came. By 2010, median African-American household wealth had plunged by 60 percent — all those years of hard work lost in fees, interest, and falling prices. For whites, the decline was only 23 percent, and those losses were short-lived. Lenders resumed lending to white borrowers, restoring the value of their assets. But African-American borrowers have had a much harder time getting new loans, much less holding on to property bought at securitization-inflated prices. Median white household wealth is now back up to 22 times that of blacks — erasing African-Americans’ asset gains over the preceding 20 years.

Recent foreclosures represent another transfer of wealth from African-Americans to the investors of the world. For the past 200 years, the success of American finance has been built on the impoverishment of African-American families. We should remember the heroic struggles of African Americans to get political equality, but to forget their exclusion from our financial system, except as a source of exploitation, is to miss a basic truth of not only black history but financial history.

Edward E. Baptist and Louis Hyman teach history at Cornell University.

Pedestrian Modern: Shopping and American Architecture, 1925-1956

by Marshall Poe

New Books in History  March 13, 2014

David Smiley

Most of us have been to strip malls–lines of shops fronted by acres of parking–and most of us have been to closed malls–massive buildings full of shops and surrounded by acres of parking. Fewer of us have been to open malls: small parks ringed by shops with parking carefully tucked out of sight. That’s because open malls–once numerous–have largely disappeared, having been replaced by strip malls, closed malls and, more recently, big-box stores.

As David Smiley points out in his wonderfully researched and beautifully illustrated book Pedestrian Modern: Shopping and American Architecture, 1925-1956 (University of Minnesota Press, 2013), the open mall was a response to a number of macro-historical, mid-twentieth century forces: the explosion of car culture, the decline of urban centers, the rise of suburbs, and, of course, mass consumerism. But he also shows that the open mall wasn’t just a banal machine for selling; it was a canvas upon which Modernist architects could create a uniquely American kind of Modernist architecture. The strip mall, the closed mall, and the big-box store may be artless, but the mid-century open mall certainly was not. It had style, as the many wonderful images in David’s book show.

Interestingly, the open mall is making a comeback. I visited one outside Hartford, Connecticut. Alas, it has none of the Modernist elements that made the original open malls so interesting. To me, it looked like a closed mall turned inside out.

Listen here

America’s Dien Bien Phu Syndrome

by John Prados
History News Network    March 12, 2014
Image via Wiki Commons.

March 13, 2014 marks the sixtieth anniversary of the day in 1954 when the Vietnamese revolutionaries known as the Viet Minh opened the Battle of Dien Bien Phu, which marked the end of the French imperial adventure in Indochina. General Vo Nguyen Giap, the Viet Minh commander, passed away just a few months ago and did not live to see this day. But Giap, who served as the defense minister of North Vietnam through the entire American War — and, indeed, many Vietnamese — always considered Dien Bien Phu their greatest moment.

It’s not hard to see why.

During America’s war in Vietnam, the North Vietnamese benefited from having a real army, trained over years, well-equipped by Chinese and Soviet patrons, and a well-entrenched state apparatus. At the time of Dien Bien Phu, by contrast, the Viet Minh controlled only portions of the land (outside of the major cities, naturally), faced economic challenges, and were already weary from years of bitter fighting. In addition, the logistical obstacles simply in mounting the effort to assault the remote French position were enormous.

Dien Bien Phu was a far-away mountain valley in the northwest quadrant of Vietnam, hundreds of miles from Viet Minh bases. Roads were few and mostly had not been maintained for a decade. To support an army there — and the Viet Minh numbered 50,000 men — required a scale of supply far beyond anything the Vietnamese had ever attempted. Their opponents, the French Expeditionary Corps, possessed all the advantages of a modern, Western army — tanks, guns, planes, elite paratroops and Foreign Legion units, sophisticated command control mechanisms, good intelligence regarding their adversary — and they fought in a region where the Viet Minh had made many fewer inroads with the population than in the coastal lowlands. The French had another major advantage: massive military aid from the United States, a torrent by comparison with Chinese and Soviet support for the Viet Minh.

But this did not mean the French expected victory to be easy at Dien Bien Phu. It was in many respects the final roll of the dice for the French war effort — and the generals knew it. Like their enemy, France had grown weary of the war. The mountain valley lay far from French bases too, and the total French lack of control of the ground in northwest Vietnam made Dien Bien Phu completely dependent on aerial supply. When Giap’s artillery opened a barrage on Dien Bien Phu’s airfield, the only way French troops could be resupplied was via airdrop. Within days, Giap’s men had captured positions ringing the drop zone with anti-aircraft guns, sealing it completely shut.

By then, the battle had become an albatross around the French neck. Only American intervention in the form of Operation Vulture could have saved the French position. Washington struggled hard throughout the siege of Dien Bien Phu, and even after it ended, to craft conditions suitable for American military action. The effort to create a platform from which to intervene did not end with the Geneva agreements of 1954, or with the formation of the Southeast Asia Treaty Organization, or even with U.S. support for the nascent government of South Vietnam — and it ultimately led directly to America’s war in Vietnam.

The decades since Dien Bien Phu are littered with similar dramas. The typical production features a local ally — usually a government but sometimes an insurgent force — who possesses a modicum of power but is unstable, and an adversary (with varying degrees of power and determination) contesting some place the United States considers to have strategic importance. Today, the play is Crimea. Syria was yesterday. A year ago, Libya. Iraq (and its prelude). Afghanistan. Kosovo. Haiti. Somalia. Panama. Nicaragua. Lebanon. The Dominican Republic. The reviews of these productions can be left to others.

At Dien Bien Phu, the United States had a substantial capacity to act. But the lesson of Dien Bien Phu is that the critical variables lie in the stability of America’s local ally and in its own goals and interests, rather than in U.S. firepower. At Dien Bien Phu American intelligence believed there was no reason the loss of the French garrison should affect the overall conduct of the war. But General Giap and Ho Chi Minh knew better. A weary American ally had decided the game was no longer worth the candle and wanted to get out of the war. That made Paris extraordinarily vulnerable to the impact of a military defeat in the Vietnamese mountains. Washington discovered it could not make Paris stick to the commitments the French made along the way as the U.S. strove to craft conditions for its intervention. Something similar appears to have happened in Afghanistan, where Hamid Karzai is backing away from his own commitments to the United States.

In making their decisions on intervention, United States officials need to become much more sophisticated in their appraisal of the stability of local allies — and more discerning of the goals and interests of the parties to a conflict.

John Prados is a senior fellow of the National Security Archive in Washington, DC. His current ebook is Operation Vulture: America’s Dien Bien Phu. Read more of Prados’s work on his website. © John Prados, 2014

The Historian Who Unearthed ‘Twelve Years a Slave’

New Yorker, March 7, 2014

Sue Eakin

Accepting the Oscar for Best Picture on Sunday—technically, it might have been Monday at that point—Steve McQueen took a moment to thank “this amazing historian Sue Eakin,” who “gave her life’s work to preserving Solomon’s book.” It was an unusual shout-out: we’re used to seeing Harvey Weinstein or God get thanked, not historians from Louisiana. But it’s safe to say that without Eakin, who died in 2009, at the age of ninety, none of us would be talking about Solomon Northup, or Patsey, or the other once-forgotten souls portrayed in this year’s Best Picture.

Eakin, who taught at Louisiana State University at Alexandria for twenty-five years, spent her career rescuing Northup’s memoir from obscurity. “There were five of us, and Solomon was the sixth,” Eakin’s son Frank said the other day, from his home in Texas. “There was never a time when he was not part of the conversation.” Eakin grew up near Cheneyville, Louisiana, the eldest of nine children, and discovered Northup when she was twelve. One summer day in 1931, her father, a planter, drove her in a flatbed truck to the nearby town of Bunkie, not far from the property once owned by Edwin Epps (the Michael Fassbender character). They were visiting Oak Hall Plantation, where her father had business with the owner, Sam Haas. Haas brought young Sue to the library on the second floor (“My mom was a big-time bookworm,” Frank says), where he handed her a dusty copy of “Twelve Years a Slave,” first published in 1853.

“I began reading the old book as rapidly as I could, becoming more and more excited with every page,” Eakin wrote later. “I recognized local place names like Cheneyville, where our mail was delivered.” The family names were familiar, too: the Tanners, the Fords, the McCoys. Eakin was rapt, but her father picked her up before she could finish reading. Back then, the book was in scant supply. Eakin didn’t find another copy until 1936, when she arrived at Louisiana State University and spotted the book at Otto Claitor’s Bookstore. She asked Mr. Claitor how much it cost. “What do you want that for?” he said. “There ain’t nothin’ to that old book. Pure fiction.” He sold it to her for twenty-five cents.

Eakin devoted the rest of her life to proving him wrong. As a white woman growing up in Jim Crow-era Louisiana, she had been forward-thinking about race. In 1944, she invited a black choir to sing at the Haas Auditorium, in Bunkie, causing an uproar in the community. A burning cross landed on her front yard. After church one Sunday, she discovered some kids trying to set her house on fire. “I never let it worry me,” she later recalled. But her weapon of choice was history, and Solomon became her obsession. (Her many other books include histories of Cheneyville and Rapides Parish.) She wrote her master’s thesis on “Twelve Years a Slave,” and, in 1968, published the first modern edition. But her research continued. She contacted descendants of Northup and Epps, and helped preserve a side house on Epps’s former property. (In the movie, Chiwetel Ejiofor and Brad Pitt are shown building it.)

With state funding, she developed the Northup Trail, a tour of key locations from the book. “I said, ‘Mom, I know how much this means to you.’ But I didn’t want her expectations to be unmet,” Frank said. “All you had was a bunch of rusty signs. Not many people showed up.” Nevertheless, Eakin believed that someday Northup’s story would get its due. Frank recalled going to courthouses and descendants’ houses as a child. “If she had any news of anybody that could contribute, we were off to the races in the car with the fifty-pound tape recorder and an old camera,” he said. In 1983, she even wrote a musical based on Northup’s life.

Eakin spent her last years working on an expanded edition of “Twelve Years a Slave,” including the decades of research she had accumulated since 1968. But her health began to decline, and her eyesight was poor. Frank’s sister helped her edit the new version, and, in 2007, at the age of eighty-eight, Eakin published the enhanced edition, with maps, pictures, and historical notes. She wrote in the acknowledgments, “Now Solomon and I can rest.” Two years later, she died.

Every couple of years, Eakin would get a phone call from someone interested in turning “Twelve Years a Slave” into a movie. Nothing came of it. (Besides, the story was in the public domain.) The year she died, there was another call — Frank spoke to someone and shrugged it off. It wasn’t until 2011, two years after his mother’s death, that Frank heard that Brad Pitt’s production company, Plan B, was making the film. He offered to help however he could. Steve McQueen flew Frank to the Hollywood première, where they discussed his mother’s contribution: “Steve said, ‘There wouldn’t be a movie if it wasn’t for your mom’s discovery when she was twelve.’ ”

Frank spent Oscar night at a viewing party in Texas. “When Brad Pitt introduced Steve, everything was complete silence,” he said. Hearing his mother’s name, he was dumbstruck. “I couldn’t believe my ears. I looked around and said, ‘Did … did he say that?’ ” As the publisher of his mother’s edition and the audiobook, Frank has been busy in the wake of the film’s success. But mostly he’s happy to see his mother posthumously validated: “She never sought personal publicity. Her passion was history, getting the history out.” Even the Northup Trail is getting refurbished — no more rusty signs. “Yesterday, I was on the phone with the tourism commission,” Frank said. “They see this as a large tourism opportunity.”

Photograph: Courtesy of Frank Eakin.

The Plot to Kill Jeff Davis

By Ronald S. Coddington

New York Times, March 10, 2014

Samuel Kingston, a Union soldier and prisoner of war, languished in a dungeon on a late winter’s day in March 1864. The cell was in the basement of the infamous Libby Prison in Richmond, Va., the capital of the Confederacy. A severe cough and cold racked his body. His cellmates were similarly affected. Ten in all, they were crammed into a dank, drafty cell not much larger than a common tent. Rebel guards provided Kingston and the others with nothing more than scraps of food for subsistence and an open bucket for a toilet. If some of the guards had had their way, the prisoners would be left to rot in the filth and cold of the converted brick warehouse.

Four of the cellmates were enlisted men of color, who were often abused, if not executed, by their Southern captors. But in the minds of the guards, the other six, including Kingston, had done something even more heinous: They were implicated in an alleged assassination attempt against the Confederate president, Jefferson Davis, and members of his cabinet.

The mysterious plot to take out the senior leadership of the South was uncovered in papers found during a Union cavalry raid on Richmond. The stated purpose of the coup de main was to free federal troops held in Libby Prison and the nearby Belle Isle camp.

Samuel Kingston sat for this portrait in the photographer Mathew Brady’s New York City studio, circa 1863. Collection of the author.

The raid began on the evening of Feb. 28, 1864. A column of handpicked troopers, 3,584 sabers strong, crossed the Rapidan River at Ely’s Ford, about 65 miles north of Richmond. A half-dozen artillery pieces and a few supply wagons and ambulances accompanied the cavalrymen.

The brain behind the audacious operation was a junior cavalry commander in the Army of the Potomac who worked back channels to sell the plan to the Lincoln administration. Hugh Judson Kilpatrick, a West Point-educated brigadier driven by reckless personal ambition, had a penchant for suicidal charges and pushing his troopers to exhaustion. “Kill Cavalry,” as he became known, had started his career as a horse soldier in the summer of 1861 when he was named lieutenant colonel of the Second New York Cavalry, an amalgamated regiment composed of recruits from New York, New Jersey, Connecticut and Indiana; six of its 10 companies hailed from the Empire State.

Kingston was a latecomer to the regiment. A meticulous bachelor who worked as a physician in the bustling community of Oswego, N.Y., he joined the Second as an assistant surgeon in May 1863. He had his baptism of fire during the seven-week-long Gettysburg Campaign, although the regiment did not fight in the eponymous three-day battle that broke an unprecedented streak of victories by Confederate Gen. Robert E. Lee and his Army of Northern Virginia.

Half a year later, Kingston mounted his horse and joined his comrades on Kilpatrick’s Raid. Word of the incursion arrived in lightly defended Richmond before the Yankees. The Confederate War Department mobilized an irregular force of soldiers, government workers and volunteers to resist the invaders.

Artist Edwin Forbes sketched Kilpatrick’s Raid to Richmond, circa Feb. 28 to Mar. 11, 1864.
Library of Congress

On Feb. 29, during the first full day of the raid, Kilpatrick divided his troops into two columns. He rode hard with the main body of about 3,000 men south to Richmond, while a second, smaller column of 500 men headed to Goochland, northwest of the capital.

Kingston and the rest of the Second were part of the smaller column. It was under the command of Ulric Dahlgren, a 21-year-old colonel and son of a career Navy officer, John Dahlgren. “Ully” spent his boyhood steeped in all things military, and distinguished himself in Union blue. He had led a successful reconnaissance raid into Confederate-held Fredericksburg on Nov. 9, 1862; later, at Gettysburg, he had suffered a severe wound in the foot that resulted in the amputation of a leg below the knee. Still, he soldiered on.

Kilpatrick and his men encountered Richmond’s outermost defenses on March 1 and found them stronger than anticipated. “Kill Cavalry” balked. He turned east and skirmished with Confederates while he waited for Dahlgren’s column to arrive.

Dahlgren, unaware of Kilpatrick’s withdrawal, continued on to Goochland and made a dash for Richmond. According to Lt. Col. Mortimer B. Birdseye of the Second, “This regiment has the honor of being the only Union regiment that passed the outer line of defenses surrounding Richmond during its occupation by Confederate forces.” But Dahlgren and his men ran into stiff resistance as they closed in on the capital. Casualties mounted, and Kingston went to work to save as many men as he could.

Dahlgren pressed to within two-and-a-half miles of the heart of the capital when the defenders finally broke his momentum. Dahlgren acted to save his command. “It soon got too hot, and he sounded the retreat, leaving forty men on the field,” stated one of Dahlgren’s aides, 2nd Lt. Reuben Bartley. Kingston, who was uninjured, remained with the wounded as Dahlgren and the survivors fled.

Ulric Dahlgren stands in “Studying the Art of War,” by photographer Alexander Gardner, circa June 1863. Library of Congress.

Dahlgren continued on. By now night had fallen, and in the confusion caused by the darkness and enemy activity the column became separated. One section eventually made its way back to Kilpatrick. The other section, under the command of Dahlgren, rode into an ambush arranged by about 150 Confederate cavalrymen and other local volunteers. They descended on the Yankee raiders. Dahlgren was struck and killed by four bullets, and the rest of his troopers were dispersed or captured.

Victorious Confederates found Dahlgren’s lifeless body and stripped it of clothing and valuables, including his wooden artificial leg. One man hacked off one of Dahlgren’s fingers to take a ring. Another, 13-year-old William Littlepage, came away with a cigar case, a memorandum book and a few papers.

Littlepage and his comrades read one of the papers with fascination. “Special Orders and Instructions” provided details about the raid. One statement stood out among the rest: “The men must be kept together and well in hand, and, once in the city, it must be destroyed and Jeff Davis and his cabinet killed.”

The papers were forwarded through military and political chains of command and ultimately to Davis. Publication of the contents days after they were discovered rocked Richmond. Calls for retribution and retaliation rippled across the South. The North promptly denied any assassination plans and declared the documents to be forgeries.

Dahlgren’s body, which had been unceremoniously dumped in a muddy grave near the place he fell, was disinterred and put on display in Richmond. “Large numbers of persons went to see it. It was in a pine box, clothed in Confederate shirt and pants, and shrouded in a Confederate blanket,” reported The Richmond Whig on March 8, 1864.

While this circus played out on the streets of the capital, Kingston and his white cellmates were informed that they had been condemned to death as felons for their role in the alleged assassination attempt. “This news appeared to have a very depressing effect on Dr. Kingston,” noted Lieutenant Bartley, a fellow prisoner.

Kingston’s cough and cold worsened, and he lost his appetite. On March 21, as he lay near death, the Confederates removed him from his cell and sent him North. He survived the trip home, and with good food and care came back to life. He eventually returned to the regiment, was promoted to full surgeon, and served in this capacity until the end of the war.

The Confederates never carried out the death sentence, which was most likely an idle threat by overzealous guards. But their ill treatment exacted a grim toll. According to Bartley, of the six officers imprisoned in the dungeon at Libby Prison, only three survived. He did not mention the fate of the four black soldiers.

Kingston was forever damaged by the ordeal. Back home in Oswego, he was frequently incapacitated by illness, and often doctored himself. His mental health appears to have suffered as well. An acquaintance described him as “a very odd & peculiar person.” Still, he managed to practice medicine and work as a druggist. A cerebral hemorrhage ended his life in 1889, at age 53. His wife, Anne, whom he had married in 1875, and two daughters survived him.

Follow Disunion at twitter.com/NYTcivilwar or join us on Facebook.


Sources: Samuel T. Kingston military service record, National Archives and Records Administration; New York Monuments Commission, “Final Report on the Battlefield at Gettysburg”; John Dahlgren, “Memoir of Ulric Dahlgren”; Philadelphia Inquirer, March 4, 1864; Frank Moore, “The Rebellion Record: A Diary of American Events”; Richmond Whig, March 8, 1864; The New York Times, March 10, 1864; The War of the Rebellion: A Compilation of the Official Records of the Union and Confederate Armies; Anne E. Kingston pension record, National Archives and Records Administration.


Ronald S. Coddington

Ronald S. Coddington is the author of “Faces of the Civil War” and “Faces of the Confederacy.” His most recent book is “African American Faces of the Civil War.” He writes “Faces of War,” a column for the Civil War News.