
Muhammad Ali and the Supreme Court Case that Redefined the Role of Sports Heroes in American Culture: Part 1

HNN    October 4, 2015

Over 40 years have passed since the Supreme Court of the United States decided the case of Clay, aka Ali, v. United States, which was argued before the Supreme Court Justices on April 19, 1971. On June 28th of the same year, the High Court ruled in favor of the petitioner Muhammad Ali (born Cassius Marcellus Clay Jr.), the boxing heavyweight champion of the world, who had been stripped of his title by various boxing commissions when he refused to be inducted into the U.S. Army. Ali claimed exemption because he was a minister of the Nation of Islam (also known as the Black Muslims). The fact that he did so while the U.S. was involved in a war — Vietnam — angered many people.

The idea that sports stars in the U.S. were infallible athletic gods walking among us mere mortals was always disputed by some, and with good reason. Ty Cobb, according to many accounts, was a virulent racist, and Babe Ruth spent most of his adult life in an alcoholic stupor. For decades, beer companies supplied free samples of their beverage to National Hockey League (NHL) players, and beer is not the ideal beverage if one is a professional athlete. Before the days of million-dollar contracts, the beer companies employed these same athletes as salesmen during their off-season free time; the Molson Brewing Company once owned the Montreal Canadiens. Since World War Two, National Football League (NFL) team owners have had to deal with the fact that gambling on NFL games happens, and in the long history of the league, NFL players have occasionally been found guilty of betting on their own team. Regarding gambling, Pete Rose, Major League Baseball (MLB)’s all-time hits leader, received a lifetime ban from the sport after an investigation found that he had bet on baseball games. For much of American boxing history, the sport was controlled by mobsters, who made sure that the outcomes of bouts were fixed beforehand.

Yet what Muhammad Ali stood for somehow superseded all of the above. He was a 6’4”, 235-pound bombastic personality named after the 19th-century abolitionist and anti-slavery newspaper editor Cassius Clay. His father, Cassius Marcellus Clay Sr., was a talented artist and sign painter who was proud of his black lineage. Odessa Clay, Ali’s mother, was born of mixed blood and was part Irish — and so, of course, is her famous son. Born a Christian, Clay converted to the Black Muslim faith 24 hours after he won the heavyweight championship of the world in 1964. At first he told reporters that he wanted to be known as Cassius X, but then amended that to the name Muhammad Ali. He began boxing at the age of 12, won the 1960 Olympic light heavyweight gold medal, and did not retire from the sport until he was badly beaten by Trevor Berbick in a December 1981 match held in the Bahamas. Until Parkinson’s Syndrome began to slow his speech (he actually began to show early signs of the disease at the time of the Berbick fight), Ali was always talkative and displayed a colorful and outgoing personality.

So during the 1960’s, here came a brash, young (he was only 22 when he won the heavyweight title), prolix man on the world stage. The fact that he publicly renounced Christianity and took up the Nation of Islam religion (in 1975, he would convert to Sunni Islam) at a time when the American power structure (legislative, judicial, presidential, media and press, corporate, military) was run either by Christians or Jews baffled Americans. At the press conference where Ali announced that he was a member of the Nation of Islam, he famously said, “I don’t have to be what you want me to be. I’m free to be what I want.” Influential newspaper sports columnists Jimmy Cannon and Red Smith belittled Ali in their columns.

The day after Ali returned from a trip to New York, where he had appeared at a Muslim rally with his then-good friend Malcolm X (Ali would later stop following Malcolm X’s beliefs and devote himself to Black Muslim founder Elijah Muhammad’s tenets), he “received a notice to report to the Armed Forces Induction center in Coral Gables, Florida to take a military qualifying examination,” wrote Howard L. Bingham and Max Wallace in their book, published in 2000, entitled Muhammad Ali’s Greatest Fight: Clay v. The United States of America. Bingham was Ali’s long-time confidant and personal photographer. Later, on March 20, 1964, Ali’s military aptitude test results were made public. He had failed the test, having particular trouble with the mathematical questions.

For once, the talkative Ali (who had barely graduated from the public high school that he attended in his native Louisville, Kentucky) was quiet; frankly, he was embarrassed by the disclosure that he flunked the test. “I said that I was The Greatest [a title he bestowed on himself previously], not The Smartest. When I looked at a lot of them questions, I just didn’t know the answers. I didn’t even know how to start about finding the answers,” confessed Ali.

All of this took place during the time of the civil rights movement for blacks and also the American military build-up in Vietnam. Both of these events created emotional turmoil for Americans, so Ali’s growing discovery of his true self (i.e., his religious conversion, and his inchoate reflections about the world), which he was always glad to share with reporters and audiences, made for yet another spicy ingredient in the American societal stew.

Federal Bureau of Investigation (FBI) Director J. Edgar Hoover, U.S. Senators, and others in the 1960’s federal government power structure refused to believe that Ali had failed his military aptitude test. When the FBI began an investigation, it found that Ali was at best a sub-par high school student. For some months Ali himself believed that all of this meant that he was stupid, but his former high school teachers, reporters, and others who knew him well have noted that he was a highly intelligent person. The military aptitude test was as flawed as the standard IQ test. Author Norman Mailer (who attended Harvard, and was certainly no mental midget) knew Ali well and told of Ali being wise and intelligent on a number of subjects. After Ali’s retirement from boxing, he acted as a Goodwill Ambassador and came to know numerous world leaders well, ranging from Cuba’s Fidel Castro to South Africa’s Nelson Mandela. He personally designed many of the buildings at his boxing training camp. Citizens, irate that Ali was preparing himself not to be drafted, wrote letters to President Lyndon B. Johnson asking him to do something about the situation.

A lesson that professional athletes learned from Ali is that, thanks to advances in communications, they can voice their comments and ideas on any topic and be heard throughout the world. Furthermore, the more famous and talented the athlete, the more people will react when he or she voices said comments and ideas. Thanks to a boom in satellite technology, and in other media and press technology, which began in the 1960’s, Muhammad Ali became the world’s first truly international sports star. People from Atlanta to Zanzibar could see Ali daily in television news reports and also watch his boxing matches. Ali became a hero to other famous black American athletes of the 1960’s (most notably football’s Jim Brown and basketball’s Lew Alcindor, who would become known as Kareem Abdul-Jabbar). They saw that — contrary to notable American black athletes of the past — they were free to offer their opinions on anything they wanted. Both Brown and Abdul-Jabbar also took note that Ali spent much of his free time doing charity work and other altruistic activities, and both began to do so as well.

In April of 1964 Ali went on a tour of numerous countries in Africa. The tour had been scheduled before his military draft imbroglio. Tens of thousands of Africans came out from their homes, shops, and places of work to see and hear the boxing heavyweight champion of the world. By a strange twist of fate, Ali happened to notice Malcolm X, from a distance, walking in a city square in Ghana. He did not attempt to get Malcolm X’s attention, for by this time their friendship was null and void; Ali had decided to follow the beliefs and tenets of Nation of Islam founder Elijah Muhammad rather than those of Malcolm X. As Ali was preparing to defend his heavyweight title against Sonny Liston in a 1965 rematch (which Ali would win by a knockout in the first round), Malcolm X was publicly predicting to CBS-TV’s Mike Wallace and other reporters that he, Malcolm X, would be assassinated due to his conflicts with Elijah Muhammad. In 1965 he was, and ever after Ali had feelings of remorse about their failed friendship. After Malcolm X’s murder, five FBI agents were assigned to bodyguard Ali.

Numerous polls taken during the mid-1960’s show that the majority of Americans supported the U.S. military activities in Vietnam, but, ever so slowly, this was beginning to change. President Johnson announced that 17,500 more men would be drafted and, additionally, ordered another 50,000 troops assigned to Vietnam. In November, the Pentagon issued a directive under which any person who scored 15 or higher on a military induction test could be eligible to be drafted. As Ali’s score was 16, this meant that, by a single point, he could now be drafted. Numerous prominent athletes of the 1960’s served in the military; the most notable was Roger Staubach, who won the 1963 Heisman Trophy after successfully quarterbacking the U.S. Naval Academy to a winning season. So Ali’s upcoming refusal to be drafted was something that U.S. citizens of all creeds, races, and religions were thinking about.

This simple fact — that a prominent athlete, by his conduct outside of his workplace (in Ali’s case, a boxing ring), was virtually forcing a country’s people to confront a major issue of enormous controversy — was, and still is, quite rare. Ali was, in essence, defying the federal government and the military during a war.


Mark Weisenmiller is a Florida-based author/historian/reporter. Previous employers include United Press International (UPI); Deutsche Presse Agentur (DPA); Inter Press Service (IPS); The Economist, and the Xinhua News Agency (XNA). He is currently at work on a non-fiction book of reportage about China, which will be the second in a planned series of non-fiction books of reportage about the countries and regions of the world.


Muhammad Ali and the Supreme Court Case that Redefined the Role of Sports Heroes in American Culture: Part 2 

HNN   October 11, 2015

Muhammad Ali was boxing heavyweight champion of the world for much of the 1960’s. During this decade he was admired internationally, but not in his native country, the United States. The chief reason for this was his vocal opposition to serving in the U.S. Army, or any other branch of the military, at a time that, as fate would have it, coincided with the U.S. military intervention in Vietnam.

Ali had been controversial ever since becoming famous. This applies both to his boxing style (in which he moved away from his opponent’s punches and specialized in moving laterally, rather than the conventional method of moving toward an opponent’s punches and vertically) and to his behavior outside the ring (such as his proclamation that he was renouncing Christianity and his given name and was joining the Nation of Islam). Looking to Ali as an example, more and more athletes the world over, and especially American black athletes, began to become influential members of society. No longer would athletes be silent automatons mindlessly providing sports entertainment.

While in Miami in 1966, awaiting word from his draft board on when to report for induction, Ali was told by a news wire reporter that he was eligible for the draft. Not long afterwards, many television news reporters arrived in their station trucks, parked outside Ali’s home, and began insistently asking him to step outside and make a statement. What happened next was, and still is, unclear. Those of us reporters who have covered stories in which numerous reporters place numerous microphones in front of an interviewee know that, despite technological advances, something can be said and not fully understood. That is what happened with Ali. Reporters were asking him many questions, and he clearly began to lose his temper. After he was asked “What do you think of the Viet Cong?” many reporters quoted him as saying “I ain’t got no quarrel with them Viet Cong.” However, Robert Lipsyte of the New York Times and some other reporters who were present noted that Ali answered with “I ain’t got nothing against them Viet Cong.” In either case, whatever Ali said began a series of socially and politically vindictive attacks.

Here we have another first in this two-part story that resonates with today’s times. If the reader is a professional athlete, the lesson is the following: be careful and deliberate about what you say in public and, furthermore, be honest and sincere in said speech. With the gift of hindsight, we now know that Ali did not do the first, but did do the second. Also, whatever one’s opinion of Ali and his refusal to be drafted, one cannot deny his courage in standing up for his religious convictions. It would have been very, very easy for him to simply move to Canada to live and avoid the draft (as thousands of men did) — and thus be able to obtain boxing licenses in other countries and fight for millions of dollars — but as Ali often said, “The United States is my home country. I don’t run away from home.”

Something else needs to be recorded here, even though the following is slightly off our narrative: many Americans — such as liberals, Democrats, and especially hippies — took up Ali’s cause with gusto, but Ali frequently did not reciprocate their feelings. For example, the piously religious Ali (he has never smoked or drunk a drop of alcohol in his life, and, as per Muslim custom, he avoids all pork products and prays five times daily) was repulsed by the hippies’ fondness for recreational alcohol and drugs. Even though Ali is now quite aware that due to his Parkinson’s Disease he must take medications, he still, after all of these years, dislikes taking these medications and also putting any sort of chemicals into his body. Many “long-hairs” (to use a popular word of the 1960’s and 1970’s) spent much of their time doing the polar opposite. Ali strongly disliked long hair on men and scorned men who burned their draft cards. Even when he spoke before audiences composed mostly of young people, he was always well groomed (he has always been narcissistic about his appearance) and wore a well-cut suit and matching tie.

After Ali heard black leader Stokely Carmichael say “ain’t no Viet Cong ever called me nigger,” Ali borrowed the saying and modified it for himself as “No Viet Cong ever called me nigger.” The Illinois Athletic Commission (which issued boxing licenses in that state) ordered Ali to appear before it and publicly apologize for his anti-war remarks. Usually Ali avoided such orders, but this time he did appear before the commission and publicly refused to apologize.

Thus another lesson is to be learned from this complicated story. To wit: if a sports commission tries to mandate how an athlete conducts his or her personal life, the commission is likely to face criticism.

In February of 1966, Ali’s attorneys filed their famous client’s first request for military draft exemption status. The exemption was based mostly on narrow, technical legal grounds. Three weeks later, however, in mid-March, the lawyers adopted a new legal tactic. They argued that since Ali was a minister of the Nation of Islam, and since, as per the Holy Koran, pious Muslims could fight only in holy wars, Ali should be exempted from the draft. To many Americans, this latter legal tactic sounded dubious. How, they wondered, could Ali proclaim that his religious belief in international brotherhood and peace made him exempt from the military draft when he beat people up for a living? This particular draft exemption was denied, and his team of lawyers filed an appeal. However, as per federal law, before the appeal could be heard (before a state appeal board), the U.S. Justice Department had to review the case and decide whether or not Ali was sincere in his beliefs. A retired judge named Lawrence Grauman heard the case.

Ali’s fate rested in this judge’s hands. To most people’s surprise, but not to Ali’s, Grauman ruled in Ali’s favor. “I recommend that the registrant’s claim for conscientious objector status be sustained,” wrote Grauman. Despite the ruling, the federal government pressed onward, ordering Ali to report for military induction in Houston, where he had moved to lead a mosque.

On April 28, 1967, Ali went. When an Army officer said, “Mr. Cassius Clay, you will please step forward and be inducted into the United States Army,” Ali refused to do so. “Furthermore, Ali faced imprisonment for his action and was barred from boxing while his case was litigated. He called himself ‘The People’s Champion’ and continued to be recognized as the world heavyweight title holder in Great Britain and Japan,” reads a paragraph of Ali’s biography in the 1999 reference book, The Boxing Register International Boxing Hall of Fame Official Record Book.

From this time, the late 1960’s, to today, athletes would no longer mindlessly do what their bosses and other well-established institutions (military, political, religious, etc.) told them to do if they disagreed. On top of that, if these athletes refused, they would try to make their points in the courts. St. Louis Cardinals star outfielder Curt Flood’s U.S. Supreme Court case (which he would lose), arguing that Major League Baseball’s reserve clause was illegal, is but one example.

From this point onwards, Muhammad Ali was considered a pariah by millions of Americans. Denied the right to make a living in his home country, he did all sorts of things: spoke for fees on college campuses, starred in a Broadway musical titled “Buck White” (where he surprised all by displaying a very melodic and pleasant singing voice), and did pro bono work for charities. He continued to make his case to anybody who cared to listen. The day of the quiet, taciturn sports star was over. Singer-songwriter Paul Simon neatly captured frustrated Americans’ views about pushy athletes with the line, “Where have you gone, Joe DiMaggio? Our nation turns its lonely eyes to you,” in the 1968 song “Mrs. Robinson.”

Ali’s case wound its way upwards through the judicial system all the way to the Supreme Court of the United States after the Fifth Circuit affirmed his June 20, 1967 conviction (on a felony charge of refusing to be drafted). He remained free on appeal. From March 1967 to October 1970, due to his military draft problems, he was inactive in boxing. The case reached the Supreme Court in January of 1971, and Justice William Brennan convinced his colleagues to grant certiorari (approval to hear the case). As Justice Thurgood Marshall had been Solicitor General when Ali was originally convicted, he recused himself. (Another reason he did so, known to his colleagues and their respective law clerks but less well-known to the general public, was that he despised the Black Muslims.)

In their 1979 book The Brethren: Inside The Supreme Court, Scott Armstrong and Bob Woodward write that “On Friday, April 23… the [Supreme Court Justices’] conference decided, 5 to 3, that it agreed with [Solicitor General Erwin N.] Griswold. Ali was not really a conscientious objector and should go to jail.” Yet Ali didn’t, thanks to Justices John Harlan and Potter Stewart (though Ali didn’t learn this for years).

Harlan was assigned to write the majority opinion by Chief Justice Warren Burger, but before he did so, Harlan (who had served in the military during World War II) read the Nation of Islam treatise Message to the Blackman in America at the suggestion of his law clerks. It stated that Black Muslims could fight holy wars, but the fact that Ali obviously disapproved of ALL wars convinced Harlan to change his vote. This deadlocked the Justices’ vote at four for conviction and four for Ali’s freedom. If the Court stayed deadlocked, Ali would go to jail, and since it is long tradition that deadlocked cases do not come with written legal opinions, he would never really know why he had lost the case.

Justice Stewart came up with a solution: he and his law clerks discovered that a state appeals board had given no reason for its denial of Ali’s conscientious objector status. With this in mind, and considering that there are three legal grounds a claimant must meet for conscientious objector status, it would be impossible to determine on which of the three grounds the U.S. Department of Justice had decided to proceed with its case against Ali. Therefore, went this legal argument, Ali should go free. That is the conclusion the eight participating Supreme Court Justices reached, in a unanimous 8-0 decision.

Ali heard the news that he had won when he was shopping in a grocery store in Chicago and a grocery clerk came over and hugged him and told him the news. Ali then thanked Allah and the Supreme Court, in that order, then immediately went to a South Side gym to work out.

Angelo Dundee, Ali’s life-long boxing trainer, was interviewed many times by this reporter and, when reflecting on Ali’s career, told me, “We never saw Muhammad Ali at his peak. He was out of the ring for three and a half years and those three and a half years [in Ali’s case, when he was just short of age 25 to the age of 28] are primary years for most boxers. Who knows what he could have done?” Herewith our final lesson: whenever a prominent athlete takes issue with a government agency — or worse, as in Ali’s case, the federal government and the military — he or she will somehow, someway be punished—even if the punishment isn’t just.

SOURCES FOR THIS TWO-PART STORY

Websites: www.aavw.org; www.oyez.org; www.scotus.com; www.hbo.com.

Books: “Muhammad Ali’s Greatest Fight: Clay v. The United States of America” by Howard L. Bingham and Max Wallace; “The Boxing Register International Boxing Hall of Fame Official Record Book, 1999 Edition”; “The Brethren: Inside The Supreme Court” by Scott Armstrong and Bob Woodward; “The Muhammad Ali Reader,” edited by Gerald Early; “Muhammad Ali: The Greatest” by John Hennessey; “The Greatest: My Own Story” by Muhammad Ali with Richard Durham; “Muhammad Ali: The Greatest Of All Time” by Robert Cassidy; “King of the World: Muhammad Ali and the Rise of an American Hero” by David Remnick.



Getting Right with Brown

Brown v. Board team. (Photo: NAACP Legal Defense Fund)

For over sixty years, no matter where you stand on the constitutional spectrum, you have had to get right with Brown v. Board of Education. Decided sixty-one years ago this coming May 17, Brown is one of the best-known decisions of the U.S. Supreme Court, one of the Court’s most beloved – or at least well-regarded – decisions, and a key juncture in the development of American constitutional law.

There are several reasons why Brown should matter that much.

First, Brown was a watershed decision by the Supreme Court, putting an end, at least on paper, to nearly sixty years of “separate but equal” as a constitutional rule governing access to public facilities and accommodations. Ever since the 1896 Plessy v. Ferguson decision, in which the Court established the “separate but equal” rule as a guide to interpreting the Fourteenth Amendment’s equal protection clause, a central goal of the NAACP’s Legal Defense Fund (usually called the “Inc Fund”) was to end “separate but equal.” For years, Thurgood Marshall led the Inc Fund in combating “separate but equal” by applying legal ju-jitsu to the rule: if facilities were not equal, they could not be separate. If they were unequal and the state insisted on separation, the state had to create a whole new facility equal to the segregated facility for African-Americans to use. Thus, in a lawsuit requiring the University of Oklahoma to integrate its law school, the Court held that a roped-off desk in the Oklahoma Supreme Court’s library was not an equal law school for the African-American who had been admitted to the University of Oklahoma’s law school. Either the state had to create a new law school matching the existing one lecture-hall for lecture-hall, library for library, moot-court society for moot-court society, brick for brick, or it had to integrate its existing law school and admit the black student. Thurgood Marshall had tired of this incremental game, realizing that segregationists would apply legal ingenuity to create new ways of segregating so that the Inc Fund would have to fight each one, step by step. Thus, Marshall concluded, it was time to “go for the whole hog” and mount a head-on attack on segregation as inherently unequal.

Second, Brown was a triumph for public-interest lawyering. Marshall and his colleagues at the Inc Fund had won, at least on paper, an epochal victory for equality before the law. It would encourage lawyers taking on many other kinds of cases – for women’s equality, for equality of gays and lesbians, to name just two categories – and to use American constitutional law as an instrument of reform. In particular, when political processes were unresponsive to the growing demand for embracing racial equality, lawsuits seeking judicial action would prove to be an effective and versatile tool of forcing social change.

Third, Brown was a test of the Supreme Court and the lower federal courts. It opened the door for a generation of litigation and appeals focusing on defining what the commands of the original Brown decision meant and should mean. Brown launched an era of judicial intervention in school governance, in public accommodations, and in other areas of law. The courts would superintend the ways that an entire society treated the black and white races. No longer could discrimination continue in schools or in other forms of public accommodations without having to meet the scrutiny of courts and judges using the equal-protection clause as a yardstick.

Fourth, Brown was a test of the Constitution itself, and of ways to interpret it. The debate sparked by Brown (and the line of cases following and developing its holdings) focused on the Court’s interpretation of the Fourteenth Amendment and its history. The Court had decided that the passing of time and the evolution of values might render a rule of constitutional interpretation no longer valid. Scholars debated whether the Court had overreached in deciding Brown as it had. Some emphasized the need for “neutral principles” of constitutional law as the only sound basis for sweeping constitutional change via courts – and disputed whether Brown was based on such principles. Some emphasized the need for judicial prudence and self-restraint in exercising judicial review – and disputed whether Brown had been consistent with or in gross violation of such judicial prudence and self-restraint. Some insisted that the Court had to be bound by the original intent of the framers of the Fourteenth Amendment, while others argued that an originalist methodology of constitutional interpretation needlessly froze the Constitution as of 1868. Many disputes still roiling the waters of American constitutional jurisprudence can trace their roots to the dispute over Brown.

At the same time, a fifth significance of Brown is that the decision found surprisingly swift acceptance by many Americans as just, symbolizing the Court’s role in American life as distilled by the inscription over the front door of the Supreme Court Building: EQUAL JUSTICE UNDER LAW. The decision signaled a major shift in public opinion about how the nation ought to treat African-Americans and a major public reconceptualization of the Court itself, one that to some degree is still with us. One source of the anger that many Americans feel against the Supreme Court’s recent decisions on gun rights and campaign finance is the disparity that they see between such decisions and what the Court achieved in Brown.

On May 17, 1954, the announcement of the Court’s unanimous decision of Brown v. Board of Education set off a constitutional earthquake that shook all of American society and law. That earthquake still reverberates among us, as it enters its seventh decade – and we all should remember it.

About the Author

R. B. Bernstein

R. B. Bernstein teaches at City College of New York’s Colin Powell School and New York Law School; his books include Thomas Jefferson (2003), The Founding Fathers Reconsidered (2009), the forthcoming The Education of John Adams, and the forthcoming The Founding Fathers: A Very Short Introduction, all from Oxford University Press.


Why Naming John Marshall Chief Justice Was John Adams’s “Greatest Gift” to the Nation

Harlow Gikes

HNN   November 16, 2014

As the final hours of John Adams’s short-lived administration ticked away, the President faced a critical last decision: the nomination of a new Chief Justice of the United States Supreme Court. Former Connecticut senator Oliver Ellsworth, who had helped write the Constitution, was ill and had resigned as the nation’s third Chief Justice.

Instinctively, the President turned to his old friend, New York’s John Jay, whom George Washington had appointed as the nation’s first Chief Justice in 1789. After five years, Jay was so bored with the job he resigned to become governor of his home state—and with good reason: The Supreme Court was not an important element of the American government, and its members had little or nothing to do.

The Constitution and four of ten amendments in the Bill of Rights had shorn the federal judiciary of power and left the Supreme Court a relatively impotent appellate court, with almost no original jurisdiction. By 1800, when Adams searched for a new Chief Justice, the Court had issued only eleven decisions during the federal government’s 11-year existence—one a year. There simply weren’t enough federal laws on the books to provoke much legal activity, and most Americans were more intent on plowing land than filing lawsuits.

Secretary of State John Marshall was in the President’s office when Jay’s letter of refusal arrived. Like Adams and Jay, the 45-year-old Marshall was a fervent Federalist intent on thwarting the radical changes in government that the anti-Federalist President-elect Thomas Jefferson was planning. In effect, Jefferson sought nothing less than a populist revolution, shifting power from the federal government to the states and extending the vote—then limited to property owners of means—to all white adult males. With Jefferson’s followers a majority in Congress, nothing stood in Jefferson’s way but the judiciary, and Marshall urged Adams to appoint as many Federalist judges as possible to frustrate Jefferson’s schemes.

Born and raised in Virginia, Marshall had fought heroically in the Revolutionary War—at Trenton, Brandywine, and Monmouth—and shivered through the bitter winter at Valley Forge. After the war, he studied law, became one of Richmond’s most prominent lawyers, and a fervent champion of constitutional ratification. After Vice President Adams won the presidency in 1796, he sent Marshall to Paris to help negotiate an end to the Franco-American naval conflict then raging in the Caribbean. Marshall’s tough negotiating skills earned him a hero’s welcome on returning to America—along with election to Congress in the Federalist sweep of 1799 and appointment as Secretary of State, then the second most important federal post.

“When I waited on the President with Mr. Jay’s letter declining the appointment,” Marshall recalled, “the President asked thoughtfully, ‘Whom shall I nominate now?’

“I replied that I could not tell.

“After a moment’s hesitation, he said, ‘I believe I must nominate you.’

“I had never before heard myself named for the office and had not even thought of it. I was pleased as well as surprised and bowed in silence. Next day I was nominated.”

From the first, Marshall saw the High Court as a bulwark against executive and legislative tyranny, with the Constitution as the Court’s primary weapon.

“He hit the Constitution much as the Lord hit the chaos, at a time when everything needed creating,” legal scholar John Paul Frank said of Marshall. “Only a first-class creative genius could have risen so magnificently to the opportunity of the hour.”

Like Moses, Marshall climbed the Mount and thundered commandments to those in government, asserting what “thou shall” and “shall not” do. Both Presidents Washington and Adams had violated constitutional restrictions on their power—each initiating wars without congressional authorization and, in Washington’s case, borrowing funds to finance government operations.

Jefferson planned even more radical usurpations of power—the replacement of judges appointed for life with anti-Federalist jurists who supported the Jefferson political program. Jefferson’s first victim was William Marbury, one of President Adams’s last-minute judicial appointees. When Marbury demanded that Secretary of State James Madison deliver the commission Adams had signed, Jefferson ordered Madison to withhold it while he found an anti-Federalist replacement.

Incensed, Marbury asked the Supreme Court for a court order, or writ of mandamus, to force Madison to give Marbury his commission. In 1803, John Marshall stunned Jefferson and the nation by declaring the President and Secretary of State in violation of the law. Under British rule, the king could do no wrong, Marshall conceded, but under the American Constitution, the President remained a citizen like every other American—subject to the law like every other American. A President of the United States had appointed Marbury to the bench and signed his commission, and a Secretary of State had embossed it with the Seal of the United States. For Jefferson and Madison to withhold the commission was a crime.

In a second, even more consequential ruling, however, Marshall refused to issue Marbury his writ, explaining that the Supreme Court was an appellate court, not a court of original jurisdiction, and that Marbury should have applied first to lower courts for the writ. Marbury cited a provision of the Judiciary Act of 1789 that specifically allowed plaintiffs to bypass lower courts in seeking writs of mandamus. As onlookers gasped, Marshall then declared the provision unconstitutional.

It was a declaration of historic proportions: For the first time in American history, the Supreme Court had exercised the power of judicial review and declared a federal law unconstitutional. Unmentioned in the Constitution, judicial review was John Marshall’s creation, asserting Supreme Court power to declare any law—federal, state, or local—unconstitutional.

Marbury v. Madison was one of nearly 1,200 decisions Marshall’s court would deliver during his thirty-five years as Chief Justice. The longest serving Chief Justice in American history, he wrote nearly half the decisions himself, effectively appending them to the Constitution to form “the supreme law of the land” as a bulwark against tyranny by ambitious executives and legislators.

John Adams called his appointment of John Marshall as Chief Justice “the proudest act of my life.”

Harlow Giles Unger is the author of more than twenty books on the Founding Fathers and early American history. His latest book is “John Marshall: The Chief Justice Who Saved the Nation,” just published by Da Capo Press, a member of the Perseus Books Group.

Read Full Post »


The Civil Rights Project    May 15, 2014

Segregation Increases after Desegregation Plans Terminated by Supreme Court

LOS ANGELES: Marking the 60th anniversary of the landmark U.S. Supreme Court decision Brown v. Board of Education, UCLA’s Civil Rights Project/Proyecto Derechos Civiles (CRP) assessed the nation’s progress in addressing school segregation in its new report released today, Brown at 60: Great Progress, a Long Retreat and an Uncertain Future. The report finds that the vast transformation of the nation’s school population since the civil rights era includes an almost 30% drop in white students and a near quintupling of Latino students.

Brown at 60 shows that the nation’s two largest regions, the South and West, now have a majority of what were called “minority” students. Whites are only the second largest group in the West. The South, always the home of most black students, now has more Latinos than blacks and is a profoundly tri-racial region.

The Brown decision in 1954 challenged the legitimacy of the entire “separate but equal” educational system of the South, and initiated strides toward racial and social equality in schools across the nation. Desegregation progress was very substantial for Southern blacks, in particular, says the report, and occurred from the mid-1960s to the late 1980s.

The authors state that, contrary to many claims, the South has not gone back to the level of segregation before Brown. It has, however, lost all of the additional progress made after 1967, but is still the least segregated region for black students.

Since the 1990s, the Supreme Court has fundamentally changed desegregation law, states the report, and many major desegregation plans have ended. CRP’s statistical analysis shows that segregation increased substantially after desegregation plans were terminated in many large districts including Charlotte, NC; Pinellas County, FL; and Henrico County, VA.

“Brown was a major accomplishment and we should rightfully be proud. But a real celebration should also involve thinking seriously about why the country has turned away from the goal of Brown and accepted deepening polarization and inequality in our schools,” said Gary Orfield, co-author of the study and co-director of the Civil Rights Project. “It is time to stop celebrating a version of history that ignores our last quarter century of retreat and begin to make new history by finding ways to apply the vision of Brown in a transformed, multiracial society in another century.”

This new research affirms that the growth of segregation coincides with the demographic surge in the Latino population. Segregation has been most dramatic for Latino students, particularly in the West, where there was substantial integration in the 1960s but segregation has soared since.

The report stresses that segregation occurs simultaneously across race and poverty. The report details a half-century of desegregation research showing the major costs of segregation, particularly for students of color and poor students, and, conversely, the variety of benefits offered by schools with student enrollment of all races.

Among the key findings of the research are:

  • Black and Latino students are an increasingly large percentage of suburban enrollment, particularly in larger metropolitan areas, and are moving to schools with relatively few white students.
  • Segregation for blacks is the highest in the Northeast, a region with extremely high district fragmentation.
  • Latinos are now significantly more segregated than blacks in suburban America.
  • Black and Latino students tend to be in schools with a substantial majority of poor children, while white and Asian students typically attend middle class schools.
  • Segregation is by far the most serious in the central cities of the largest metropolitan areas; the states of New York, Illinois and California are the top three worst for isolating black students.
  • California is the state in which Latino students are most segregated.

The report concludes with recommendations about how the nation might pursue making the promise of Brown a reality in the 21st century: providing equal opportunity to all students regardless of race or economic background.

“Desegregation is not a panacea and it is not feasible in some situations,” said co-author Erica Frankenberg, assistant professor at Pennsylvania State University. “Where it is possible–and it still is possible in many areas–desegregation properly implemented can make a very real contribution to equalizing educational opportunities and preparing young Americans to live, work and govern together in our extremely diverse society.”

Brown at 60 is being released from New York University’s Metropolitan Center for Research on Equity and the Transformation of Schools, where Orfield delivers the keynote address for Brown 60 and Beyond on Friday, May 16, 2014. The report includes various tables showing segregation state-by-state and can be found here.

About the Civil Rights Project at UCLA

Founded in 1996 by former Harvard professors Gary Orfield and Christopher Edley, Jr., The Civil Rights Project/Proyecto Derechos Civiles is now co-directed by Orfield and Patricia Gándara, professors at UCLA. Its mission is to create a new generation of research in social science and law on the critical issues of civil rights and equal opportunity for racial and ethnic groups in the United States. It has monitored the success of American schools in equalizing opportunity and has been the authoritative source of segregation statistics. CRP has commissioned more than 400 studies, published more than 15 books and issued numerous reports from authors at universities and research centers across the country. The U.S. Supreme Court, in its 2003 Grutter v. Bollinger decision upholding affirmative action, and in Justice Breyer’s dissent (joined by three other Justices) to its 2007 Parents Involved decision, cited the Civil Rights Project’s research.

 

Read Full Post »

Church and the State in America: a Brief Primer

Ira Chernus

HNN      May 5, 2014


Congregational Church, Chicago, Illinois

The Supreme Court has ruled, 5-4, that Greece, New York, can open its town meetings with a prayer, even though nearly all the prayers have contained distinctively Christian language. No doubt advocates and critics of the opinion are scouring American history, looking for proof that their view is correct.

If they look with an unjaundiced eye, they’ll quickly discover one basic principle: Whatever position you hold on this issue, you can find some support in our nation’s history. So history alone cannot resolve the ongoing debate. But it can help inform the debate.

To understand that history we have to begin in the European Middle Ages, when the Roman Catholic Church held sway over the religious life of almost all western Europeans. Politically, each area was usually ruled by a single monarch. Since “Church” and “state” were both monolithic institutions, it made sense to talk about “church-state relations” quite literally.

In principle, both sides usually agreed that the state ruled over the affairs of this world and the church ruled over the affairs of the soul as it headed toward the next world. In practice, though, each side often tried to extend its power over the other.

When the Protestant reformation came along in the 16th century, it refuted the Catholic church’s claim to control other-worldly affairs. But it did not challenge the basic idea that each area should have one secular ruler and one established church, and the two should live side by side, each respecting the other’s domain. So tensions between church and state inevitably continued.

Since nearly all the early European colonists in what would become the United States were Protestants, they brought that Protestant view with them. Different denominations had majorities in the various colonies, and each had its own model of church-state relations.

But nearly everyone assumed that it could make sense for a colony to have one established church, which would have special privileges from and influence upon the colony. Most of the colonies did, in fact, have established churches.

By the early 1700s, though, the colonies were filling up with immigrants from different places who held different religious views. So the established churches everywhere had to tolerate dissent from the official religion, to a greater or lesser degree.  At the same time, the colonies were experimenting with all sorts of different political structures.

Thus “church” and “state” were no longer monolithic entities as they had been in medieval times. Gradually, the term “church” became a code word for religion in general, including the many different religious beliefs and practices held by different groups and individuals. And the term “state” became a code word for the many various political structures — town, city, county, colonial legislature, royal council, etc.

Things got more complicated in the 18th century as people found their identity based less in fixed social institutions and more in open-ended individual conscience. The Enlightenment philosophers taught that religion was a matter of private belief and individual relationship with God. They also taught that every individual was free to choose their own political views and that the state should base its policies on the will of the majority.

A large Christian revival movement called the Great Awakening reinforced the idea that religion is a matter of inner experience and personal relationship with God more than membership in a church. So the Enlightenment and the Awakening combined to promote individualism and the notion of religion as a private matter.

By the time of the American Revolution, then, there was a complex triangular structure, with private individuals, political institutions (“state”), and religious institutions (“church”) all interacting. So the term “church-state relations” meant, more than ever, an endlessly complex set of changing relations among all the different forms of religious and political life.

But there was a growing belief in the colonies that the private individual had highest priority, that the main role of the state was to protect the individual’s rights, including the right to decide on one’s own religion.

The colonists who joined the Revolution against England all agreed on one thing: the English political system was a tyranny, and the Church of England was part of that tyranny. So there was growing fear of the very idea of an established church.

It was only natural, then, that the new United States would want to protect its citizens from an established church. So the first words of the Bill of Rights said that “Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof.”

But there was no clear agreement then, as there is none now, about exactly what those words mean.

Some see the two clauses making two opposite points. “No law respecting establishment of religion” makes it illegal to force people to practice a religion; “no law prohibiting free exercise” makes it illegal to stop people from practicing religion. The “no establishment” clause protects the people and the government from religion. The “free exercise” clause protects religion from the government and the will of the majority.

But some say that both clauses actually make the same point: They both protect individuals from the federal government. The government cannot impose a religious institution on any individual, nor can the government restrict any individual’s religious life. In fact some religious institutions supported the 1st amendment when it was ratified and refused to take any support from the government because they feared such support would entitle the government to impose controls upon them.

The debate about the meaning of the 1st amendment and the intentions of the founders still rages on because they did not bequeath to us any single consistent view on church and state. They all claimed to be Christian. But they had many different ideas of what it meant to be Christian. Each individual could hold what we might see as contradictory views and practices.

To take one important example: Thomas Jefferson created the image of a “wall of separation between church and state” and wrote powerfully about the need to protect the religious freedom of every individual. Yet in the Declaration of Independence he based the entire political philosophy of the new nation on the idea that all men are endowed by their Creator with certain unalienable rights. Without God, Jefferson’s whole political philosophy makes no sense. Jefferson was also devoted to the teachings of Jesus, but only as he understood them; he even created his own version of the Gospels. Jefferson also supported, on occasion, legislation to create public prayer days and to punish people who broke Sabbath laws.

If we cannot expect logical consistency even from Thomas Jefferson, we certainly can’t expect it from the founding fathers as a group.

The 1st amendment was the product of political compromise among the founders. So perhaps it is best to see it as the beginning of a conversation or debate about the relation of political and religious life. Perhaps many of the founders knew that all they could agree on was the need to continue the debate.

Though the founders disagreed on what it meant to be Christian, they all assumed that some version of what each one saw as the “basics” of Christianity was more or less necessary as a foundation of an orderly society. Most of them assumed that Christian values were the basis of political liberty.

Even those who were wary of Christian bias would probably have agreed with Justice Anthony Kennedy, who wrote the majority opinion in the recent Greece case:

“Prayer is but a recognition that, since this nation was founded and until the present day, many Americans deem that their own existence must be understood by precepts far beyond the authority of government to alter or define.”

So most of the founders saw no contradiction between the federal government guaranteeing freedom of religion and the states having established churches that could get special privileges from government, provide prayers for political occasions, and dictate the teaching of religion in schools.

But by the late 18th century all the states had so much diversity that the power of established churches was rapidly fading. Massachusetts was the last state to end its established church, in 1833. By the 19th century, then, Americans did not merely believe in the right to dissent from the dominant church. They assumed that there would no longer be any dominant church.

Yet the 19th century was dominated by one religious view: evangelical Christianity. Evangelicals emphasized individual experience as the basis of religion. So religion became, more than ever, a matter of individual choice, which led to the creation of many new churches. But the evangelical fervor also strengthened the idea that all Christians share basic values in common, and that these were the core values of the American way of life — a view that would surface again in some 20th century Supreme Court decisions.

For evangelicals, the “wall of separation” meant that everyone was free to influence the government as much as possible according to their own version of Christian values, with the goal of making America the kingdom of God on earth. For some that meant causes we would consider liberal, like free public schools for all and the abolition of slavery. For some it meant causes that we would call conservative, like prohibition of alcohol and teaching the Bible in public schools. Many felt comfortable supporting all these reform movements.

From the 1840s on, large waves of Catholic immigrants came to the U.S. They learned to accept religious pluralism and reject the old Catholic tradition of one universal church for everyone. But they created their own schools, raising new questions about state support for religious education. These problems, like nearly all problems of church and state in the 19th century, were dealt with at the local and state levels.

After the Civil War, the 14th amendment made all states subject to rule by the federal constitution, opening the way for federal courts to apply the 1st amendment and rule on church-state issues. In 1879 the Supreme Court issued its first opinion directly dealing with church and state. It ruled that the government could forbid Mormons from practicing polygamy. The Court cited words written by Jefferson indicating that the wall of separation prevents the government only from controlling religious beliefs. But the government could forbid behaviors it deemed harmful to society.

However it was not until the 1940s that the Supreme Court began addressing the church-state question in earnest. By that time the federal government was playing a much larger role in the life of every American, while a slowly rising tide of secularism was undermining the notion of America as a Christian nation. For growing numbers of Americans, “the American way of life” meant a dedication to pluralism, diversity, and the fullest protection of individual rights. These factors combined to bring many issues related to religion before the Court.

In 1940 the Court took on the case of Jehovah’s Witnesses who argued they should be able to go door to door without a state license. The Court agreed, declaring for the first time that the 1st amendment’s “free exercise of religion” clause applied to local and state governments as well as the federal.

In the same year, though, a group of Jehovah’s Witnesses argued that their children should not be required to salute the flag in school because it violated their free exercise of religion. The Court ruled against them. Then two years later, in an almost identical case, it ruled that the Jehovah’s Witness children did not have to salute the flag.

Why the abrupt turnaround? There is some evidence that the Court was influenced by a wave of criticism of its first decision from scholars and newspapers, and also by dismay over a wave of anti-Jehovah’s Witness prejudice after the first ruling. This case reminds us that the Court is never making its decision in some abstract realm of pure legal rationality. It is always, to some extent, a barometer of the climate of public opinion.

In the Everson case of 1947, taxpayers argued that their town, which paid for children’s bus rides to public school, should not pay for Catholic children’s bus rides to Catholic school. Writing for the majority, Justice Hugo Black penned a famous, stirring defense of the wall of separation, arguing that the 1st amendment’s “no establishment of religion” clause applied to local and state as well as federal law. This became an accepted principle of later Court cases. Yet Black and the majority decided in favor of the Catholic children getting public money because it was going to them as individuals, not to the church.

This case, and the Court’s reversal in the Jehovah’s Witness cases, foreshadowed the history of church-state cases ever since then. There has been no consistent pattern, but rather what Justice Robert Jackson called a “winding, serpentine” wall of separation, full of all sorts of unpredictable twists and turns in the Court’s views.

Vagueness often prevails. In the Lemon case of 1971, the Court ruled that no law may “have the primary effect of either advancing or inhibiting religion” and left it for later Courts to figure out what that means.  Now the Court has added another contorted brick to that wall, by a 5-4 margin, as has so often been true in recent church-state cases.

The Court still reflects the climate of public opinion, which remains divided and uncertain about the proper relation of religious life to the body politic and the lives of individuals, or what we have come to call “church and state.” So the debate initiated by the 1st amendment goes on — which may be just what the founders intended.

Ira Chernus is professor of Religious Studies at the University of Colorado, Boulder.

 

Read Full Post »

The Civil Rights Heroes the Court Ignored in New York Times v. Sullivan

Garrett Epps

The Atlantic March 20, 2014

National Archives

I’m late to the 50th birthday party for New York Times v. Sullivan, and deliberately so. It’s no fun to be the sourpuss. Sullivan has been celebrated by top legal and media figures from the moment it was decided until its half-centenary this month. Alexander Meiklejohn, the philosopher, called it at the time “an occasion for dancing in the streets.” In his meticulous 1992 book, Make No Law: The Sullivan Case and the First Amendment, famed Supreme Court reporter Anthony Lewis wrote that the case “gave [the First Amendment’s] bold words their full meaning.” And a few weeks ago, University of Chicago Professor Geoffrey Stone wrote that, whatever its flaws, Sullivan “remains one of the great Supreme Court decisions in American history.” The New York Times itself, the winner of the case, congratulated the nation and the Court on “the clearest and most forceful defense of press freedom in American history.”

I used to be a newspaper editor. I was dealing with libel threats at my college paper before I was old enough to vote. So I’m grateful for Sullivan’s broad protection of free speech and press. The Court’s decision defused an existential threat to press freedom—a systematic campaign (detailed well by Lewis in Make No Law) to drive the major networks and papers out of the South by using local libel laws to bleed or bankrupt them. The Court was wise to stop that cold.

And yet … and yet. There are some ghosts at the Sullivan feast. Here are their names: Ralph David Abernathy, S.S. Seay Sr., Fred L. Shuttlesworth, and J.E. Lowery. These four black ministers fought against Southern apartheid—and though the fight in the end was won, these four men lost a great deal in the struggle. Their story is the underside of New York Times v. Sullivan, the part that the “post-racial” America of 2014 is not eager to remember.
On March 29, 1960, The New York Times published an advertisement funded by Northern supporters of Martin Luther King and the Southern Christian Leadership Conference, who were locked in a struggle to desegregate Montgomery, Alabama. Entitled “Heed Their Rising Voices,” it described a number of actions the city government had taken to thwart Civil Rights Movement protests and punish those who engaged in them. A few of the facts, however, were wrong—not surprising, given that it was written by Bayard Rustin, another Civil Rights hero who was not on the ground in Alabama. Rustin also signed the four ministers’ names to the advertisement—without notifying or consulting them.

Days later, L.B. Sullivan, police commissioner of Montgomery, filed suit in a state court against both the Times and the ministers for supposedly defaming him. Even though he hadn’t even been named in the advertisement, the all-white jury awarded Sullivan the full half-million dollars he asked for. A few similar verdicts would have bankrupted even the Times; it pulled its reporters out of Alabama. Other cases were filed against other news organizations; Southern officials boasted publicly that they had found a tool to silence the hated Northern press.

The four ministers were also adjudged liable for the full amount. The trial judge wouldn’t even allow them to move for a new trial. Alabama authorities seized their cars and land without waiting for their appeal. Even though both cases ended up in the Supreme Court, they were presented very differently. As Lewis notes dryly, “The Times petition did not emphasize the racial issue.” The issue, for the Times, was press freedom.
The ministers’ lawyers, however, cited the shocking racial climate in the court—the jury was all white, the courtroom was forcibly segregated by the trial judge, the judge permitted Sullivan’s lawyers to use derogatory racial terms and refer to cannibalism in the Congo, and the judge refused to call the ministers’ black lawyers “Mister,” as he did Sullivan’s (and the Times’s) white ones. “[T]he jury had before it an eloquent assertion of the inequality of the Negro in the segregation of the one room, of all rooms, where men should find equality before the law,” the ministers’ brief said. One of the lawyers, Samuel Pierce (later a member of Ronald Reagan’s Cabinet), told the Court, “it is difficult to see how there can be equal protection under the laws and due process in a court where there’s not even equality of courtesy or recognition of human dignity.”

Judgment day for the ministers and the Times came on March 9, 1964. In a single opinion for the Court, Justice William Brennan wrote, first, that “an otherwise impersonal attack on governmental operations” can never be defamatory of a government official who is not named in the attack, and, second, that even false statements of fact about public officials are protected by the First Amendment unless they are made with “actual malice.” That term means that the person making the statement must either know it is false or at least think it may be false; “pure heart, empty head” protects against libel of officials.
Sullivan was and remains a triumph for the Times and the press. But here is the opinion’s entire discussion of the ministers’ claims: “The individual petitioners contend that the judgment against them offends the Due Process Clause because there was no evidence to show that they had published or authorized the publication of the alleged libel, and that the Due Process and Equal Protection Clauses were violated by racial segregation and racial bias in the courtroom.” Because it had decided the First Amendment issue, Brennan wrote, “we do not decide the questions presented by the other claims of violation of the Fourteenth Amendment.”

Am I the only one who wonders why a Court that was bold in defense of the press could not even mention segregation? Or to wince when the opinion relies on the words of a slaveholder, Thomas Jefferson? Am I the only one who remembers that Brennan, the liberal icon, told four brave men their issues were not worthy of address? L.B. Sullivan lives on in the case’s name. The ministers have disappeared.

As a Southern-born white, I do not owe my freedom to The New York Times but to men like those four ministers. Abernathy, Seay, and Shuttlesworth are dead, but Joseph Lowery, who is 90, gave the invocation at Barack Obama’s first inauguration. The Times recorded his attendance at the commemoration of King’s “I Have a Dream” speech last August. But as far as I can tell, it did not give him any credit for its landmark free press victory.

My point is not to skewer the Times, which I admire; it is to remind us all that American history has a tendency to grow whiter over time. Know these names: Abernathy, Lowery, Seay, Shuttlesworth. Know the names of the other African Americans who risked (and sometimes lost) everything they had to free Americans of every race. And by all means celebrate New York Times v. Sullivan. In some ways it really is an occasion for dancing in the streets. But perhaps we should not expect Joseph Lowery to dance.
Garrett Epps, a former reporter for The Washington Post, is a novelist and legal scholar. He teaches courses in constitutional law and creative writing for law students at the University of Baltimore and lives in Washington, D.C. His new book is American Epic: Reading the U.S. Constitution.

Read Full Post »

Fifty years later, new accounts of its fraught passage reveal the era’s real hero—and it isn’t the Supreme Court.
Michael O’Donnell
The Atlantic  March 19, 2014

President Johnson confronts Senator Richard Russell, the leader of the filibuster against the civil-rights bill. (Yoichi Okamoto/National Archives)

In the winter of 1963, as the Civil Rights Act worked its way through Congress, Justice William Brennan decided to play for time. The Supreme Court had recently heard arguments in the appeal of 12 African American protesters arrested at a segregated Baltimore restaurant. The justices had caucused, and a conservative majority had voted to decide Bell v. Maryland by reiterating that the Fourteenth Amendment’s equal-protection clause did not apply to private businesses like restaurants and lunch counters—only to “state actors.” The Court had used this doctrine to limit the reach of the Fourteenth Amendment since 1883. Brennan—the Warren Court’s liberal deal maker and master strategist—knew that such a decision could destroy the civil-rights bill’s chances in Congress. After all, the bill’s key provision outlawed segregation in public accommodations. Taxing his opponents’ patience, he sought a delay in order to request the government’s views on the case. He all but winked and told the solicitor general not to hurry.

And then the conservatives on the Court lost their fifth vote. Justice Tom Clark changed his mind and circulated a draft opinion granting the appeal. In a revolutionary constitutional change, lunch counters and restaurants would suddenly be liable if they violated the equal-protection clause. But Brennan foresaw a new difficulty. By now it was June 1964, and a coalition of northern Democratic and Republican senators looked set to break a southern filibuster and pass a strong civil-rights bill. Would a favorable Supreme Court ruling actually give wavering senators an excuse to vote no? They might say there was no need for legislation because the Court had already solved the problem. So Brennan, ever nimble, engineered a tactical retreat by assembling a majority that avoided the merits of the case altogether. It was an alley-oop to the political branches. They grabbed the ball and dunked it. Ten days after the Court’s decision, Congress passed the Civil Rights Act and the president signed it into law.

In the popular imagination, the Supreme Court is the governmental hero of the civil-rights era. The period conjures images of strong white pillars, Earl Warren’s horn-rims, and the almost holy words Brown v. Board of Education. But in Bell, the Court vindicated civil rights by stepping aside. As Bruce Ackerman observes in The Civil Rights Revolution, Brennan realized that a law passed by democratically elected officials would bear greater legitimacy in the South than a Supreme Court decision. He also doubtless anticipated that the act would be challenged in court, and that he would eventually have his say. The moment demonstrated not merely cooperation among the three branches of government, but a confluence of personalities: Brennan slowing down the Court, President Johnson leaning on Congress to hurry up, and the grandstanders and speechmakers of the Senate making their deals, Everett Dirksen and Hubert Humphrey foremost among them. In this age of obstruction and delay, it is heartening to recall that when the government decides to act, it can be a mighty force.

But three equal branches rarely means three equal burdens, and the civil-rights era was no exception. Although the Court-centered narrative undervalues the two political branches, of those two branches it was the executive that provided decisive leadership in the 1960s. Just as the intragovernmental cooperation of 1964 is striking in light of today’s partisan gridlock, the presidential initiative displayed during the mid-’60s is worth considering in light of Barack Obama’s perceived hands-off approach to lawmaking. Of course, no discussion of civil-rights leadership is complete without including Martin Luther King Jr., who provided moral and spiritual focus, infusing the movement with resolution and dignity. But the times also called for a leader who could subdue the vast political and administrative forces arrayed against change—for someone with the strategic and tactical instincts to overcome the most-entrenched opponents, and the courage to decide instantly, in a moment of great uncertainty and doubt, to throw his full weight behind progress. The civil-rights movement had the extraordinary figure of Lyndon Johnson.

The Civil Rights Act turns 50 this year, and a wave of fine books accompanies the semicentennial. Ackerman’s is the most ambitious; it is the third volume in an ongoing series on American constitutional history called We the People. A professor of law and political science at Yale, Ackerman likens the act to a constitutional amendment in its significance to the country’s legal development. He acknowledges the Supreme Court’s leadership during the 1950s, when President Eisenhower showed little enthusiasm for civil rights, and when Congress passed the largely toothless Civil Rights Act of 1957. During those same years, the Court spoke with a loud, clear voice, unanimously deciding Brown, which ordered the desegregation of schools, and Cooper v. Aaron, which held that state segregation laws conflicting with the Constitution could not stand. But the Supreme Court does not command the National Guard or control the budget. Someone needed to enforce those decisions in the defiant South. That is why, Ackerman writes, “the mantle of leadership passed to the president and Congress,” beginning with the 1964 law.

But the political branches ventured into the fray only in the last weeks of 1963. President Kennedy had introduced the bill in June of that year with much ambivalence. As Todd S. Purdum, a senior writer at Politico, recounts in An Idea Whose Time Has Come, Kennedy had led a sheltered life in matters of race. While generally sympathetic to civil-rights ideals, he “believed that strong civil rights legislation would be difficult if not impossible to pass, and that it could well jeopardize the rest of his legislative program.” He had tried to attack literacy tests and other barriers to voting with legislation but had twice been defeated in the Senate, where the old bulls of the South wielded the filibuster with practiced skill. (Roy Wilkins of the NAACP observed, “Kennedy was not naïve, but as a legislator he was very green.”) He regarded Martin Luther King Jr. warily, and with each new southern crisis saw his agenda slipping away. But events finally forced Kennedy to act. The Freedom Riders in Montgomery, the dogs and water cannons in Birmingham, and the sit-in in Jackson all made further equivocation on civil rights impossible by the spring of 1963. Four hours after Kennedy’s speech calling for legislation, an assassin murdered the NAACP organizer Medgar Evers in his own driveway. Five months after that, the bill was stuck in the House Rules Committee—“the turnstile at the entry to the House of Representatives,” in Purdum’s phrase—and the country had a new president.

In 1963, the Reverend Joseph Carter (far left) was the first African American in his Louisiana parish to register to vote. He was jeered as he walked down the courthouse steps. (Bob Adelman/Corbis)

Purdum, whose book is an astute, well-paced, and highly readable play-by-play of the bill’s journey to become a law, describes the immense challenges facing Lyndon Johnson after Kennedy’s assassination. “When it came to civil rights, much of America was paralyzed in 1963,” he writes. That certainly included Congress. The civil-rights bill, which had been languishing in the House since June, had no hope of coming to a full vote in the near future, and faced even bleaker prospects in the Senate. In fact, Kennedy’s entire legislative program was at a standstill, with a stalled tax-cut bill, eight stranded appropriations measures, and motionless education proposals. And Congress was not Johnson’s only problem. He also had to ensure the continuity of government, reassure the United States’ allies, and investigate Kennedy’s assassination. Purdum’s version of this story is excellent, but he cannot surpass the masterful Robert A. Caro, who offers a peerless and truly mesmerizing account of Johnson’s assumption of the presidency in The Passage of Power.

Days after Kennedy’s murder, Johnson displayed the type of leadership on civil rights that his predecessor lacked and that the other branches could not possibly match. He made the bold and exceedingly risky decision to champion the stalled civil-rights bill. It was a pivotal moment: without Johnson, a strong bill would not have passed. Caro writes that during a searching late-night conversation that lasted into the morning of November 27, when somebody tried to persuade Johnson not to waste his time or capital on the lost cause of civil rights, the president replied, “Well, what the hell’s the presidency for?” He grasped the unique possibilities of the moment and saw how to leverage the nation’s grief by tying Kennedy’s legacy to the fight against inequality. Addressing Congress later that day, Johnson showed that he would replace his predecessor’s eloquence with concrete action. He resolutely announced: “We have talked long enough in this country about equal rights. We have talked for 100 years or more. It is time now to write the next chapter, and to write it in the books of law.”

President Johnson talks with civil-rights leaders in the Oval Office in January 1964. From left: Martin Luther King Jr., LBJ, Whitney Young, and James Farmer. (Yoichi Okamoto/AP)

The New York Times journalist Clay Risen contends in The Bill of the Century that Johnson’s contribution to the Civil Rights Act’s success was “largely symbolic.” One might say the same thing about Neil Armstrong’s walk on the moon. Sometimes symbolism is substance—especially where the presidency is concerned. The head of the executive branch firmly seized the initiative, taking up a moribund bill addressing the nation’s most agonizing problem. Here was Johnson, president for only five days, working out of the Executive Office Building because the White House was still occupied by Kennedy’s family and staff, with an election already looming less than a year away. Instead of proceeding tentatively, as most anyone in those circumstances would have done, he radiated decisiveness, betting everything he had right after he got it. As Caro shows so persuasively, from that moment, Johnson’s urgency and purpose infused every stage of the bill’s progress. And in the days and weeks that followed, the stagnant cloud that had settled over Kennedy’s agenda began to lift.

Symbolism was the least of it. Johnson took off his jacket and tore into the legislative process intimately and tirelessly. As the former Senate majority leader, he knew his way around Capitol Hill like few other presidents before him—and none since. The best hope of moving the civil-rights bill from the House Rules Committee—whose segregationist chairman, Howard Smith of Virginia, had no intention of relinquishing it—was a procedure called a “discharge petition.” If a majority of House members sign a discharge petition, a bill is taken from the committee, to the chagrin of its chairman. Johnson made the petition his own personal crusade. Even Risen credits his zeal, noting that after receiving a list of 22 House members vulnerable to pressure on the petition, the president immediately ordered the White House switchboard to get them on the phone, wherever they could be found. Johnson engaged an army of lieutenants—businessmen, civil-rights leaders, labor officials, journalists, and allies on the Hill—to go out and find votes for the discharge petition. He cut a deal that secured half a dozen votes from the Texas delegation. He showed Martin Luther King Jr. a list of uncommitted Republicans and, as Caro writes, “told King to work on them.” He directed one labor leader to “talk to every human you could,” saying, “if we fail on this, then we fail in everything.”

The pressure worked. On December 4—not two weeks into Johnson’s presidency—the implacable Chairman Smith began to give way. Rather than have the bill taken from his committee, he privately agreed to begin hearings that would conclude before the end of January, and then release the bill. Smith looked set to renege on his agreement in the new year, but reluctantly kept his word, allowing the bill to be sent to the full House on January 30, 1964. Risen credits others with this development, suggesting that it was Representative Clarence Brown of Ohio, a Republican member of the Rules Committee, among others, who got Smith to move. Risen is particularly sharp on the evolution of the Republicans during these tumultuous years, but here he accords them too much clout. Brown had to answer to House Republican Leader Charles Halleck of Indiana, whose support Johnson likely bought by proposing, and then personally securing, a NASA research facility at Purdue University, in Halleck’s district. And the entire Republican caucus in the House was wilting under Johnson’s relentless and very public campaign to portray “the party of Lincoln” as obstructing civil rights by opposing the discharge petition.

Johnson kept the bill moving in the Senate by dislodging President Kennedy’s tax-cut bill from the Finance Committee. As vice president, Johnson had advised Kennedy not to introduce civil-rights legislation until the tax cut had cleared Congress. Kennedy didn’t listen, and now both bills were stuck. (Like House Rules, Senate Finance had a wily segregationist for a chairman: Harry Byrd of Virginia.) Risen minimizes the significance of this problem, writing that the tax bill “presented no procedural obstacle to the civil rights bill, only a political one.” (And when does politics ever derail legislation?!) As Caro explains, the tax bill was a hostage. By holding it in committee, the South pressured the administration to give up on civil-rights legislation, with the implication that the withdrawal of the latter might produce movement on the former. But Johnson and Byrd were old friends, and during an elaborate White House lunch they came to an understanding: if Johnson submitted a budget below $100 billion, Byrd would release the tax bill. Johnson then personally bullied department heads to reduce their appropriations requests, and delivered a budget of $97.9 billion. The Finance Committee passed the tax bill on January 23, 1964, with Byrd casting the deciding vote to allow a vote, then weighing in against the measure itself. The Senate passed the tax bill on February 7, mere days before the civil-rights bill cleared the House.

Finally, Johnson helped usher the bill to passage in the Senate by working to break the southern filibuster, which was led by his political patron, the formidable Richard Russell of Georgia. In light of the Senate’s fiercely guarded independence, the president could not operate in the open; he had to use proxies like Humphrey, who was his protégé and future vice president, as well as the bill’s floor manager. Johnson impressed upon Humphrey that the vain and flamboyant Senate Republican Leader Everett Dirksen of Illinois was the key to delivering the Republican votes needed for cloture:

“You and I are going to get Ev. It’s going to take time. We’re going to get him. You make up your mind now that you’ve got to spend time with Ev Dirksen. You’ve got to let him have a piece of the action. He’s got to look good all the time. Don’t let those [liberal] bomb throwers, now, talk you out of seeing Dirksen. You get in there to see Dirksen. You drink with Dirksen! You talk with Dirksen! You listen to Dirksen!”

Johnson demanded constant updates from Humphrey and Majority Leader Mike Mansfield, and always urged more-aggressive tactics. (“The president grabbed me by my shoulder and damn near broke my arm,” said Humphrey.) Even though Senate Democrats did not deploy all those tactics, Johnson’s intensity nevertheless set the tone and supplied its own momentum. He kept up a steady stream of speeches and public appearances demanding Senate passage of the strong House bill, undiluted by horse-trading. And he personally lobbied senators to vote for cloture and end the filibuster. Risen contends that Johnson “persuaded exactly one senator” to change his vote on cloture. Given that it is of course impossible to know what motivated each senator’s final decision, this lowball figure is expressed with too much certitude. Evidence presented by Purdum and Caro suggests that Johnson’s importuning, bribing, and threatening may have made an impact on closer to a dozen. The Senate invoked cloture on June 10, breaking the longest filibuster in the institution’s history. The full Senate soon passed the bill. Johnson signed it into law on July 2, 1964, and immediately turned his energies to what would become another landmark statute: the Voting Rights Act of 1965.

Risen’s attempt to minimize Johnson’s significance in the passage of the Civil Rights Act—“he was at most a supporting actor”; “he was just one of a cast of dozens”; “the Civil Rights Act was not his bill by any stretch”—is perplexing. In an otherwise strong book, his revisionist view is less a question of facts than of emphasis: after all, Purdum too notes that Johnson “strategically limit[ed] his own role” at key moments (careful, for example, not to upstage Dirksen). But Risen seems bent on denying Johnson his due, drawing nearly every inference against him and repeatedly overstating the anti-Johnson case. On the one hand, Risen is right to take a fresh look at the evidence and tell the story from a new perspective, focusing on unsung heroes such as Dirksen, Humphrey, Representative William McCulloch, and Nicholas Katzenbach of the Justice Department. He makes a fair point in questioning the way history awards presidents the credit for measures that by necessity cross many desks. On the other hand, Risen is simply wrong to portray Johnson as some hapless operator for trying multiple tactics and targets, some of them unsuccessfully. Johnson’s very comprehensiveness is what jarred the sluggish and paralyzed Capitol into action and ultimately moved the bill.

President Johnson signs the Civil Rights Act into law on July 2, 1964. (Cecil Stoughton/White House Press Office)

If the president led and Congress followed, where did that leave the Supreme Court? Three months after Johnson signed the Civil Rights Act, the Court heard arguments in a pair of cases challenging the constitutionality of its most contentious provision—Title II, which outlawed segregation in public accommodations. In December 1964 the Court decided Katzenbach v. McClung and Heart of Atlanta Motel v. United States, upholding Title II as a valid exercise of Congress’s commerce power. In the years since, the act has been a remarkable success. Its acceptance in the South was surprisingly quick and widespread. In a stroke, the act demolished the rickety but persistent foundation for segregation and Jim Crow. Title II reached far into the daily lives of southerners, creating an unprecedented level of personal mingling between the races and making integration a fact of daily life. Title VII, meanwhile, has vastly reduced workplace discrimination, through the efforts of the Equal Employment Opportunity Commission. Although years of toil, struggle, and bloodshed still lay ahead, the 1964 law dealt a major blow to the system of segregation. The past 50 years of American history are almost unimaginable without it.

And yet the anniversary prompts an ominous reconsideration of the Supreme Court’s role in civil rights. In 1954, the Court launched the federal government’s assault on segregation, with Brown. In 1964, it got out of the way of the political branches, then quickly ratified their work. Today when it comes to racial civil rights, the Roberts Court is an aggressively hostile force. Recall Ackerman’s contention that the 1964 act has taken on the weight of a constitutional amendment. At a literal level, this is of course untrue: the act was not ratified by three-quarters of the states and is not part of the written Constitution. This means that a constitutional amendment is not needed to overturn the Civil Rights Act, which is vulnerable to a subsequent act of Congress or, more to the point, a decision by the Supreme Court.

Ten years ago, even mentioning this possibility would have seemed outrageous. But last June, the Court decided Shelby County v. Holder, striking down Section 4(b) of the Voting Rights Act of 1965 as unconstitutional. Section 4(b) listed the states with a history of voting discrimination that were required to seek preclearance from the Justice Department or the courts before amending their voting laws. The 5–4 decision by Chief Justice John Roberts is nothing short of appalling: as unpersuasive as it is misguided, it is, in Ackerman’s words, “a shattering judicial betrayal” of the civil-rights era. It is also the Roberts Court’s most brazenly activist decision: Congress has reauthorized the Voting Rights Act four times, most recently in 2006, with votes of 390–33 in the House and 98–0 in the Senate. In her brilliant dissent, Justice Ruth Bader Ginsburg summed up the decision’s obtuseness: “Throwing out pre-clearance when it has worked and is continuing to work to stop discriminatory changes is like throwing away your umbrella in a rainstorm because you are not getting wet.”

Shelby County may be so unique that it portends no harm for the Civil Rights Act. After all, the preclearance regime was extraordinarily invasive. Ackerman calls it the biggest federal intrusion into the prerogatives of the southern states since Reconstruction. But Title II of the Civil Rights Act is also strong medicine, reaching beyond state actors to tell private businesses whom they must serve. It was by far the act’s most controversial provision—and it remains controversial among some conservatives. In 2010, Senator Rand Paul caused a sensation by arguing that the provision in the Civil Rights Act dealing with “private business owners” (ostensibly Title II) is unconstitutional. He quickly walked back his comments, but his father, Ron Paul, proudly continues to make the same argument, and the Tea Party is listening. The Heritage Foundation’s Web site files the McClung decision upholding Title II on its “Judicial Activism” page, tagged to the terms Abusing Precedent and Contorting Text. The Voting Rights Act decision can only embolden Title II’s opponents.

And they just might get a hearing. Three trends in the Roberts Court’s jurisprudence suggest that the justices would be more receptive to a challenge to Title II than any prior Court. First is its disregard for precedent. The Roberts Court has repeatedly ignored prior decisions when doing so enabled a conservative victory—most notoriously in the areas of gun regulation (District of Columbia v. Heller) and campaign finance (Citizens United v. Federal Election Commission). Hence it is little comfort that the Court upheld Title II in 1964. It had also previously upheld the Voting Rights Act and its reauthorizations. Second is the Roberts Court’s impatience with open-ended civil-rights measures, which some justices believe are no longer necessary. “The tests and devices that blocked access to the ballot have been forbidden nationwide for over 40 years,” the Court wrote in Shelby County, dismissing the need for ongoing vigilance against voting discrimination. And third is the Court’s continued disdain for the commerce clause. Remember when Roberts’s decision upholding the Affordable Care Act made the point that the act was not a valid exercise of Congress’s commerce power? He was singling out the section of the Constitution that supports the Civil Rights Act.

The 1964 law is not in imminent danger from the Supreme Court. But it is worth considering how a hostile Court changes the equation from 1964, when the judiciary acted in concert with the political branches. The new paradigm places a premium on presidential leadership, at the very least in nominating judges and justices who are in sympathy with the great statutes of the 1960s. But the battle over the Civil Rights Act shows that presidents who are serious about concrete social progress must do even more.

Lyndon Johnsons, of course, do not come along every four or every 40 years. Even if they did, Johnson brought plenty of darkness (election stealing, a credibility gap, Vietnam) along with the light (Civil Rights Act, Voting Rights Act, Great Society). Moreover, not every president needs to be a legislative genius in order to pass laws. Obama, after all, gambled big on the Affordable Care Act, investing the same type of capital in health care that Johnson invested in civil rights. It is now the law of the land. But the energy and purpose that Johnson brought to the Civil Rights Act struggle remain inspiring, and are a model for all presidents. As Richard Russell, the South’s leader in the Senate during the 1960s, put it to a friend a few days after Kennedy’s assassination: “You know, we could have beaten John Kennedy on civil rights, but not Lyndon Johnson.”

Michael O’Donnell is a lawyer in Chicago. His writing has appeared in The Wall Street Journal, The Nation, and The Washington Monthly.

Read Full Post »