The goal of this blog is to promote the study of the history of the United States and to analyze the practices, institutions, and discourses of American imperialism.
American cinema has never been particularly kind to black folks. As blacks moved north into cities with burgeoning movie palaces and industrial jobs and laws designed to keep them impoverished and on the margins, filmic representation of black life—limited for most of the time between World War I and Vietnam to “Toms, Coons, Mulattoes, Mammies and Bucks,” as historian Donald Bogle put it in 1973—was the lens through which American society viewed blacks.1
Anyone who took silver screen representations of blacks at face value would come away with the notion that black folks were, by and large, stupid, cowardly, lazy and worthy of subjugation, censure and plunder. That this is just how America has largely treated its African American population both before and since is no accident. When the blustery but occasionally insightful critic Armond White said “you have a culture of criticism that simply doesn’t want Black people to have any kind of power, any kind of spiritual understanding or artistic understanding of themselves,” he wasn’t wrong.
“A Road Three Hundred Years Long: Cinema and the Great Migration,” a series at New York’s Museum of Modern Art that began last Monday and runs through June 12th, provides a much-needed corrective, a narrative of black resistance to the dominant mode of slanderous Hollywood storytelling—both now and, somewhat miraculously, in the days of Jim Crow. Curated by MoMA’s Joshua Siegel and independent curator Thomas Beard of Light Industry, the country’s leading micro-cinema for experimental film, the exhibition is pegged to the Museum’s show of Jacob Lawrence’s Great Migration paintings. “A Road Three Hundred Years Long” includes many films that catered to black audiences in the era of their flight from terror, showcasing a significant amount of pre-war black filmmaking as well as movies by contemporary black filmmakers that explore the mysterious legacy of the Great Migration.
Blacks were unable to go to segregated movie theaters in the South except under special circumstances—“midnight rambles,” for example, which were late-night screenings of popular films and black-themed movies for black patrons who were barred during the day. But some black-owned movie theaters persisted, and some of the work shown there, both home movies and narrative features, found appreciative pre-war audiences. MoMA’s program includes many of these films, as well as WPA-style proletarian newsreels and avant-garde non-fiction films, to showcase images of black life that—despite the specter of prejudice—highlight unadorned, provincially black, middle class normality in a way that is painfully rare. These spaces have largely escaped Hollywood visions of black American life, obsessed as they are with broad stereotype (see this summer’s Dope) or elegantly rendered historical struggle (Selma, 12 Years a Slave).
But for black experimentalists (like Kevin Jerome Everson, who directed Company Line) or great, sadly underemployed black narrative directors—like Charles Burnett, who made To Sleep with Anger, and Julie Dash, who did Daughters of the Dust—the forgotten spaces of black American life provide wonderful fodder for exploring the mysterious resilience of black middle class life.
The program’s most interesting discoveries are rooted further in the past. Oscar Micheaux and Spencer Williams were the only African Americans to make narrative feature films in the years before the Civil Rights Movement and they are very much at the center of “A Road Three Hundred Years Long.” Several works by Micheaux, the most prolific of the black pre-war filmmakers, are on display, including his 1920 film The Symbol of the Unconquered. A rejoinder of sorts to The Birth of a Nation, it remains one of the watershed moments in black film history.
Meanwhile, Williams’ work is the subject of the documentary Juke: Passages from the Films of Spencer Williams, a series-opening compilation film by noted found footage documentarian Thom Andersen (Los Angeles Plays Itself). Williams is known more as the star of the popular early ’50s sitcom Amos ‘n’ Andy than as a groundbreaking independent filmmaker. Financed by a Jewish Texan named Alfred Sack, Williams’ moralizing work is often compared to Tyler Perry’s, but given the era he worked in he couldn’t hope to control the means of production on his own (as Perry famously has).
Williams made great use of magic realist flourishes in his early race movies; in The Blood of Jesus, undeniably his masterwork, the devil drives a flatbed truck with sinners perched on the back and angels—given etherealness through that most basic cinematic trickery, the double exposure—hover over the bed of a woman who lies mortally wounded. Andersen’s interest, however, lies mainly in recontextualizing clips from Williams’ films to highlight the documentary-like aspects they contain. Because so few motion picture images of black middle class life were taken in Dallas suburbs or Harlem night clubs, in cab stands and in juke joints, in sitting rooms and in neighborhood streets, Williams’ scenes have a representative quality that transcends their narratives of jazz age sin and Christian redemption, one that Andersen’s deft cutting compiles in a lucid, if decidedly non-narrative way. (Next winter, arthouse film distributor Kino Lorber will release a box set of Micheaux and Williams movies.)
This has been a watershed year for black repertory programming in the city. Between the Film Society of Lincoln Center’s “Telling It Like It Is: Black Independents in New York 1968-1986,” the Brooklyn Academy of Music’s “Space is the Place: Afrofuturism in Film,” and “A Road Three Hundred Years Long,” movies about black life have been given significant platforms in some of New York’s august cultural institutions. The takeaway from these wonderfully curated programs, though, is grimmer: look through the filmographies of the artists involved and you’ll see that very little of their work has come with the support of our national film culture’s most enduring brands, the Hollywood studios. One quickly gets the sense that thoughtful, nuanced stories about the African American experience aren’t welcome out there.
Today African Americans are underrepresented in the movie industry’s workforce, both in front of and behind the camera. In the boardroom and in the postproduction house, blackness is scarce. This holds true for every stratum of the industry, from the dying vestiges of New York’s indiewood to the hallways of power at Los Angeles’ major studios. I imagine there may be a black person within the studio apparatus who holds the power to green-light movies about the parochial anxieties, dreams and predilections of American negroes without fear of quizzical glances from his peers and superiors, but I’ve yet to meet or hear of this person. I’m not holding my breath.
This year and last year are the only consecutive years in cinematic history in which films directed by people of African descent were nominated for the Oscar for best picture. But look for African Americans when you walk through the offices of a major film production or distribution outfit—you’ll still only find them at the guard’s desk or near the janitor’s closet. That narrative has to change.
1. President Woodrow Wilson’s response to seeing D. W. Griffith’s 1915 Ku Klux Klan-sympathizing Birth of a Nation is emblematic: “Like writing history with lightning,” he reportedly said, “and my only regret is that it is all so terribly true.”
Brandon Harris
Brandon Harris is a visiting assistant professor of film at SUNY Purchase and a contributing editor of Filmmaker Magazine; his criticism and journalism have appeared in The New Yorker, Vice, The Daily Beast and n+1. His film Redlegs is a New York Times Critic’s Pick.
The recent announcement by U.S. Senator Bernie Sanders, an avowed “democratic socialist,” that he is running for the Democratic nomination for President raises the question of whether Americans will vote for a candidate with that political orientation.
During the first two decades of the twentieth century, the idea of democratic socialism — democratic control of the economy — had substantial popularity in the United States. At the time, the Socialist Party of America was a thriving, rapidly growing political organization, much like its democratic socialist counterparts abroad — the British Labour Party, the French Socialist Party, the German Social Democratic Party, the Australian Labor Party, and numerous other rising, working class-based political entities around the world. In 1912, when the United States had a much smaller population than today, the Socialist Party had 118,000 dues-paying members and drew nearly a million votes for its presidential candidate, the great labor leader Eugene V. Debs. (The victor that year was the Democratic candidate, Woodrow Wilson, who drew six million votes.) Furthermore, the party’s members held 1,200 public offices in 340 cities, including 79 mayoralties in 24 states. Socialist administrations were elected in Minneapolis, Minnesota; Butte, Montana; Flint, Michigan; Schenectady, New York; and cities all across the country. In 1912, the Socialist Party claimed 323 English and foreign-language publications with a total circulation in excess of two million.
Of course, this socialist surge didn’t last. The Democratic and the Republican parties, faced with this threat to their political future, turned to supporting progressive agendas — breaking up or regulating giant corporations, curbing corporate abuses, and championing a graduated income tax — that stole the socialists’ thunder. In addition, after U.S. entry into World War I, an action opposed by the socialists, the federal and state governments moved to crush the Socialist Party — arresting and imprisoning its leaders (including Debs), purging its elected officials, and closing down its publications. Moreover, one portion of the party, excited by the success of revolutionaries in overthrowing Russia’s Czar and establishing the Soviet Union, broke with the Socialist Party and established Communist rivals. Co-opted by the mainstream parties, repressed by government, and abandoned by would-be revolutionaries, the Socialist Party never recovered.
Even so, democratic socialism retained a lingering influence in American life. When a new wave of reform occurred during the New Deal of the 1930s, it included numerous measures advocated and popularized by the Socialist Party: Social Security; public jobs programs like the WPA; minimum wage laws; maximum hour laws; and a steep tax on the wealthy. Here and there, although rarely, socialists even secured public office, and Milwaukee voters regularly elected socialist mayors until 1948. Starting in 1928 and running through the early post-World War II era, Norman Thomas became the attractive, articulate leader of the Socialist Party, and was widely respected among many American liberals and union leaders.
What nearly eliminated the Socialist Party was a combination of New Deal measures (which drew labor and other key constituencies into the Democratic Party) and the public’s identification of Socialism with Communism. Although, in fact, the American Socialist and Communist parties were bitter rivals — the former championing democratic socialism on the British model and the latter authoritarian socialism on the Soviet model — many Americans, influenced by dire conservative warnings, confused the two. Particularly during the Cold War, this further undermined the Socialist Party.
In the early 1970s, with the party barely surviving, most democratic socialists decided it was time to reassess their strategy. They asked: Did the collapse of the Socialist Party mean that, in the United States, democratic socialism was unpopular, or did it mean that third party voting was unpopular? After all, large numbers of Americans supported democratic socialist programs, ranging from national healthcare to public education, from public transportation to taxing the rich, from preserving the environment to defending workers’ rights. What would happen if democratic socialists worked for their programs within the Democratic Party, where the typical constituencies of the world’s democratic socialist parties — unions, racial minorities, women’s rights activists, and environmentalists — were already located? Led by the party’s titular leader, Michael Harrington, whose book The Other America sparked the War on Poverty of the 1960s, they organized Democratic Socialists of America (DSA) and plunged into major social movements and into the Democratic Party.
Although, in the ensuing decades, DSA made little progress toward rebuilding a mass, high profile democratic socialist organization, it did manage to pull thousands of union, racial justice, women’s rights, and environmental activists into its orbit. DSA also discovered a significant number of leftwing Democratic and, sometimes, independent candidates for office who welcomed its support and occasionally joined it. Bernie Sanders — an independent who has been elected, successively, mayor of Burlington, Vermont’s only Congressman, and a U.S. Senator from Vermont — is certainly one of the most successful of these politicians. Indeed, in 2012 he won re-election to the Senate with 71 percent of the vote.
But will Americans actually support a democratic socialist in the Democratic Presidential primaries? Sanders himself has conceded that the odds are heavily against him. Even so, although a Quinnipiac poll of American voters in late May of this year found him far behind the much better known and better funded Hillary Clinton, his 15 percent of the vote placed him well ahead of all other potential Democratic candidates. Also, there’s great potential for broadening his support. The latest poll on Americans’ attitudes toward “socialism,” taken in December 2011, found that 31 percent of respondents had a positive reaction to it. And what if Americans had been asked about their attitude toward “democratic socialism”?
Consequently, even if Hillary Clinton emerges as the Democratic nominee, as seems likely, a good showing by Sanders could strengthen the democratic socialist current in American life.
Recently, historians have shown that the modern conservative movement is older and more complex than has often been assumed by either liberals or historians. Michelle Nickerson’s book, Mothers of Conservatism: Women and the Postwar Right (Princeton University Press, 2012), expands that literature even further, demonstrating not only the longer roots of conservative interest in family issues, such as education, but also the important role women played in shaping the early movement. Mothers of Conservatism does this by examining the role of women in the rise of grassroots conservatism during the 1950s. Nickerson explains how women in Southern California became politicized during the height of the Cold War, coming to see communist threats in numerous, mostly local, battles. These women, who were primarily homemakers, argued that they had a special political role as mothers and wives, translating their domestic identities into political activism. Nickerson traces their activism in battles over education and mental health issues, among others. She further explains the ideology behind their activism and demonstrates how important these women were to shaping the coming conservative movement and, in the long term, the Republican Party.
Mothers of Conservatism draws on rich archival material as well as on oral history interviews conducted by the author. With these archival sources and interviews, Nickerson brings the activists’ stories, politics, and humanity to life. In this interview, we discuss the ideology, activism, and legacy of the women as well as Nickerson’s experience interviewing her sources.
The most obvious constitutional result of the Civil War was the adoption of three landmark constitutional amendments. The 13th ended slavery forever in the United States, while the 14th made all persons born in the United States (including the former slaves) citizens of the nation and prohibited the states from denying anyone the privileges and immunities of American citizenship, due process of law, or equal protection of the law. Finally, the 15th Amendment, ratified in 1870, prohibited the states from denying the franchise to anyone based on “race, color, or previous condition of servitude.”
These amendments, however, have their roots in the war itself, and in some ways can be seen as formal acknowledgments of the way the war altered the Constitution. Other changes came about without any amendments. Thus, the war altered the Constitution in a variety of ways. A review of some of them underscores how the Union that President Lincoln preserved was fundamentally different from — and better than — the Union he inherited when he became president.
Slavery
The first and most obvious change involves slavery. The 13th Amendment was possible (as were the other two Civil War amendments) only because the war broke slavery’s stranglehold over politics and constitutional development. The Constitution of 1787 protected slavery at every turn. Although the framers did not use the word “slavery” in the document, everyone at the Constitutional Convention understood the ways in which the new form of government protected slavery. Indeed, the word “slavery” was not used at the request of the Connecticut delegation and some other Northerners, who feared that their constituents would not ratify the Constitution if the word was in the document — not because the delegates objected to the word itself.
It would take many pages to review all the proslavery features of the Constitution, but here are some of the most significant ones. The three-fifths clause gave the South extra members of the House of Representatives, based on the number of slaves in each state. Without these representatives, created entirely by slavery, proslavery legislation like the Missouri Compromise of 1820 and the Fugitive Slave Law of 1850 could never have been passed.
Equally important, votes in the Electoral College were based on the number of representatives in the House, and so slavery gave the South a bonus in electing the president. Without the electors created by slavery, the slaveholding Thomas Jefferson would have lost the election of 1800 to the non-slaveholding John Adams.
The “domestic insurrections clause” guaranteed that federal troops would be used to suppress slave rebellions, as they were in the Nat Turner Rebellion in 1831 and John Brown’s attempt to start a slave rebellion in 1859.
An image from Harper’s Weekly showing the House of Representatives during the passage of the 13th Amendment, Jan. 31, 1865. Credit: Library of Congress
Finally, it took two-thirds of Congress to send a constitutional amendment to the states, and it took three-fourths of the states to ratify any amendment. Had the 15 slave states all remained in the Union, to this day, in 2015, it would be impossible to end slavery by constitutional amendment, since in a 50-state union, it takes just 13 states to block an amendment.
The political power of the slave states meant that the nation was always forced to protect slavery. Thus the South in effect controlled politics from 1788 until 1861. Slave owners held the presidency for all but 12 years between 1788 and 1850. All of the two-term presidents were slave owners. Three Northerners held the office from 1850 to 1860 — Fillmore, Pierce and Buchanan — but all were proslavery and they bent over backward to placate the South.
It took the Civil War to break slavery’s stranglehold on politics and fundamentally alter the nature of constitutional law and constitutional change.
The demise of slavery began with slaves running away and the army freeing them. But the key moment was the Emancipation Proclamation, which was the first important executive order in American history. In order to destroy slavery — and save the Union — Lincoln found new power for his office.
Secession and Nullification
Since the beginning of the nation, claims that states could nullify federal law or even secede had destabilized American politics and constitutional law. Sometimes Northerners made these claims, such as the disgruntled New Englanders who organized the Hartford Convention to oppose the War of 1812. But most claims of nullification came from the slave South. In 1798 Jefferson secretly wrote the “Kentucky Resolutions,” while his friend James Madison wrote the “Virginia Resolutions”; both asserted the right of the states to nullify federal law.
From the earliest debates over the Union, in the Second Continental Congress, until the eve of the Civil War, numerous Southern politicians publicly advocated secession if they did not get their way on slavery and other issues. In 1832-33 South Carolina asserted the right to nullify the federal tariff, and then officially (although mostly symbolically) passed an ordinance to nullify the Force Law, which authorized the president to use appropriate military or civil power to enforce federal laws. At this time Georgia also brazenly declared it did not have to abide by a federal treaty with the Cherokees. In 1850 Southerners held two secession conventions, which went nowhere. In the debates over what became the Compromise of 1850, Senator John C. Calhoun of South Carolina asserted the right of the South to block federal law.
Some Northern opponents of slavery — most notably William Lloyd Garrison — argued for Northern secession because they rightly understood that slavery dominated the American government. But Garrison had few followers, and even many of them never accepted his slogan of “No Union With Slaveholders.” In the mid-1850s the Wisconsin Supreme Court declared the Fugitive Slave Law unconstitutional, but when the Supreme Court upheld the law the Wisconsin Court backed off.
In short, nullification and secession were not new ideas in 1861, when 11 states left the Union, but had been part of the warp and weft of constitutional debate since the founding. But the Civil War ended the discussion. The question of the constitutionality of nullification or secession was permanently settled by the “legal case” of Lee v. Grant, decided at Appomattox Court House in April 1865. Grant had successfully defended the Constitution and the idea of a perpetual Union. Secession lost, and the United States won. The Supreme Court would weigh in on this in Texas v. White (1869), holding that secession had never been legal and that the state governments in the Confederacy lacked any legal authority.
Money and National Power
From the beginning of the nation there had been debates over whether the United States government could issue currency. Indeed, before the Civil War there was no national currency, only “bank notes” issued by private banks or state banks. For two periods (1791-1811 and 1816-1836) the federally chartered Bank of the United States circulated bank notes that functioned as a national currency. But Andrew Jackson vetoed the bank’s recharter on the grounds that it was unconstitutional, and for the next 25 years the nation’s economy was hampered by the lack of a stable, national currency.
The war changed this, too. In order to finance the war, Secretary of the Treasury Salmon P. Chase developed a policy that led to the issuing of “greenbacks,” and suddenly the constitutional issue was settled — not in court, but by the exigency of the conflict. The Supreme Court was perplexed by this new policy and after the war the court briefly declared that issuing greenbacks was unconstitutional, but then quickly changed its mind. Since then, the dollar has emerged as the most important currency in the world. Although no longer backed by gold or silver, American currency remains “the gold standard” for international transactions.
Military Law and Civilians
The war also created a new set of rules — laws that are still with us — for when and how military tribunals or martial law can apply to civilians. For example, when the war began there were no federal laws prohibiting acts of sabotage or preventing civilians from forming armies to make war on the United States. Nor was there any national police force. Thus, President Lincoln suspended habeas corpus along the railroad route from Philadelphia to Washington and used the Army to arrest pro-Confederate terrorists, like John Merryman, who was tearing up railroads leading to Washington, D.C., and trying to organize a Confederate army in Maryland.
Again, this was a matter of necessity, not ideology: Congress was not in session, and so Lincoln acted on his own authority. Indeed, if Merryman had been successful, members of Congress would have been unable to reach Washington to meet. Congress later approved Lincoln’s actions and authorized even more massive suspensions of habeas corpus. Thus, the constitutional rule from the Civil War is that in a dire emergency the government may act to restrain people to preserve public safety.
But what happens when the immediate and pressing emergency is over? May the military still be used to arrest and try civilians? The answer from the Civil War is an emphatic no. During the war military officials in Indiana arrested Lamdin P. Milligan for trying to organize a Confederate army in that state. There was no combat in Indiana at the time, civil society was smoothly functioning, and even Milligan’s allies were not blowing up bridges or destroying railroads as Merryman had been doing. Nevertheless, the Army tried Milligan and sentenced him to death. In 1866, in Ex parte Milligan, the Supreme Court ruled that the trial was unconstitutional. The military might arrest Milligan because of the emergency of the war (just as it had arrested Merryman), but the court ruled that if the civilian courts were open, as they were in Indiana, it was unconstitutional to try a civilian in a military court.
This has generally been the law of the land ever since. In the aftermath of 9/11 the Supreme Court upheld the rule that civilians (even terrorists in the United States) could not be tried by military tribunals, but could only be tried by civilian courts. The Justices relied on Milligan.
Racial Change and the Movement Toward Racial Equality
When the war began, federal law denied African-Americans virtually all constitutional rights. In Dred Scott v. Sandford, decided in 1857, Chief Justice Roger B. Taney ruled that blacks could never be citizens of the United States, even if they were treated as citizens in the states where they lived. This led to the oddity that blacks could vote for members of Congress and presidential electors in six states, and could hold office in those states and some others, but they were not citizens of the nation. Federal law nevertheless supported Taney’s rulings. For example, before the war blacks could not be members of state militias, serve in the national army, receive passports from the State Department, or be letter carriers for the post office.
During the war all this began to change. In 1862 Congress authorized the recruitment of blacks in the national army and in state militias. While most black soldiers were enlisted men, some served as noncommissioned officers, and a few served as officers. Martin Delany held the rank of major. Just as striking, Ely Parker, a member of the Seneca nation, served on Ulysses S. Grant’s personal staff as a lieutenant colonel and was promoted to brevet brigadier general at the very end of the war.
The war also broke down racial and ethnic/religious taboos and attitudes. Abraham Lincoln became the first president to meet with blacks, and in the case of Frederick Douglass, seek out their advice. In 1864 and 1865 Congress gave charters to street railway companies that required that there be no discrimination in seating. Congress also changed the law that limited military chaplains to ministers of the gospel, thus allowing rabbis and Roman Catholic priests to become chaplains. During the war Congress created the office of recorder of deeds for the city of Washington. The first office holder was Simon Wolf, a Jewish immigrant, but after that, the office was held by African-Americans for the rest of the century, including Frederick Douglass, Blanche Bruce, a former senator, and Henry P. Cheatham, a former congressman. In his last public speech Lincoln called for enfranchising black veterans and other members of their race. Five years later the Constitution would reflect that goal in the 14th and 15th amendments.
Today we rightly look back at these two amendments, and the 13th, as the most important lasting constitutional legacies of the Civil War. And that they are. But it is also important that we look at how America’s understanding of the Constitution, especially as it related to racial and ethnic equality, changed during the course of the war, and not simply as a consequence of it. Put differently: The Civil War amendments changed the Constitution. But even if, somehow, they had never happened, the war itself would have altered the way Americans saw one another, and their government.
Paul Finkelman is a senior fellow in the Penn Program on Democracy, Citizenship and Constitutionalism at the University of Pennsylvania and a scholar-in-residence at the National Constitution Center.
G.I. Bill student with wife and children walking in front of the Old Capitol, The University of Iowa, 1948. Image: Iowa Digital Library.
One of the deep, long-term changes in American lives has been what social historians call the “standardization” of the life course. From the nineteenth into the twentieth century, more and more young Americans were able to follow a common sequence: get educated, get a job, leave parents’ home, get married, have children, and become financially secure (to be followed by empty nest, retirement, and “golden years”)—the American Dream in one widely shared package.
In recent decades, however, Americans’ life courses have become less standardized, less shared. A new study, by Jeremy Pais and D. Matthew Ray, shows how much this historical reversal is connected to economic fortunes. The less affluent, who were late to standardization in the twentieth century, are in the twenty-first increasingly leading “non-standard” lives.
Standardization
A century and more ago, many a young American’s expectations of life were overturned by the mishaps of life, such as the early death of a parent or spouse, debilitating illness or accident, and farm- or job-devastating weather or depression. Over time, life became more straightforward and predictable as death and illness retreated while affluence and security grew. Life-planning way into the future became more sensible.
Over the same period, American society developed institutions that increasingly structured the life course and made it more shared. Schooling became required of all, with specific starting, grade, and graduation ages for everyone. New laws stipulated minimum ages for when one could marry, could work, and should stop working.
Americans also increasingly chose to pursue common life courses. One important indication is the number of children. Around 1900, Americans often had either many children or no children at all, but by 2000 most American parents had converged on having two children, give or take one. Rather than generating more diversity, Americans’ greater freedom from circumstance allowed them to follow shared norms.
Historian David Stevens reported in 1990 that the correlation between how old Americans were and when they took critical life steps strengthened from 1900 to 1970. That is, Americans increasingly took these steps at the same age.
De-Standardization
After about 1970, diversity in the sequencing of life transitions grew: children before marriage, full-time employment while still living with parents, being unmarried late in life, long delays in attaining financial security, and so on. In that same 1990 study, Stevens found that the age-transition correlation—that is, standardization—began reversing after 1970. Later studies showed the trend accelerating. Indeed, the seemingly new disarray and unpredictability of life for twenty-somethings (actually, a throwback to a century ago) gained an academic label, “emerging adulthood,” and an accompanying academic journal.
This historical reversal of standardization might be attributed to cultural shifts since the 1960s: acceptance of divorce, of cohabitation, of women breaking old restraints, and so on. The extended education that at least middle-class youth pursued also contributed. But the most likely or most important factor seems to have been the economic setbacks—or, more precisely, the unequal economic setbacks—of the last few decades.
In their new study, Pais and Ray describe what they label the “Adult [Male] Attainment Project,” another way of viewing the standard life course. The American ideal, they write, is that, by about age 40, men should have attained, and sustained, these five statuses:
working (or studying);
being married;
living independently with their wives;
settled parenthood—i.e., living with their children; and
owning a home.
These attainments did, indeed, generally come as a package; adult men who had one tended to have the others. Pais and Ray found that, from 1980 through 2010, almost half of American men aged 35 to 45 had the whole package, the American Dream.
However, the proportion of American men who occupied these statuses declined from 1980 to 2010, and much more so among the less affluent. In 1980, men who were in the top two-fifths of family income were four times as likely as men in the bottom two-fifths to have completed the “adult attainment project”; in 2010, they were eight times as likely. The class gap in being married and in residing with children widened especially.
The regularization and predictability of American life that had developed over a few generations and culminated in the post-war era seem to have started unraveling, at least for the less advantaged. The era of standardization may have been a passing phase; unpredictability may be the long-term norm. Perhaps, but if so, today’s non-standard patterns must be—given that early death and similar misfortunes remain at historic lows—the result of other factors, and chief among them, it appears, is economic inequality.
Claude S. Fischer is Professor of Sociology at the University of California, Berkeley and author of Made in America. In his bimonthly BR column, Fischer explores controversial social and cultural issues using tools of sociology and history.
A giant bust of Lincoln by the artist David Adickes in a field outside of Williston, North Dakota. Credit: Shannon Stapleton/Reuters
When did the Civil War end? Many have answered never. As late as 1949, in an address at Harvard, the writer Ralph Ellison said that the war “is still in the balance, and only our enchantment by the spell of the possible, our endless optimism, has led us to assume that it ever really ended.”
Still, there was an ending of sorts, in 1865. Sometimes, it came cleanly, as with Gen. Robert E. Lee’s surrender at Appomattox on April 9. At other times, the war just seemed to give out, as soldiers melted away from their regiments and began to find their way home. Other generals in more distant theaters fought on gamely: Not until June 23 did Stand Watie, a Cherokee chief and a Confederate brigadier general, sign a cease-fire agreement at Doaksville, in what is now Oklahoma. The last Confederates of all were the farthest away: After evading capture in the North Pacific, the Confederate raider Shenandoah sailed all the way to Liverpool, where its crew surrendered on Nov. 6, the fifth anniversary of Lincoln’s election.
Then there was Abraham Lincoln’s assassination. This sickening act of violence, when added to all the others, brought a definitive feeling that an era had ended, as surely as Lincoln’s election in November 1860 had precipitated it. The funeral train that carried Lincoln’s remains home to Springfield, Ill., drew millions, and while the tragedy felt senseless, it also offered the nation a chance to mourn something much larger than the death of a single individual. To the end, Lincoln served a higher cause.
After he was laid to rest, on May 4, the armies united for an epic display of glory, worthy of Rome. Over two days, on May 23 and 24, more than 150,000 soldiers marched down Pennsylvania Avenue in Washington before a reviewing stand where President Andrew Johnson and Lt. Gen. Ulysses S. Grant stood.
That was a political as well as a military statement, for this vast army did not exactly disappear. The Grand Army of the Republic, founded in 1866, would become a potent lobbying force for veterans. Its immense gatherings helped to choose Lincoln’s successors for decades.
More than a year later, on Aug. 20, 1866, President Johnson proclaimed that final pockets of resistance in Texas were “at an end.” We could call this, too, the close of the war.
But much remained “in the balance,” as Ellison said; uncomfortable, unfinished. Certainly, the presence of so many veterans was a new fact for Americans, and kept the war alive, simmering, for decades.
More than a few required help to cope with their trauma, and the federal government, which had grown so much during the war, grew again to address their needs. It paid out pensions, it built hospitals, it maintained service records, and it assumed more responsibility for the mental and physical health of those who had given so much. That was an important precedent for the New Deal and the Great Society.
To this day, as a recent Wall Street Journal article reported, an elderly North Carolina woman, Irene Triplett, collects $73.13 a month for her father’s pension. He served in both the Confederate and Union armies: His tombstone avoids that complexity by saying simply, “He was a Civil War soldier.”
Reintegrating these former soldiers took decades. What we now regard as the best Civil War fiction, such as the work of Stephen Crane and Ambrose Bierce, did not even appear until the 1890s, as if the war’s memory was too potent at first.
A new product, Coca-Cola, was introduced in 1886 by a former Confederate officer, John Pemberton, who had been slashed by a saber in the final fighting of the war, after Appomattox, then wrestled with an addiction to morphine, to dull the pain. A pharmacist, Pemberton experimented with a mysterious formula that derived from the coca leaf and the kola nut, to ease his suffering. The early marketing for the elixir suggested that it could reduce the symptoms that veterans suffered from, including neurasthenia, headaches and impotence.
Many veterans, including Confederate officers, retained their sidearms, and weapons were easily available, thanks to an arms industry that had done great service to the Union cause. Arms makers could hardly be expected to voluntarily go out of business. With new products (like Winchester’s Model 1866 rifle), sophisticated distribution networks and a public eager to buy, the industry entered a highly profitable phase. Winchester’s repeating rifles needed hardly any time for reloading, and sold briskly in Europe, where American arms tipped the balance in local conflicts.
The Winchester was easily transported to the West, where new military campaigns were undertaken against Native Americans, and few could be blamed for wondering if the Civil War had in fact ended. Many of the same actors were present, and it could be argued that this was simply another phase of the crisis of Union, reconciling East and West, rather than North and South.
This tragic epilogue does not fit cleanly into the familiar narrative of the Civil War as a war of liberation. Peoples who had lived on ancestral lands for thousands of years were no match for a grimly experienced army, eager to occupy new lands, in part to reward the soldiers who had done the fighting.
Natives called the repeating rifles “spirit guns,” and had no answer for them. They fought courageously, but in the end had no choice but to accept relocation, often to reservations hundreds of miles away. Adolf Hitler would cite these removals as a precedent for the Nazi concentration camps.
In other ways, the war endured. The shift westward created a huge market for building products, furnishings and all of the technologies that had advanced so quickly during the fighting. One skill that amazed observers was the speed with which Americans could build railroads and the bridges that they needed to cross. Between 1865 and 1873, more than 35,000 miles of tracks were laid, greater than the entire domestic rail network in 1860.
This activity was very good for business. Huge profits were made as those who had become wealthy supplying the war effort adapted to the needs of a civilian population eager to start anew. Indeed, it is difficult to tell from the 1870 census that any war had taken place at all. The 1860 census had valued the total wealth of the United States at $16 billion; 10 years later, it was nearly twice that, $30 billion. So many immigrants came between 1860 and 1870 that the population grew 22.6 percent, to 38.5 million, despite the massive losses of war dead.
To careful observers in 1865, it was palpable that something important had already happened during the war. To organize victory, a grand consolidation had taken place, in which leading concerns had improved their organizations, crushed their smaller rivals and strengthened distribution networks. The railroad was a key part of this consolidation; so was the telegraph, often built along the tracks. Military goods needed to move quickly around the country to supply armies, and all of those skills were instantly transferable to private enterprise. One firm, an express freight delivery service founded in Buffalo, moved its goods slightly faster than the competition. It was, and is, known as American Express.
Information was vital to make all of these systems work. During the war, the Military Telegraph Corps built 8 to 12 miles of telegraph line a day, and the military alone sent 6.5 million messages. By the end of 1866, more than 80,000 miles of line existed, and these were rapidly extended into the West and South, reknitting some of the strands of Union.
Entirely new sectors of the economy had sprung up as well. In 1859, on the eve of the conflict, oil was discovered in northwestern Pennsylvania, and throughout the war, its value became clear to a war economy that urgently needed to lubricate the machinery of production. John D. Rockefeller bought a refinery in Cleveland in 1863, a major step on the way to the creation of Standard Oil. As soon as the war ended, the search for oil in new locations began: The first well in Texas was dug in 1866, in Nacogdoches County.
Many veterans, having paid so dearly for freedom, were troubled to come back from the war, only to find a new economy, dominated by industrial barons, quite a few of whom had paid substitutes to do their army service. Lincoln’s words about freedom continued to move people, but his emphasis on equality seemed to fade as the power of money rose to new heights. It was not only that a small elite had become extremely wealthy; but money itself seemed to move in new ways, fast and loose.
In other words, it was unclear to many Americans what, exactly, they had won. A great evil had been defeated; and Union forcibly defined and defended. But so rapid were the changes unleashed by the war that soldiers blinked their eyes in amazement when they returned home. Like Ulysses, the Greek hero their commander was named after, they often did not recognize the country they came back to.
Perhaps the most complicated legacy of the war was its claim to have liberated millions of African-Americans from slavery. This was not the official purpose of the war when it began in 1861, but it became so, especially after the scale of the war required a cause worthy of so great a sacrifice.
But when did slavery actually end? Was it the national ratification of the 13th Amendment, on Dec. 6, 1865? Or the day Mississippi ratified it, in 1995? Or the gift of full citizenship (including voting rights) to African-Americans? There are those who would argue that we are still waiting for that Day of Jubilee. To read the stories that came out of Ferguson, Mo., Cleveland and Baltimore in the last year — all communities that remained in the Union — is to realize how distant the victory of the Civil War feels to large numbers of African-Americans.
Of course, that does not minimize the importance of the Confederacy’s defeat. It ended forever a way of life and politics that had dominated the United States from its founding. It accelerated the demise of slavery where it still existed, in Cuba and Brazil, and encouraged liberals around the world to push for greater rights. In the fall of 1865, Victor Hugo wrote in a notebook, “America has become the guide among the nations.”
In France, Napoleon III was destabilized by Lincoln’s victory, and pulled back from his adventure in Mexico, where his puppet, Maximilian, was shot by a firing squad in 1867. Three years later, Napoleon III was himself removed from power after his defeat in the Franco-Prussian War, and the transfer of the provinces of Alsace and Lorraine to Germany left a bitterness that would fuel the world wars of the 20th century.
Without the Civil War, and its tempering of the national character, would the United States have been able to mount a great global campaign against fascism? Surely it would have been feebler, without the manufacture of war matériel across all the regions, or the rhetoric of freedom Franklin D. Roosevelt used to inspire the world.
Nearly all of the national triumphs of the last century, from the civil rights movement to the exploration of space to the birth of the digital age, stemmed from the contributions of Southerners, Northerners and Westerners working together. We have had failures too — we see them on a daily basis. But the refusal to fall apart in 1861 made a difference.
Ted Widmer is an assistant to the president for special projects at Brown University, and the editor of “The New York Times Disunion: Modern Historians Revisit and Reconsider the Civil War from Lincoln’s Election to the Emancipation Proclamation.”
The 1960s are celebrated—and loathed—as a time of political and cultural liberalization. But the decade’s legacy is ambiguous. / National Archives
Forget the Summer of Love. Forget acid, Ken Kesey, and consciousness expansion. Forget the Grateful Dead and the smell of patchouli oil. Forget everything you know about the hallowed 1960s, everything every greying, former hippie has told you about how amazing and paradigm-shifting the whole psychedelic, turn-on-tune-in-drop-out freak show was.
Forget too the bile of right-wing blowhards such as William Bennett and historian Gertrude Himmelfarb, who seem incapable of blaming America’s perceived ills on anything other than the big, bad Decade of Perdition and the narcissistic navel-gazers it allegedly spawned. Conservative pundits have blamed the ’60s for everything from Bill Clinton’s tryst with Monica Lewinsky to, as Robert Bork wrote in his 1996 book Slouching Toward Gomorrah, a “slide into a modern, high-tech version of the Dark Ages,” a Boschian neo-con delirium worthy of the worst mescaline trip.
George Will, another of those right-wing pundits, did manage, quite accidentally, to stumble upon a kernel of the truth. In a 1991 Newsweek essay excoriating Oliver Stone’s The Doors, Will describes the death of front man Jim Morrison as “a cautionary reminder of the costs of the ’60s stupidity that went by the puffed-up title of ‘counterculture.’”
Puffed up it certainly was, but the proposition that the ’60s served as the cultural turning point of the twentieth century, ten years that changed everything, has largely become an article of faith, a shibboleth for an entire Woodstock Industrial Complex of aging boomers. The decade’s icons and totems persist to this day. For example, no man—save, perhaps, a twenty-something hipster at a Halloween party—would be caught dead in a ’70s-vintage leisure suit. But tie-dyed clothing is everywhere, from the sale booths at a Dave Matthews Band concert to the runways of the Milan fashion shows.
Or try this mental exercise. Ask yourself when you last heard John Lennon’s “Imagine,” one of the world’s most popular engines of ’60s nostalgia, written by the decade’s leading secular saint. Was it last month? Last week? “Imagine” has come to signify everything the decade allegedly stood for—a quest for tolerance, peace, brotherhood, and generosity. Granted, Lennon meant well. But the irony of a man who once owned a major chunk of the Dakota—widely considered New York City’s most exclusive co-op apartment building—singing “imagine no possessions” borders on the breathtaking. To his credit, the irony wasn’t lost on Lennon. When confronted with it by a friend, the former Beatle reportedly remarked, “It’s only a bloody song.”
Perhaps the saddest irony of all is that Lennon was shot and killed by a lunatic, Mark Chapman, who believed the singer had turned his back on ’60s ideals—whatever the voices in Chapman’s head told him those ideals were. But “Imagine” is not “only a bloody song.” It is an anthem, and it celebrates everything that the 1960s failed to achieve.
• • •
The counterculture’s most enduring, most emblematic moment came in August 1969, during a large, three-day rock concert in upstate New York. The promoters stopped collecting tickets, everyone got to listen to some really cool music, and the vibe was so cosmic and peaceful that nobody so much as got into a fist fight. A memorable event, to be sure, but the keepers of the ’60s flame want so much more, from the civil rights movement to the antiwar movement, from consciousness expansion to the sexual revolution.
To credit the ’60s for the civil rights movement is an insult to that movement’s history and the long struggle for equality. Dr. Martin Luther King may have given his “I Have a Dream” speech on the National Mall in 1963, but the death of Jim Crow owes as much to the activists of the 1950s, such as Thurgood Marshall, who successfully argued Brown v. Board of Education, the 1954 Supreme Court case that began the long drive to integrate America’s schools. Or Claudette Colvin, who, as a fifteen-year-old high school student in Montgomery, Alabama in March of 1955, refused to give up her seat on a public bus to a white man. Colvin was arrested, handcuffed, and forcibly removed from the vehicle. She was followed a few months later by Rosa Parks, who also told the City of Montgomery what it could do with its Jim Crow laws and who was also arrested. Thus was born the Montgomery Bus Boycott, the first shot fired in the modern civil rights movement, which itself followed a legacy of protest dating back to the previous century.
The ’60s wasn’t the era that brought forth the Civil Rights Movement. It was the era when well-meaning white people began to notice it.
And the antiwar movement? True, Vietnam was entirely a ’60s affair. The critics were also quite correct when they called the war a hideous waste of human life and national treasure. Our presence there was predicated on policymakers’ fears that we would somehow “lose” that tiny country to Communism, and with it all of Southeast Asia. As the body count grew and the horrendous fallacies of U.S. foreign policy became all too apparent, America’s youth began to question the wisdom of the country’s leaders. Finally, an angry generation said, “Enough!”—there were protest marches, placards, and slogans, the spectacle each night on The Huntley-Brinkley Report of young men and women demanding peace and in return being gassed and beaten by the police.
Seeing this, an entire nation slowly woke up to the delusions and reckless arrogance of its rulers. The antiwar movement lit the fire, and America responded. In 1968, a year that saw more than 16,000 killed in action, voters marched to the polls and sent veteran commie-baiter and cold warrior Richard Nixon to the White House.
Indeed, one could argue that the country’s present conservative movement is the most enduring political legacy of the ’60s. Though civil rights foe Barry Goldwater—Nixon’s predecessor as Republican presidential candidate—was decisively beaten in the 1964 election, his followers refused to let the torch of right-wing extremism burn out. The ’60s saw the founding of groups such as the Young Americans for Freedom and the American Conservative Union. These groups and their devotees were, at the time, mostly considered punch lines, when they were considered at all. But with the help of William F. Buckley, his friends, and their money, these organizations and associated right-wing lobbying and media campaigns laid the groundwork for the election of Ronald Reagan in 1980.
So, while ’60s activism can’t be discounted, the record is mixed and not quite as advertised. But if the results are largely a wash, then what is left? Alas, less than the Woodstock Nation wants us to believe. Whatever the ’60s might wish to claim as a breakthrough in thought and morality, midwifed by its turned-on, tuned-in avant garde, the whole show had been reduced to a crass, corrupt parody of itself long before the clock struck midnight on December 31, 1969.
Take, for example, consciousness expansion. It all began with such promise. In the early days of 1962, we have highly regarded Harvard psychologist Timothy Leary writing to famed author and mescaline connoisseur Aldous Huxley, extolling the progress Leary was making in bringing hallucinogenic drug research into the mainstream. He tells Huxley about students writing their PhD theses on the effects of psilocybin mushrooms and proudly states that a “visionary experience”—code for an acid trip—had become de rigueur among grad students at the Andover Newton Theological Seminary, which was aiding Leary in his work.
He also tells Huxley about another experiment he is conducting, administering psilocybin to prisoners at the Massachusetts Correctional Institute:
The death-rebirth theme is the center of attention. We are experimenting (collaboratively with the advance joint assistance of the convicts) with more systematic ad hoc rituals in the prisons. Next Monday we are running a last judgment–rebirth sequence for four convicts. The therapeutic force of this approach is astounding.
Leary doesn’t explain in the letter what a “last judgment–rebirth sequence” entails or why it proved so salutary to the participants, but he would later claim reduced recidivism rates among the prisoners in his experiment. However, a follow-up examination of Leary’s work conducted in the late 1990s found no difference in recidivism among the convicts treated with magic mushrooms as compared to Massachusetts ex-prisoners as a whole.
There would be none of Leary’s high-minded vision questing on display a few years later when, in 1967, at the height of the famed Summer of Love, Beatle George Harrison and his wife Pattie Boyd took a stroll through San Francisco’s Haight-Ashbury district. The two had dropped acid themselves that afternoon, and decided to go off with several friends to see the hippies and groove on the expected good vibes. In a television interview, Harrison recalled:
We were expecting Haight-Ashbury to be this brilliant place. I thought it was going to be all these groovy, kind of gypsy kind of people with little shops making works of art and paintings and carvings. But instead it turned out to be just a lot of bums. Many of them were just very young kids who’d come from all over America and dropped acid and gone to this mecca of LSD. . . . It certainly showed me what was really happening in the drug culture. It wasn’t . . . all these groovy people having spiritual awakenings and being artistic. It was like any addiction.
Describing the same incident in her 2007 autobiography, Boyd said the crowd grew hostile after Harrison was offered more drugs and turned them down, prompting the two to beat a hasty retreat to their limo. They left San Francisco later that night, and Harrison said in the interview that he never partook of psychedelics again.
By the 1970s, cocaine was ubiquitous, heroin was finding a larger audience, and the pretense of drugs as a path to a higher spiritual plane was largely gone. The first year of the decade saw the deaths of Janis Joplin and Jimi Hendrix, the former overdosing on smack, the latter choking on his own vomit after mixing pills and alcohol. The aforementioned Jim Morrison would die of a heroin overdose in a Paris bathtub the next year.
But what about the sexual revolution? One need only Google “erotic Greek pottery” or “Pompeian wall paintings” to see that free love, open marriage, homosexuality, group sex, sado-masochism, etc. have long been with us. While it is true that reliable oral contraception—the pill—became available by prescription in 1960, reasonably trustworthy methods of birth control, such as condoms, had been available since the first half of the century, the only potential obstacle to their purchase a derisive scowl from the local pharmacist. Hence writer Henry Miller, describing his youth in the early 1920s during a brief appearance in the 1981 film Reds, could say, “There was just as much fucking going on then as now.”
Yet many continue to see the ’60s as America’s defining moment of sexual liberation. That the decade had a tremendous advantage simply by coming after the girdled-and-crewcut 1950s, ten years of nation-wide uptightness on a scale unseen since Victorian-era Britain, is seldom noted. More to the point, though, any evidence that the ’60s set us free from the chains of sexual repression and inhibition is murky and anecdotal, at best. The evidence that it did nothing of the sort is considerably stronger.
In 1970, Albert Klassen and his colleagues at the Kinsey Institute at Indiana University conducted a nationwide poll, which found that roughly 75 to 90 percent of the nation still felt that homosexuality, extra-marital sex, and pre-marital sex involving both teens and adults were always or almost always wrong. Even masturbation took a hit, with just under half of both men and women labeling the practice as always or almost always wrong. These results were recently affirmed by the Institute’s Thomas G. Albright, who re-tabulated the data.
For folks born in the 1940s, who would have been entering early adulthood at some point during the 1960s, Klassen put the total number of lifetime sexual partners at roughly six for males, four for females. Only 3 percent of women polled managed more than ten partners. Klassen summarized the findings by noting that, if there had been some kind of sexual revolution during the ’60s, his research had unearthed little evidence of it.
[Photo: Today the ’60s are associated primarily with counterculture entertainment, but mainstream artists such as Herb Alpert were massively popular at the time. Public domain photo via Wikimedia Commons]
One explanation for this grand misperception may lie with ’60s mainstream entertainment, which helped take the commercialization of sex to an all-time high. Not that sex started selling then—it always had, of course—but modern mass media, particularly television, proved very effective at bringing miniskirts and go-go boots into America’s living rooms.
One nudge-wink example was the popular ABC series The Dating Game. Premiering in 1965, the show hooked up eligible, attractive young single men and women for what was billed as the ultimate blind date. The winning couple was shipped off for a week of implied carnality in Puerto Vallarta, Mexico, all expenses paid.
On the show, the main contestant would put questions to three unseen prospects of the opposite sex, hidden from his or her view behind a wall running down the middle of the set. The questions were scripted, mainly to keep the bachelors from asking the bachelorettes the most obvious questions about breast size or number of sexual partners. Instead, the show’s writers would devise queries brimming with double entendres and not-so-subtle innuendo.
Q: Bachelorette Number Three, if you were a flavor of ice cream, what flavor would you be?
A: (giggle) Cherry.
The show’s background music was provided by trumpeter Herb Alpert and his band the Tijuana Brass. The tunes were vibrant, fresh and effervescent, in grand symbiosis with the youth on display. Though largely forgotten now, the band was at one point the musical face of the Swingin’ Sixties in the United States, outselling even The Beatles in 1966. The Tijuana Brass also laid claim to a memorable piece of sexploitation of their own, with their fourth album Whipped Cream and Other Delights. Released in 1965, the cover featured a photograph of a voluptuous brunette covered in whipped cream, holding a single red rose and looking into the camera with a classic come-hither gaze. Sultry and seductive, it was an image worthy of a Playboy spread and, for a while, just as likely to be found in any well-appointed bachelor pad as was Hefner’s publication.
But as the Kinsey study found, though sexual references and imagery were exploding on television and album covers, in magazines and movies, those pads were rarely rocking.
• • •
Alpert’s former popularity as a mainstream entertainer—his music eclipsed by the memory of such immortals as The Who, Joplin, and Hendrix—should serve as a reminder of how few Americans actually participated in the counterculture. Max Yasgur’s farm held about 500,000 people, a tiny tribe when compared to Americans at large, most of whom couldn’t tell Jerry Garcia from Bigfoot. For the country’s masses, blended scotch and Pabst Blue Ribbon were the drugs of choice, not pot and psychedelics. Hefner’s Playboy Mansion trumped the outdoor rock festival as the ultimate symbol of sybaritic abandon. Acapulco, not Haight-Ashbury, was the hip, happening destination. The Cadillac and the Ford Mustang ruled the highways of our great nation, running the VW Microbus off the road.
The idea of the ’60s as ground zero for a massive cultural shift also becomes suspect when one considers how anomalous the decade was economically. It was ten years of wondrous material plenty, unlike any the republic had previously seen. America experienced both an exceptionally prolonged period of economic expansion and some of the lowest sustained unemployment numbers in the twentieth century. Though few would want to admit it now, much of what came out of those ten years wasn’t prompted by acid-induced vision quests or transcendental meditation. It was purchased through America’s increased affluence, particularly the affluence of its young, who constituted a new consumer class.
On both the left and the right, however, we continue to believe a fifty-year-old press release, minting bespoke memories of the ’60s tailored to whatever ideology we happen to champion. A Pew poll conducted in 1999, trying to gauge whether there was a discernible collective memory of the twentieth century, found that the ’60s had made the strongest impression on the national psyche of any decade before or after. “The collective memory of this important epoch,” the researchers determined, was “American Cultural Revolution.”
A truer, sadder epitaph for the era is provided by John Sebastian, who played a solo set at Woodstock and was lead singer of The Lovin’ Spoonful. In When the Music Mattered (1983), by rock journalist Bruce Pollock, Sebastian says:
I think we are devourers of our own culture and cannibalized a lot of things that could have happened out of Woodstock. A media culture can absorb and regurgitate stuff so fast that it loses meaning almost before it’s out of the pot. Somehow every mood that was created was suddenly turned into a marketable item. I regret that more of the spirit that existed at that point in time could not carry over to the sort of cocaine-and-glitter thing that filled the void once it was gone.
And you, dewy-eyed young person with your tie-dyed T-shirt and iPod full of Grateful Dead MP3s, I fear you will always look upon your own era and somehow find it lacking. A great pity, that.
Just remember to forget that Jerry Rubin, co-founder of the Yippies, went on to become a shill for snake-oil vendor Werner Erhard and his est seminars. Forget that Black Panthers leader Eldridge Cleaver became a conservative Republican and endorsed Ronald Reagan. Forget that Jane Fonda had a boob job. The ’60s will always be whatever we say it is, regardless of what may have actually happened. That is why the song is called “Imagine.”
Hal Stucker is a writer and photographer. His work has appeared in Wired, the New York Times, Photo District News, and the book Black Star: 60 Years of Photojournalism. His Boston Review story “Strapped” was included in Best American Essays 2014.