Like many young Americans in 1969, Meredith Hunter was a fan of rock ‘n’ roll. When news spread that the Rolling Stones were playing a massive free concert at California’s Altamont Motor Speedway, Hunter, who was black, made plans to attend with his white girlfriend. But his sister, Dixie, protested. She later recalled, “It was a time when black men and white women were not supposed to be together.” Their home, Berkeley, was more tolerant but, she explained, “things [were] different in Berkeley than the outskirts of town.” She feared what might happen.
Meredith went anyway. He joined 300,000 others eager to hear classic sixties bands—Jefferson Airplane, the Grateful Dead, and, of course, the Rolling Stones—for free. Altamont was to climax the Stones’ first American tour in three years and would feature in the documentary (later released as Gimme Shelter) recording it, but the concert was a disorganized disaster. Inadequate sanitation, a horrid sound system, and tainted drugs contributed to a tense and uneasy atmosphere. The Hell’s Angels biker gang was paid $500 in beer to serve as the show’s “security team.”
High on dope and armed with sawed-off pool cues, the Angels indiscriminately beat concert-goers who tried to come on the stage. One of those was Meredith Hunter. High on methamphetamines, Hunter approached the stage multiple times and, growing agitated, brandished a revolver. He was promptly stabbed to death by an Angel and his lifeless body was stomped into the ground. The Stones, unaware of the murder just a few feet away, continued jamming “Sympathy for the Devil.”
If the more famous Woodstock music festival typified an idyllic sixties youth culture, Altamont revealed a darker side of American culture, one in which drugs and music were associated not with peace and love but with violence, anger, and death. While many Americans continued to celebrate the political and cultural achievements of the 1960s, a more anxious, conservative mood afflicted many others. For some, the United States had not gone nearly far enough to promote greater social equality. For others, the nation had gone too far, had unfairly trampled the rights of one group to promote the selfish needs of others. Onto these brewing dissatisfactions the 1970s dumped the divisive remnants of a failed war, the country’s greatest political scandal, and an intractable economic crisis. To many, it seemed as if the nation stood ready to unravel.
Perhaps no single issue contributed more to public disillusionment than the Vietnam War. The “domino theory”—the idea that if a country fell to communism, then neighboring states would soon follow—governed American foreign policy. After the communist takeover of China in 1949, the United States financially supported the French military’s effort to retain control over its colonies in Vietnam, Cambodia, and Laos. But the French were defeated in 1954, and Vietnam was divided into the communist North and the anti-communist South.
The American public remained largely unaware of Vietnam in the early 1960s, even as President John F. Kennedy deployed over sixteen thousand military advisers to help South Vietnam suppress a domestic communist insurgency. This all changed in 1964, when Congress passed the Gulf of Tonkin Resolution after a minor naval episode involving American and North Vietnamese forces. The Johnson administration distorted the incident to provide a pretext for escalating American involvement in Vietnam. The resolution gave the president a broad mandate to wage war in Vietnam without a formal declaration. Only two senators opposed it.
The first combat troops arrived in South Vietnam in 1965, and as the situation deteriorated the Johnson administration escalated American involvement. Soon hundreds of thousands of troops were deployed. Stalemate, body counts, hazy war aims, and the draft all catalyzed the anti-war movement and triggered protests throughout the United States and Europe. With no end in sight, protesters burned their draft cards, refused to pay income taxes, occupied government buildings, and delayed trains loaded with war materials. By 1967, anti-war demonstrations drew crowds in the hundreds of thousands. In one protest, hundreds were arrested after surrounding the Pentagon.
Vietnam was the first “living room war.” Television, print media, and liberal access to the battlefield provided unprecedented coverage of the war’s brutality. Americans confronted grisly images of casualties and atrocities. In 1965, CBS Evening News aired a segment in which United States Marines burned the South Vietnamese village of Cam Ne with little apparent regard for the lives of its occupants, who had been accused of aiding Viet Cong guerrillas. President Johnson berated the head of CBS, yelling “Your boys just shat on the American flag.”
While the U.S. government imposed no formal censorship on the press during Vietnam, the White House and military nevertheless used press briefings and interviews to paint a positive image of the war effort. The United States was winning the war, officials claimed. They cited numbers of enemies killed, villages secured, and South Vietnamese troops trained. American journalists in Vietnam, however, quickly realized the hollowness of such claims (the press referred to the afternoon press briefings in Saigon as “the Five O’Clock Follies”). Editors frequently toned down their reporters’ pessimism, often citing conflicting information received from their own sources, who were typically government officials. But the evidence of a stalemate mounted. American troop levels climbed yet victory remained elusive. Stories like CBS’s Cam Ne piece exposed the “credibility gap,” the yawning chasm between the claims of official sources and the reality on the ground in Vietnam.

Nothing did more to expose this gap than the 1968 Tet Offensive. In January, communist forces engaged in a coordinated attack on more than one hundred American and South Vietnamese sites throughout South Vietnam, including the American embassy in Saigon. While U.S. forces repulsed the attack and inflicted heavy casualties on the Viet Cong, Tet demonstrated that, despite repeated claims by administration officials, after years of war the enemy could still strike at will anywhere in the country. Subsequent stories and images eroded public trust even further. In 1969, investigative reporter Seymour Hersh revealed that U.S. troops had massacred hundreds of civilians in the village of My Lai. Three years later, Americans cringed at Nick Ut’s wrenching photograph of a naked Vietnamese child fleeing an American napalm attack. More and more American voices came out against the war.
Reeling from the war’s growing unpopularity, on March 31, 1968, President Johnson announced on national television that he would not seek reelection. Eugene McCarthy and Robert F. Kennedy unsuccessfully battled against Johnson’s vice president, Hubert Humphrey, for the Democratic Party nomination (Kennedy was assassinated in June). At the Democratic Party’s national convention in Chicago, local police brutally assaulted protestors on national television. Republican challenger Richard Nixon ran on a platform of “law and order” and a vague plan to end the war. Well aware of domestic pressure to wind down the war, Nixon sought, on the one hand, to appease anti-war sentiment by promising to phase out the draft, train South Vietnamese troops, and gradually withdraw American troops. He called it “Vietnamization.” At the same time, however, Nixon appealed to the so-called “silent majority” of Americans who still supported the war and opposed the anti-war movement by calling for an “honorable” end to the war (he later called it “peace with honor”). He narrowly edged Humphrey in a closely fought fall election.
Public assurances of American withdrawal, however, masked a dramatic escalation of conflict. Looking to incentivize peace talks, Nixon pursued a “madman strategy” of attacking communist supply lines across Laos and Cambodia, hoping to convince the North Vietnamese that he would do anything to stop the war. Conducted without public knowledge or Congressional approval, the bombings failed to spur the peace process, and talks stalled before the American-imposed November 1969 deadline. News of the attacks renewed anti-war demonstrations. In 1970, police and National Guard troops killed six students in separate protests at Jackson State University in Mississippi and, more famously, Kent State University in Ohio.
Another three years passed—and another twenty thousand American troops died—before an agreement was reached. After Nixon threatened to withdraw all aid and promised to enforce a treaty militarily, the North and South Vietnamese governments signed the Paris Peace Accords in January 1973, marking the official end of U.S. force commitment to the Vietnam War. Peace was tenuous, and when war resumed North Vietnamese troops quickly overwhelmed southern forces. By 1975, despite nearly a decade of direct American military engagement, Vietnam was united under a communist government.
The fate of South Vietnam illustrates Nixon’s ambivalent legacy in American foreign policy. Though committed to “peace with honor,” Nixon prolonged the war and widened its impact. Nixon and other Republicans later blamed the media for America’s defeat, arguing that negative reporting undermined public support for the war. In 1971, the Nixon administration tried unsuccessfully to sue the New York Times and the Washington Post to prevent the publication of the Pentagon Papers, a confidential and damning history of U.S. involvement in Vietnam that was commissioned by the Defense Department and later leaked. Nixon faced a rising tide of congressional opposition to the war, led by prominent senators such as William Fulbright. Congress asserted unprecedented oversight of American war spending. And in 1973, Congress passed the War Powers Resolution, which dramatically reduced the president’s ability to wage war without congressional consent.
The Vietnam War profoundly shaped domestic politics. Moreover, it poisoned Americans’ perceptions of their government and its role in the world. And yet, while the anti-war demonstrations attracted considerable media attention and stand as a hallmark of the sixties counterculture so popularly remembered today, many Americans nevertheless continued to regard the war as just. Wary of the rapid social changes that reshaped American society in the 1960s and worried that anti-war protests further threatened an already tenuous civil order, a growing number of Americans criticized the protests and moved closer to a resurgent American conservatism that brewed throughout the 1970s.
III. The Politics of Love, Sex, and Gender
Many looked optimistically at what the seventies might offer. Some hoped, like George Clinton’s funk band Funkadelic, that Americans might dance together under a disco glitter ball as “one nation under a groove.” Feminists, gay men, lesbians, and married couples across the country carried the sexual revolution further. As many rejected the monogamy and rigid gender roles at the heart of the nuclear family, American women had fewer children, cohabitation without marriage spiked, straight couples married later (if at all), and divorce levels climbed. Sexuality, decoupled from marriage and procreation, was transformed into a source of personal fulfillment and a worthy political cause.
At the turn of the decade, sexuality was considered a private matter, but one closely linked to civil rights. American law defined legitimate sexual expression within the confines of patriarchal, procreative, middle-class marriage. Interracial marriage was illegal in many states until 1967 and remained largely taboo throughout the 1970s, while government-led sterilization programs threatened the reproductive freedom of poor women of color. Same-sex intercourse and cross-dressing were criminalized in most states, and gay men, lesbians, and transgender people were vulnerable to violent police enforcement as well as discrimination in housing and employment.
Two landmark legal rulings in 1973 established the battle lines for the “sex wars” of the 1970s: First, the Supreme Court’s 7–2 ruling in Roe v. Wade struck down a Texas law that prohibited abortion in all cases when a mother’s life was not in danger. The Court’s decision built upon precedent from a 1965 ruling that, in striking down a Connecticut law prohibiting married couples from using birth control, recognized a constitutional “right to privacy.” In Roe, the Court reasoned that “this right of privacy . . . is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.” The Court held that states could not interfere with a woman’s right to an abortion during the first trimester of pregnancy and could only fully prohibit abortions during the third trimester. Other Supreme Court rulings, however, held that sexual privacy could be sacrificed for the sake of “community” good. Another 1973 decision, Miller v. California, held that the First Amendment did not protect “obscene” material, defined by the Court as anything with sexual appeal that lacked “serious literary, artistic, political, or scientific value.” The ruling expanded states’ abilities to pass laws prohibiting materials like hardcore pornography. State laws were unevenly enforced, however, and pornographic theaters and sex shops proliferated. Americans debated whether these were immoral atrocities, the “vanguard of a new ‘pansexual’ utopia,” as one bathhouse owner called it, or “the ultimate conclusion of sexist logic,” as poet and lesbian feminist Rita Mae Brown charged.
Furthermore, new laws prohibiting employment discrimination increased opportunities for women to make a living outside of the home and marriage. Women—haltingly and with significant disparities—advanced into traditional male occupations, into politics, and into corporate management.
The seventies saw the reform of divorce law. Between 1959 and 1979 the American divorce rate doubled. Close to half of all marriages formed in the 1970s ended in divorce. The stigma attached to divorce evaporated and American culture encouraged individuals to leave abusive or unfulfilling marriages. Before 1969, most states required one spouse to prove that the other was guilty of a specific offense, such as adultery. The difficulty of getting a divorce under this system encouraged widespread lying in divorce courts. Even couples desiring an amicable split were sometimes forced to claim that one spouse had cheated on the other even if neither (or both) had. Other couples temporarily relocated to states with more lenient divorce laws, such as Nevada. Widespread recognition of such practices prompted reforms. In 1969, California adopted the first no-fault divorce law. By the end of the 1970s, almost every state had adopted some form of no-fault divorce. The new laws allowed for divorce on the basis of “irreconcilable differences,” even if only one party felt that he or she could not stay in the marriage.
As straight couples eased the bonds of matrimony, gay men and women negotiated a harsh world that stigmatized homosexuality as a mental illness or depraved immoral act. Building upon postwar efforts by gay rights organizations to bring homosexuality into the mainstream of American culture, young gay activists of the late sixties and seventies began to challenge what they saw as the conservative gradualism of the “homophile” movement. Inspired by the burgeoning radicalism of the Black Power movement, the New Left protests of the Vietnam War, and the counterculture movement for sexual freedom, gay and lesbian activists agitated for a broader set of sexual rights that emphasized an assertive notion of “liberation” rooted not in mainstream assimilation, but in pride of sexual difference.
Perhaps no single incident did more to galvanize gay and lesbian activism than the 1969 uprising at the Stonewall Inn in New York City’s Greenwich Village. Police regularly raided gay bars and hangouts. But when police raided the Stonewall in June 1969, the bar patrons protested and sparked a multi-day street battle that catalyzed a national movement for gay liberation. Seemingly overnight, calls for homophile respectability were replaced with chants of “Gay Power!”
In the seventies, gay activists attacked a popular culture that demanded they keep their sexuality hidden. Activists urged gay Americans to “come out,” and gay rights organizations cited statistics showing that secrecy contributed to stigma and that “coming out” could reduce suicide rates. All movements, however, proceed haltingly. Transgender people were often banned from participating in Gay Pride rallies and lesbian feminist conferences, and they, in turn, mobilized to fight the high incidence of rape, abuse, and murder of transgender people. Activists now declared “all power to Trans Liberation.”
Throughout the following years, gay Americans gained unparalleled access to private and public spaces. A step towards the “normalization” of homosexuality occurred in 1973, when the American Psychiatric Association stopped classifying homosexuality as a mental illness. Pressure mounted on politicians. In 1982, Wisconsin became the first state to ban discrimination based on sexual orientation and more than eighty cities and nine states followed over the following decade. But progress proceeded unevenly and gay Americans continued to suffer hardships from a hostile culture.
As events in the 1970s broadened sexual freedoms and promoted greater gender equality, so too did they generate sustained and organized opposition. Evangelical Christians and other moral conservatives, for instance, mobilized to reverse gay victories. In 1977, a campaign in Dade County, Florida, used the slogan “Save Our Children” to overturn an ordinance banning discrimination based on sexual orientation. A leader of the brewing religious right, Jerry Falwell, said in 1980 that “It is now time to take a stand on certain moral issues. . . . We must stand against the Equal Rights Amendment, the feminist revolution, and the homosexual revolution. We must have a revival in this country.”
The most stunning conservative counterattack of the seventies was the defeat of the Equal Rights Amendment (ERA). Versions of the Amendment, which declared, “Equality of rights under the law shall not be denied or abridged by the United States or any state on account of sex,” had been introduced in Congress nearly every year since 1923. It finally passed amid the revolutions of the sixties and seventies and went to the states for ratification in March 1972. With high approval ratings, the ERA seemed destined to swiftly pass through state legislatures and become the Twenty-Seventh Amendment. Hawaii ratified the Amendment the same day it was passed by Congress. Within a year, thirty states had done likewise. But then the Amendment stalled, and additional ratifications came only slowly. In 1977, Indiana became the thirty-fifth and last state to ratify.
By 1977, anti-ERA forces had gathered and deployed their strength against the Amendment. At a time when many women shared Betty Friedan’s frustration that society seemed to confine women to the role of homemaker, Phyllis Schlafly’s STOP ERA organization (“Stop Taking Our Privileges”) trumpeted the value and advantages of homemakers and mothers. Schlafly worked tirelessly to stifle the ERA. She lobbied legislators and organized counter-rallies to ensure that Americans heard “from the millions of happily married women who believe in the laws which protect the family and require the husband to support his wife and children.” The Amendment had needed only three more states for ratification. It never got them. In 1982 the ratification crusade expired.
The failed battle for the ERA uncovered the limits of the feminist crusade. And it illustrated the women’s movement’s inherent incapacity to represent fully the views of fifty percent of the country’s population, a population riven by class differences, racial disparities, and cultural and religious divisions.
IV. Race and Social and Cultural Anxieties
The lines of race and class and culture ruptured American life throughout the 1970s. Americans grew disenchanted with the pace of social change: it was insufficient, some said; it was excessive, said others. The idealism of the 1960s died. Alienation took its place.
As the monolith of American culture shattered—a monolith pilloried in the fifties and sixties as exclusively white, male-dominated, conservative, and stifling—the culture seemed to fracture and Americans retreated into tribal subcultures. Mass culture became segmented. Marketers targeted particular products to ever smaller pieces of the population, including previously neglected groups such as African Americans, who, despite continuing inequality, acquired more disposable income. Subcultures often revolved around certain musical styles, whether pop, disco, hard rock, punk rock, country, or hip-hop. Styles of dress and physical appearance likewise aligned with cultures of choice.
If the popular rock acts of the sixties appealed to a new counterculture, the seventies witnessed the resurgence of cultural forms that appealed to a white working class confronting the social and political upheavals of the 1960s. Country hits such as Merle Haggard’s “Okie from Muskogee” evoked simpler times and places where people “still wave Old Glory down at the courthouse” and “don’t let our hair grow long and shaggy like the hippies out in San Francisco.” A popular television sitcom, All in the Family, became an unexpected hit among “middle America.” The main character, Archie Bunker, was designed to mock reactionary middle-aged white men. “Isn’t anyone interested in upholding standards?” he lamented in an episode dealing with housing integration. “Our world is coming crumbling down. The coons are coming!”
As Bunker knew, African Americans were becoming much more visible in American culture. While black cultural forms had been prominent throughout American history, they assumed new popular forms in the 1970s. Disco offered a new, optimistic, racially integrated pop music. Behind the scenes, African American religious styles became an outsized influence on pop music. Musicians like Aretha Franklin, Andraé Crouch, and “fifth Beatle” Billy Preston brought their background in church performance to their own recordings as well as to the work of white artists like the Rolling Stones, with whom they collaborated. And by the end of the decade African American musical artists had introduced American society to one of the most significant musical innovations in decades: the Sugarhill Gang’s 1979 record, “Rapper’s Delight.” A lengthy paean to black machismo, it became the first rap single to reach the Top 40.
Just as rap represented a hyper-masculine black cultural form, Hollywood popularized its white equivalent. Films such as 1971’s Dirty Harry captured a darker side of the national mood. Clint Eastwood’s titular character exacted violent justice on clear villains, working within the sort of brutally simplistic ethical standard that appealed to Americans anxious about a perceived breakdown in “law and order” (more than one critic slammed the film’s glorified “fascism”) and the need for violent reprisals.
Violence increasingly marked American race relations. No longer confined to the anti-black terrorism that struck the southern civil rights movement in the 1950s and 1960s, violence now broke out across the country among blacks in urban riots and among whites protesting new civil rights programs. In the mid-1970s, for instance, protests over the use of busing to integrate public schools in Boston erupted in violence among whites and blacks.
Racial violence in the nation’s cities tainted many white Americans’ perception of the civil rights movement and urban life in general. Civil unrest broke out across the country, but the riots in Watts/Los Angeles (1965), Newark (1967), and Detroit (1967) were most shocking. In each, a physical altercation between white police officers and African Americans spiraled into days of chaos and destruction. Tens of thousands participated in urban riots. Many looted and destroyed white-owned businesses. There were dozens of deaths, tens of millions of dollars in property damage, and an exodus of white capital that only further isolated urban poverty.
In 1967, President Johnson appointed the Kerner Commission to investigate the causes of America’s riots. Their report became an unexpected bestseller. The Commission cited black frustration with the hopelessness of urban poverty. As the head of the black National Business League testified, “It is to be more than naïve—indeed, it is a little short of sheer madness—for anyone to expect the very poorest of the American poor to remain docile and content in their poverty when television constantly and eternally dangles the opulence of our affluent society before their hungry eyes.” A Newark rioter who looted several boxes of shirts and shoes put it more simply: “They tell us about that pie in the sky but that pie in the sky is too damn high.” But white conservatives blasted the conclusion that white racism and economic hopelessness were to blame for the violence. African Americans wantonly destroying private property, they said, was not a symptom of America’s intractable racial inequalities, but the logical outcome of a liberal culture of permissiveness that tolerated, even encouraged, nihilistic civil disobedience. Many moderates and liberals, meanwhile, saw the explosive violence as a sign African Americans had rejected the nonviolent strategies of the civil rights movement.
The unrest of the late ’60s did, in fact, reflect a real and growing disillusionment among African Americans with the fate of the civil rights crusade. Political achievements such as the 1964 Civil Rights Act and the 1965 Voting Rights Act were indispensable legal preconditions for social and political equality, but the movement’s long (and now often forgotten) goal of economic justice proved as elusive as ever. In 1968, Martin Luther King, Jr. organized the Poor People’s Campaign, a multi-racial struggle to uproot America’s entrenched poverty. “I worked to get these people the right to eat cheeseburgers,” King had supposedly said to Bayard Rustin as they toured the devastation in Watts some years earlier, “and now I’ve got to do something…to help them get the money to buy it.” What good was the right to enter a store without money for purchases?
V. Deindustrialization and the Rise of the Sunbelt
Though black leaders like King and Rustin denounced urban violence, they recognized the frustrations that fueled it. In the still-moldering ashes of Jim Crow, African Americans in Watts and similar communities across the country bore the burdens of lifetimes of legally sanctioned discrimination in housing, employment, and credit. The inner cities had become traps that too few could escape.
Segregation survived the legal dismantling of Jim Crow. The persistence to the present day of stark racial and economic segregation in nearly all American cities belies any simple distinction between southern “de jure” segregation and non-southern “de facto” segregation.
Meanwhile, whites and white-owned businesses fled the inner cities, depleted municipal tax bases, and left behind islands of poverty. This flight of people and capital was felt most acutely in the deindustrializing cities of the Northeast and Midwest. Few cases better illustrate these transformations than Detroit. As the automobile industry expanded and especially as the United States transitioned to a wartime economy during World War II, Detroit boomed. When auto manufacturers like Ford and General Motors converted their assembly lines to build machines for the American war effort, observers dubbed the city the “arsenal of democracy.” Newcomers from around the country flooded the city looking for work. Between 1940 and 1947, manufacturing employment increased by 40 percent, and between 1940 and 1943 the number of unemployed workers fell from 135,000 to a mere four thousand. Thanks to New Deal labor legislation and the demands of war, unionized workers in Detroit and elsewhere enjoyed secure employment and increased wages. A vast middle class populated a thriving city with beautiful public architecture, theaters, and libraries.
Workers made material gains throughout the 1940s and 1950s. During the so-called “Great Compression,” Americans of all classes shared in postwar prosperity. A highly progressive tax system and powerful unions lowered income inequality. Rich and poor advanced together. Working-class standards of living nearly doubled between 1947 and 1973, and unemployment continually fell.
But general prosperity masked deeper vulnerabilities. After the war automobile firms began closing urban factories and moving to outlying suburbs. Several factors fueled the process. Some cities partly deindustrialized themselves. Municipal governments in San Francisco, St. Louis, and Philadelphia banished light industry to make room for high-rise apartments and office buildings. Mechanization seemed to contribute to the decline of American labor. A manager at a newly automated Ford engine plant in postwar Cleveland captured the interconnections between these concerns when he glibly noted to United Automobile Workers (UAW) president Walter Reuther, “you are going to have trouble collecting union dues from all of these machines.” More importantly, however, manufacturing firms sought to lower labor costs by automating, downsizing, and relocating to areas with “business friendly” policies such as low tax rates, anti-union “right-to-work” laws, and low wages.
Detroit began to bleed industrial jobs. Between 1950 and 1958, Chrysler cut its Detroit production workforce in half. In the years between 1953 and 1959, East Detroit lost ten plants and over seventy-one thousand jobs. Detroit was a single-industry town, built upon the auto industry, and decisions made by the “Big Three” automakers therefore reverberated across the city’s industrial landscape. When auto companies mechanized or moved their operations, ancillary suppliers such as machine tool companies were cut out of the supply chain and were likewise forced to cut their own workforces. Between 1947 and 1977, the number of manufacturing firms in the city dropped from 3,272 to fewer than two thousand. The labor force was gutted. Manufacturing jobs fell from 338,400 to 153,000 over the same three decades.
Industrial restructuring hurt all workers, and though many middle-class blacks managed to move out of the city’s ghettoes, deindustrialization fell heaviest on the city’s African Americans. By 1960, 19.7 percent of black autoworkers in Detroit were unemployed, compared to just 5.8 percent of whites. Overt discrimination in housing and employment had for decades confined blacks to segregated neighborhoods where they were forced to pay exorbitant rents for slum housing. Subject to residential intimidation and cut off from traditional sources of credit, few blacks could afford to follow industry as it left the city for the suburbs and other parts of the country. Detroit devolved into a mass of unemployment, crime, and crippled municipal resources. When riots rocked Detroit in 1967, 25 to 30 percent of blacks between the ages of eighteen and twenty-four were unemployed.
Deindustrialization went hand in hand with the long assault on unionization that had begun in the aftermath of World War II. Without the political support they had enjoyed during the New Deal years, unions such as the Congress of Industrial Organizations (CIO) and the United Auto Workers (UAW) shifted tactics and accepted labor-management accords in which cooperation, not agitation, was the strategic objective. This accord had mixed results for workers. On the one hand, management encouraged employee loyalty through privatized welfare systems that offered workers health benefits and pensions. Grievance arbitration and collective bargaining also allowed workers official channels in which to criticize and push for better conditions. At the same time, unions became increasingly weighed down by bureaucracy and corruption. Union management came to hold primary influence in what was ostensibly a “pluralistic” power relationship, and workers—though still willing to protest—by necessity pursued a more moderate agenda compared to the union workers of the 1930s and ’40s.
The decline of labor coincided with ideological changes within American liberalism. Labor and its political concerns undergirded Roosevelt’s New Deal coalition, but by the 1960s many liberals had forsaken working-class politics. More and more saw poverty as stemming not from structural flaws in the national economy, but from the failure of individuals to take full advantage of the American system. For instance, while Roosevelt’s New Deal might have attempted to rectify unemployment with government jobs, Johnson’s Great Society and its imitators funded government-sponsored job training, even in places without available jobs. Union leaders in the ’50s and ’60s typically supported such programs and philosophies.
Widely shared postwar prosperity leveled off and began to retreat by the mid-1970s. Growing international competition, technological inefficiency, and declining productivity gains stunted working- and middle-class wages. As the country entered recession, wages decreased and the pay gap between workers and management began its long widening. The tax code became less progressive and labor lost its foothold in the marketplace. Unions represented a third of the workforce in the 1950s, but by 2006 only one in ten workers belonged to one.
Geography dictated much of labor’s fall. American firms fled from pro-labor states in the 1970s and 1980s. Some went overseas in the wake of new trade treaties to exploit low-wage foreign workers, but others turned to the anti-union states in the South and West stretching from Virginia to Texas to southern California. Factories shuttered in the North and Midwest, and by the 1980s commentators had dubbed America’s former industrial heartland the “Rust Belt.”
Coined by journalist Kevin Phillips in 1969, the “Sun Belt” refers to the swath of southern and western states that saw unprecedented economic, industrial, and demographic growth after World War II. During the New Deal, President Franklin D. Roosevelt declared the American South “the nation’s No. 1 economic problem” and injected massive federal subsidies, investments, and military spending into the region. During the Cold War, Sun Belt politicians lobbied hard for military installations and government contracts for their states.
Meanwhile, the region’s hostility toward labor beckoned corporate leaders. The Taft-Hartley Act of 1947 facilitated southern states’ frontal assault on unions. Thereafter, cheap, nonunionized labor, low wages, and lax regulations pulled northern industries away from the Rust Belt. Skilled northern workers followed the new jobs southward and westward, lured by cheap housing and a warm climate slowly made more tolerable by modern air conditioning.
The South attracted business but struggled to share the profits. Middle-class whites grew prosperous, but often these were recent transplants, not native southerners. As the cotton economy shed farmers and laborers, poor white and black southerners found themselves mostly excluded from the fruits of the Sun Belt. Public investments were scarce: white southern politicians channeled federal funding away from primary and secondary public education and toward high-tech industry and university-level research. The Sun Belt inverted Rust Belt realities: the South and West brought growing numbers of high-skill, high-wage jobs but lacked the social and educational infrastructure needed to supply the native poor and middle classes with those same jobs.
Although massive federal investments sparked the Sun Belt’s explosive growth, the New Right took its firmest hold there. The South ran rife with conservative religious ideas, which it exported westward. The leading figures of the nascent religious right rose to prominence in the Sun Belt. Moreover, business-friendly politicians successfully synthesized conservative Protestantism and free-market ideology, creating a potent new political force.
Sunbelt cities were automobile cities. They sprawled across the landscape, and public space was more limited than in older, denser cities. Politics often revolved around suburban life. Housewives organized reading groups in their homes, and from those reading groups sprouted new organized political activities. Prosperous and mobile, old and new suburbanites gravitated toward the individualistic vision of free enterprise espoused by the Republican Party. Some, especially those most vocally anti-communist, joined groups such as the Young Americans for Freedom and the John Birch Society. Less radical suburban voters, however, gravitated toward the more moderate brand of conservatism promoted by Richard Nixon.
Once installed in the White House, Richard Nixon focused his energies on shaping American foreign policy. He publicly announced the “Nixon Doctrine” in 1969. While asserting the supremacy of American democratic capitalism, and conceding that the U.S. would continue supporting its allies financially, he denounced previous administrations’ willingness to commit American forces to third world conflicts and warned other states to assume responsibility for their own defense. He was turning America away from the policy of active, anti-communist containment, and toward a new strategy of “détente.”
Promoted by national security advisor and eventual Secretary of State Henry Kissinger, détente sought to stabilize the international system by “thawing” relations with Cold War rivals and bilaterally freezing arms levels. Taking advantage of tensions between the People’s Republic of China (PRC) and the Soviet Union, Nixon pursued closer relations with both in order to de-escalate tensions and strengthen the United States’ position relative to each. The strategy seemed to work. In 1972, Nixon became the first American president to visit communist China and the first since Franklin Roosevelt to visit the Soviet Union. Direct diplomacy and cultural exchange programs with both countries grew and culminated in the formal normalization of U.S.-Chinese relations and the signing of two U.S.-Soviet arms agreements: the Anti-Ballistic Missile (ABM) Treaty and the Strategic Arms Limitation Treaty (SALT I). By 1973, after almost thirty years of Cold War tension, peaceful coexistence suddenly seemed possible. Short-term gains, however, failed to translate into long-term stability. By the decade’s end, the fragile calm gave way once again to Cold War instability.
A brewing energy crisis interrupted Nixon’s presidency. In November 1973, Nixon appeared on television to inform Americans that energy had become “a serious national problem” and that the United States was “heading toward the most acute shortages of energy since World War II.” The previous month, Arab members of the Organization of Petroleum Exporting Countries (OPEC), a cartel of the world’s leading oil producers, had embargoed oil exports to the United States in retaliation for American support of Israel during the Yom Kippur War. The embargo caused an “oil shock” and launched the first energy crisis. By the end of 1973, the global price of oil had quadrupled. Drivers waited in line for hours to fill up their cars. Individual gas stations ran out of gas. American motorists worried that oil could run out at any moment. A Pennsylvania man died when his emergency stash of gasoline ignited in his trunk. OPEC rescinded its embargo in 1974, but the economic damage had been done, and the energy crisis extended into the late 1970s.
Like the Vietnam War, the oil crisis showed that countries perceived as small and weak could still hurt the United States. At a time of anxiety about the nation’s future, Vietnam and the energy crisis accelerated Americans’ disenchantment with the United States’ role in the world and the efficacy and quality of its leaders. Furthermore, scandals in the 1970s and early ’80s sapped trust in America’s public institutions. Watergate catalyzed the disenchantment of the Unraveling.
On June 17, 1972, five men were arrested inside the offices of the Democratic National Committee (DNC) in the Watergate Complex in downtown Washington, D.C. After being tipped by a security guard, police found the men attempting to install sophisticated bugging equipment. One of those arrested was a former CIA employee then working as a security aide for the Nixon administration’s Committee to Reelect the President (lampooned as “CREEP”).
While there is no direct evidence that Richard Nixon ordered the Watergate break-in, Nixon had been recorded in conversation with his Chief of Staff requesting that the DNC chairman be illegally wiretapped to obtain the names of the committee’s financial supporters, which could then be given to the Justice Department and the IRS to conduct spurious investigations into their personal affairs. (Nixon was also recorded ordering his Chief of Staff to break into the offices of the Brookings Institution and take files relating to the war in Vietnam, saying, “Goddamnit, get in and get those files. Blow the safe and get it.”) Whether or not the president ordered the Watergate break-in, the White House launched a massive cover-up. Administration officials ordered the CIA to halt the FBI investigation and paid hush money to the burglars and White House aides. Nixon distanced himself from the incident publicly and went on to win a landslide election victory in November 1972. But, thanks largely to two persistent journalists at the Washington Post, Bob Woodward and Carl Bernstein, information continued to surface that tied the burglaries ever closer to the CIA, the FBI, and the White House. The Senate held televised hearings. Nixon fired his Chief of Staff and appointed a special prosecutor to investigate the burglary, and then, when the investigation progressed, ordered the Attorney General to fire that same prosecutor. Citing “executive privilege,” Nixon refused to comply with orders to produce tapes from the White House’s secret recording system. In July 1974, the House Judiciary Committee approved articles of impeachment against the president. Nixon resigned before the full House could vote on impeachment. He became the first and only American president to resign his office.
Vice President Gerald Ford was sworn in as his successor and a month later granted Nixon a full presidential pardon. Nixon disappeared from public life without ever publicly apologizing, accepting responsibility, or facing charges stemming from the scandal.
Watergate weighed on voters’ minds. Nixon’s disgrace netted big congressional gains for the Democrats in the 1974 mid-term elections. President Ford, the presumptive Republican nominee in 1976, damaged his popularity by pardoning Nixon. Voters seemed to want a Washington outsider untainted by the Beltway politics of the previous decade.
A wide field of Democratic presidential hopefuls reflected the diversity and disunity of the party. According to late January Gallup polls, segregationist Alabama governor George Wallace and moderate former Vice President Hubert Humphrey led with eighteen and seventeen percent respectively. A distant third at five percent was conservative Washington Senator Henry “Scoop” Jackson. In fourth place, with four percent, was former Georgia governor Jimmy Carter, a peanut farmer and former naval officer trained in nuclear engineering who represented the rising generation of younger, racially liberal “New South” Democrats.
After the chaos of the 1968 Chicago convention, the Democrats reformed party rules to bring in women, African Americans, young people, and Spanish speakers. One way the committee sought to improve popular participation (as well as stifle backstage maneuvering and public bickering) was to increase the weight of caucuses and primaries in the presidential nomination process, reducing the machinations of party officials at the convention. Jimmy Carter and his energetic staff of Georgians understood the importance of these primaries and spent two years traveling the country, getting to know local Democrats and winning grassroots support.
Unlike his Democratic opponents—and unlike President Ford—Carter was a Washington outsider. He was identified with neither his party’s liberal nor its conservative wing. Indeed, his appeal was more personal and moral than political. He ran on no great political issues. Instead, crafting an optimistic campaign centered on the slogan “Why not the best?,” he let his background as a hardworking, honest, Southern Baptist navy man ingratiate him with voters around the country, especially in his native South, where support for Democrats had wavered in the wake of the civil rights movement. Carter’s wholesome image stood in direct contrast to the memory of Nixon and, by association, to Gerald Ford, the man who had pardoned him. Carter sealed his party’s nomination in June and won a close victory in November.
When Carter took the oath of office on January 20, 1977, he became president of a nation in the midst of economic turmoil. Oil shocks, inflation, stagnant growth, unemployment, and sinking wages weighed down the nation’s economy. The age of affluence was over, and the unraveling had begun, bringing to a head deeply rooted problems that had lain dormant during the long postwar prosperity.
At the end of the Second World War, American leaders erected a complex system of trade policies to help rebuild the shattered economies of Western Europe and Asia. In the context of the Cold War, American diplomats and politicians used trade relationships to win influence and allies around the globe, and they saw the economic health of their allies, particularly West Germany and Japan, as a crucial bulwark against the expansion of communism. Americans encouraged these nations to develop vibrant export-oriented economies and tolerated restrictions on U.S. imports. This came at great cost to the United States. As the American economy stalled, Japan and West Germany soared and became major forces in the global production of autos, steel, machine tools, and electrical products. By 1970, the United States began to run massive trade deficits. The value of American exports dropped and the prices of its imports skyrocketed. Coupled with the huge cost of the Vietnam War and the rise of oil-producing states in the Middle East, growing trade deficits sapped the United States’ dominant position in the global economy.
American leaders didn’t know how to respond. After a series of negotiations with leaders from France, Great Britain, West Germany, and Japan in 1970 and 1971, the Nixon administration allowed these rising industrial nations to continue flouting the principles of free trade by maintaining trade barriers that sheltered their domestic markets from foreign competition while at the same time exporting growing amounts of goods to the United States, which no longer maintained so comprehensive a tariff system. By 1974, in response to U. S. complaints and their own domestic economic problems, many of these industrial nations overhauled their protectionist practices but developed even subtler methods, such as state subsidies for key industries, to nurture their economies.
Carter, like Ford before him, presided over a hitherto unimagined economic dilemma: the simultaneous onset of inflation and economic stagnation, a combination popularized as “stagflation.” Neither Carter nor Ford had the means or the ambition to protect American jobs and goods from foreign competition. As firms and financial institutions invested, sold goods, and manufactured in new rising economies, such as Mexico, Taiwan, Japan, Brazil, and elsewhere, American politicians allowed them to sell their often less costly products in the United States.
As American officials institutionalized this new unfettered global trade, many struggling American manufacturers perceived only one viable path to sustained profitability: moving overseas, often by establishing foreign subsidiaries or partnering with foreign firms. Investment capital, especially in manufacturing, fled the U.S. in search of overseas investments and hastened the decline in the productivity of American industry while rising export-oriented industrial nations flooded the world market with their cheaply produced goods. Global competition swiftly undermined the dominance enjoyed by American firms. By the end of the decade, the United States ran perennial trade deficits, its industry had weakened, and many Americans suffered eroded job security and stagnating incomes.
As Carter failed to slow the unraveling of the American economy, he also struggled to shift American foreign policy away from blind anti-communism toward an agenda based on human rights. Carter was a one-term Georgia governor with little foreign policy experience, and few knew what to expect from his presidency. Carter did not make human rights a central theme of his campaign. Only in May 1977 did the new president offer a definitive statement when, speaking before the graduating class at the University of Notre Dame, he declared his wish to move away from a foreign policy in which “an inordinate fear of communism” caused American leaders to “adopt the flawed and erroneous principles and tactics of our adversaries.” (Cold War foreign policy, he said, had resulted in the “profound moral crisis” of the Vietnam War.) Carter proposed instead “a policy based on constant decency in its values and on optimism in our historical vision.” Carter’s focus on human rights, mutual understanding, and peaceful solutions to international crises resulted in some successes. Under Carter, the U.S. either reduced aid to or ceased aiding altogether the American-supported right-wing dictators guilty of extreme human rights abuses in places such as South Korea, Argentina, and the Philippines. And despite intense domestic opposition, in September 1977, partly under the belief that such a treaty would signal a renewed American commitment to fairness and respect for all nations, Carter negotiated the return of the Panama Canal to Panamanian control.
Arguably Carter’s greatest foreign policy achievement was the Camp David Accords. In September 1978, Carter brokered a peace agreement between Israeli Prime Minister Menachem Begin and Egyptian President Anwar Sadat. After thirteen days of secret negotiations hosted by Carter at the presidential retreat at Camp David, in rural Maryland, two agreements were reached. The first established guidelines for Palestinian autonomy and a set of principles that would govern Israel’s relations with its Arab neighbors. The second provided the basis for Egyptian-Israeli peace by returning the Sinai Peninsula to Egyptian control and opening the Suez Canal to Israeli ships. The Accords, however, had significant limits. Though Sadat and Begin won a Nobel Peace Prize for their efforts, the Accords were as significant for what they left unresolved as for what they achieved. Though they represented the first time since the establishment of Israel that Palestinians were promised self-government, and the first time an Arab state fully recognized Israel as a nation, most of the Arab world rejected the Accords. The agreement ensured only limited individual rights for Palestinians and precluded territorial control or the possibility of statehood. Indeed, Palestine Liberation Organization (PLO) chairman Yasser Arafat later described the Accords’ version of Palestinian autonomy as “no more than managing the sewers.”
Carter, however, could not balance his insistence on human rights with the realities of the Cold War. While his administration reduced aid to some authoritarian states, the U.S. continued to provide military and financial support to allies it considered truly vital to American interests—most notably, the oil-rich nation of Iran. When the president and First Lady Rosalynn Carter visited Tehran in January 1978, the president praised the nation’s dictatorial ruler, Shah Mohammad Reza Pahlavi, and remarked on the “respect and the admiration and love” of Iranians for their leader. A year later, the Iranian Revolution deposed the Shah. In November 1979, revolutionary Iranians, irate over America’s interventions in Iranian affairs and its long support of the Shah, stormed the U.S. embassy in Tehran and took fifty-two Americans hostage. At the same time, Americans again felt the energy pinch when revolutionaries shut down Iranian oil fields, spiking the price of oil for the second time in a decade. Americans not only felt the nation’s weakness at the gas pump, they watched it every night on national television: for many Americans, the hostage crisis that stretched across the next 444 days became a source of both jingoistic unity and a constant reminder of the country’s new global impotence. The nation that had defeated the Nazis and the Empire of Japan in the Second World War found itself, thirty years later, humiliated both by half of an obscure Southeast Asian country and by a relatively small and unstable Middle Eastern nation.
With his popularity plummeting, Carter ordered a secret rescue mission, Operation Eagle Claw, in April 1980, and it ended in disaster. A U.S. helicopter crashed in the middle of the desert, killing eight servicemen and leading Carter to take responsibility for the losses and the continued inability to free the American hostages.
Moreover, Carter’s efforts to ease the Cold War by achieving a new nuclear arms control agreement (SALT II) disintegrated under domestic opposition led by conservative hawks such as Ronald Reagan. They accused Carter of weakness, and cited Soviet support for African leftist revolutionaries as evidence of Soviet duplicity. And then the Soviets invaded Afghanistan in December 1979, returning the Cold War to the forefront of U.S. foreign policy. A month later, a beleaguered Carter committed the United States to defending its “interests” in the Middle East against Soviet incursions, declaring that “an assault [would] be repelled by any means necessary, including military force.” Known as the “Carter Doctrine,” the President’s declaration signaled the administration’s ambivalent commitment to human rights and a renewed reliance on military force in its anti-communist foreign policy. The seeds of Ronald Reagan’s more aggressive foreign policy had been sown.
Though American politics moved right after Lyndon Johnson declined to seek reelection, Nixon’s 1968 election had marked no conservative counterrevolution. American politics and society remained in flux throughout the 1970s. American politicians on the right and the left pursued relatively moderate courses compared to those in the preceding and succeeding decades. But a groundswell of anxieties and angers brewed beneath the surface. The world’s greatest military power had floundered in Vietnam and an American president stood flustered by Middle Eastern revolutionaries. The cultural clashes of the ’60s persisted and accelerated. While cities burned, a more liberal sexuality permeated American culture. The economy crashed, leaving America’s cities prone before poverty and crime and its working class gutted by deindustrialization and globalization. American weakness was everywhere. And so, by 1980, many Americans—especially white middle- and upper-class Americans—felt a nostalgic desire for simpler times and simpler answers to the frustratingly complex geopolitical, social, and economic problems crippling the nation. The appeal of Carter’s soft drawl and Christian humility had signaled this yearning, but his utter failure to stop the unraveling opened the way for a new movement, with new personalities and a new conservatism, which promised to undo the damage and restore the United States to its nostalgic image of itself.
This chapter was edited by Edwin Breeden, with content contributions by Seth Anziska, Jeremiah Bauer, Edwin Breeden, Kyle Burke, Alexandra Evans, Sean Fear, Anne Grey Fischer, Destin Jenkins, Matthew Kahn, Suzanne Kahn, Brooke Lamperd, Katherine McGarr, Matthew Pressman, Adam Parsons, Emily Prifogle, John Rosenberg, Brandy Thomas Wells, and Naomi R. Williams.