
30. The Recent Past

New York City, before September 11, 2001, via Library of Congress.



I. Introduction

Revolutionary technological change, unprecedented global flows of goods and people and capital, an amorphous and unending “War on Terror,” accelerating inequality, growing diversity, a changing climate, political stalemate: our world is remarkable, frustrating, and dynamic. But it is not an island of circumstance–it is a product of history. Time marches forever on. The present becomes the past and the past becomes history. But, as William Faulkner wrote, “The past is never dead. It’s not even past.” ((William Faulkner, Requiem for a Nun (New York: Random House, 1954), 73.)) The last several decades of American history have culminated in the present, an era of innovation and advancement but also of stark partisan division, sluggish economic growth, widening inequalities, widespread military interventions, and pervasive anxieties about the present and future of the United States. Through boom and bust, national tragedy, foreign wars, and the maturation of a new generation, a new chapter of American history is busily being written.

 

II. American Politics from George H.W. Bush to September 11, 2001

The conservative “Reagan Revolution” lingered over the presidential election of 1988. At stake was the legacy of a newly empowered conservative movement, a movement that would move forward with Reagan’s vice president, George H. W. Bush, who triumphed over Massachusetts Governor Michael Dukakis with a promise to continue the conservative work that had commenced in the 1980s.

George H. W. Bush, whose father, Prescott Bush, was a United States Senator from Connecticut, was a World War II veteran, the president of a successful oil company, a congressman from Texas, chair of the Republican National Committee, and Director of the Central Intelligence Agency. After failing to best Reagan in the 1980 Republican primaries, he was elected as Reagan’s vice president in 1980 and again in 1984. In 1988, Michael Dukakis, a proud liberal from Massachusetts, challenged Bush for the White House.

Dukakis ran a weak campaign, but Bush, a Connecticut aristocrat who had never been fully embraced by “movement conservatism,” hammered him with moral and cultural issues. Bush said Dukakis had blocked recitation of the Pledge of Allegiance in Massachusetts schools and that he was a “card-carrying member” of the American Civil Liberties Union. Bush meanwhile dispatched his eldest son, George W. Bush, as his ambassador to the religious right. ((Bill Minutaglio, First Son: George W. Bush and the Bush Family Dynasty (New York: Random House, 1999), 210-224.)) Bush also infamously released a political ad featuring the face of Willie Horton, a black Massachusetts man and convicted murderer who raped a woman after taking advantage of a Massachusetts prison furlough program during Dukakis’ tenure. “By the time we’re finished,” Bush’s campaign manager, Lee Atwater, said, “they’re going to wonder whether Willie Horton is Dukakis’ running mate.” ((Roger Simon, “How A Murderer And Rapist Became The Bush Campaign’s Most Valuable Player,” The Baltimore Sun (November 11, 1990).)) Liberals attacked conservatives for perpetuating the ugly “code word” politics of the old Southern Strategy. ((See especially Dan T. Carter, From George Wallace to Newt Gingrich: Race in the Conservative Counterrevolution, 1963-1994 (Baton Rouge: Louisiana State University Press, 1996), 72-80.)) Buoyed by such attacks, Bush won a large victory and entered the White House.

Bush’s election signaled Americans’ continued embrace of Reagan’s conservative program and further evidenced the utter disarray of the Democratic Party. American liberalism, so stunningly triumphant in the 1960s, was now in full retreat. It was still, as one historian put it, in the “Age of Reagan.” ((Sean Wilentz, The Age of Reagan: A History, 1974–2008 (New York: Harper, 2008).))

The Soviet Union collapsed during Bush’s tenure. Devastated by a stagnant economy, mired in a costly and disastrous war in Afghanistan, confronted with dissident factions in Eastern Europe, and rocked by internal dissent, the Soviet Union crumbled. Soviet leader and reformer Mikhail Gorbachev loosened the Soviet Union’s tight restraints on speech and censorship (“glasnost”) and liberalized the Soviet political machinery (“perestroika”). Eastern Bloc nations turned against their communist organizations and declared their independence from the Soviet Union. Gorbachev let them go. The Soviet Union unraveled. On December 25, 1991, Gorbachev resigned his office, declaring that the Soviet Union no longer existed. At the Kremlin, the hammer and sickle was lowered. The Russian tricolor was raised. ((James F. Clarity, “End of the Soviet Union,” New York Times (December 26, 1991).))

The dissolution of the Soviet Union left the United States as the world’s only remaining superpower. Global capitalism seemed triumphant. Observers wondered if some final stage of history had been reached, if the old battles had ended and a new global consensus built around peace and open markets would reign forever. “What we may be witnessing is not just the end of the Cold War, or the passing of a particular period of post-war history, but the end of history as such,” wrote Francis Fukuyama in his much-talked-about 1989 essay, “The End of History?” ((Francis Fukuyama, “The End of History?”, The National Interest (Summer 1989).)) Assets in Eastern Europe were privatized and auctioned off as newly independent nations introduced market economies. New markets were rising in Southeast Asia and Eastern Europe. India, for instance, began liberalizing its economic laws and opening itself up to international investment in 1991. China’s economic reforms, advanced by Chairman Deng Xiaoping and his handpicked successors, accelerated as privatization and foreign investment proceeded.

The post-Cold War world was not without international conflicts, however. Congress granted President Bush approval to intervene in Kuwait in 1990. The United States laid the groundwork–Operation Desert Shield–in August and commenced combat operations–Operation Desert Storm–in January 1991. With the memories of Vietnam still fresh, many Americans were hesitant to support military action that could expand into a protracted war or long-term commitment of troops. But the Gulf War was a swift victory for the United States. New technologies–including laser-guided precision bombing–amazed Americans, who could now watch 24-hour live coverage of the war on the Cable News Network (CNN). The Iraqi army disintegrated after only a hundred hours of ground combat. President Bush and his advisers opted not to pursue the war into Baghdad and risk an occupation and insurgency. And so the war was won. Many wondered if the “ghosts of Vietnam” had been exorcised. ((William Thomas Allison, The Gulf War, 1990-91 (New York: Palgrave Macmillan, 2012), 145, 165.)) Bush won enormous popularity. Gallup polls showed a job approval rating as high as 89% in the weeks after the end of the war. ((Charles W. Dunn, The Presidency in the Twenty-first Century (Lexington: University Press of Kentucky, 2011), 152.))

The Iraqi military set fire to Kuwait’s oil fields during the Gulf War, many of which burned for months and caused massive pollution. Photograph of oil well fires outside Kuwait City, March 21, 1991. Wikimedia, http://commons.wikimedia.org/wiki/File:Operation_Desert_Storm_22.jpg.


President Bush’s popularity seemed to suggest an easy reelection in 1992, but Bush had still not won over the “New Right,” the aggressively conservative cultural and economic wing of the Republican Party, despite his attacks on Dukakis, his embrace of the flag and the Pledge, and his promise, “Read my lips: no new taxes.” He faced a primary challenge from political commentator Patrick Buchanan, a former Reagan and Nixon White House adviser, who cast Bush as a moderate, an unworthy steward of the conservative movement who was unwilling to fight for conservative Americans in the nation’s ongoing “culture war.” Buchanan did not defeat Bush in the Republican primaries, but he inflicted enough damage to weaken his candidacy. ((Robert M. Collins, Transforming America: Politics and Culture During the Reagan Years (New York: Columbia University Press, 2009), 171-172.))

Still thinking that Bush would be unbeatable in 1992, many prominent Democrats passed on a chance to run, and the Democratic Party nominated a relative unknown, Arkansas Governor Bill Clinton. Dogged by charges of marital infidelity and draft-dodging during the Vietnam War, Clinton was a consummate politician with enormous charisma and a skilled political team. He framed himself as a “New Democrat,” a centrist open to free trade, tax cuts, and welfare reform. Twenty-two years younger than Bush, he was the first Baby Boomer to make a serious run at the presidency. Clinton presented the campaign as a generational choice. During the campaign he appeared on MTV. He played the saxophone on The Arsenio Hall Show. And he told voters that he could offer the United States a new way forward.

Bush ran on his experience and against Clinton’s moral failings. The GOP convention in Houston that summer featured speeches from Pat Buchanan and religious leader Pat Robertson decrying the moral decay plaguing American life. Clinton was denounced as a social liberal who would weaken the American family with his policies and his moral character. But Clinton was able to convince voters that his moderate Southern brand of “new” liberalism would be more effective than the moderate conservatism of George Bush. Bush’s candidacy, of course, was most crippled by a sudden economic recession. “It’s the economy, stupid,” Clinton’s political team reminded the country.

Clinton would win the election, but the Reagan Revolution still reigned. Clinton and his running mate, Tennessee Senator Albert Gore, Jr., both moderate southerners, promised a path away from the old liberalism (and the landslide electoral defeats) of the 1970s and 1980s. They were Democrats, but conservative Democrats, so-called “New Democrats.” In his first term, Clinton set out an ambitious agenda that included an economic stimulus package, universal health insurance, a continuation of the Middle East peace talks initiated by Bush’s Secretary of State James Baker, welfare reform, and a completion of the North American Free Trade Agreement (NAFTA) to abolish trade barriers between the U.S., Mexico, and Canada. His moves to reform welfare, open trade, and deregulate financial markets were particular hallmarks of Clinton’s “Third Way,” a political middle path that synthesized liberal and conservative ideas. ((For Clinton’s presidency and the broader politics of the 1990s, see James T. Patterson, Restless Giant: The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005); and Sean Wilentz, The Age of Reagan: A History, 1974–2008 (New York: Harper, 2008).))

With NAFTA, Clinton reversed decades of Democratic opposition to free trade and opened the nation’s northern and southern borders to the free flow of capital and goods. Critics, particularly in the Midwest’s Rust Belt, blasted the agreement for opening American workers to deleterious competition from low-paid foreign workers. Many American factories did relocate, setting up shops–maquilas–in northern Mexico that took advantage of Mexico’s low wages. Thousands of Mexicans rushed to the maquilas. Thousands more continued on past the border.

If NAFTA opened American borders to goods and services, people still navigated strict legal barriers to immigration. Policymakers believed that free trade would create jobs and wealth that would incentivize Mexican workers to stay home, and yet multitudes continued to leave for opportunities in el norte. The 1990s proved that prohibiting illegal migration was, if not impossible, exceedingly difficult. Poverty, political corruption, violence, and hopes for a better life in the United States–or simply higher wages–continued to lure immigrants across the border. Between 1990 and 2010, the proportion of foreign-born individuals in the United States grew from 7.9 percent to 12.9 percent, and the number of undocumented immigrants tripled from 3.5 million to 11.2 million. While large numbers continued to migrate to traditional immigrant destinations—California, Texas, New York, Florida, New Jersey, and Illinois—the 1990s also witnessed unprecedented migration to the American South. Among the fastest-growing immigrant destination states were Kentucky, Tennessee, Arkansas, Georgia, and North Carolina, all of which had immigration growth rates in excess of 100% during the decade. ((Patterson, 298-299.))

In response to the continued influx of immigrants and the vocal complaints of anti-immigration activists, policymakers launched initiatives such as Operations Gatekeeper and Hold the Line, which attempted to make crossing the border more difficult. Officials strengthened physical barriers and beefed up Border Patrol presence in border cities and towns, adopting a new strategy of “funneling” immigrants toward dangerous and remote crossing areas. Immigration officials hoped the brutal natural landscape would serve as a deterrent.

In his first weeks in office, Clinton reviewed Department of Defense policies restricting homosexuals from serving in the armed forces. He pushed through a compromise plan, “Don’t Ask, Don’t Tell,” that removed questions about sexual preference from induction interviews but also required that gay servicemen and women keep their sexual preference private. The policy alienated many and cost Clinton political capital: social conservatives were outraged, and his credentials as a conservative southerner suffered; liberals recoiled at continued anti-gay discrimination, and his credentials as a liberal suffered.

In his first term Clinton put forward universal health care as a major policy goal, and First Lady Hillary Rodham Clinton played a major role in the initiative. But the push for a national healthcare law collapsed on itself. Conservatives revolted, the health care industry flooded the airwaves with attack ads, Clinton struggled with Congressional Democrats, and voters bristled. A national healthcare system was again defeated.

The mid-term elections of 1994 were a disaster for the Democrats, who lost the House of Representatives for the first time since 1952. Congressional Republicans, led by Georgia Congressman Newt Gingrich and Texas Congressman Dick Armey, offered a new “Contract with America.” Republican candidates from around the nation gathered on the steps of the Capitol to pledge their commitment to a conservative legislative blueprint to be enacted if the GOP won control of the House. The strategy worked.

Social conservatives were mobilized by an energized group of religious activists, especially the Christian Coalition, led by Pat Robertson and Ralph Reed. Robertson was a television minister and entrepreneur whose 1988 long shot run for the Republican presidential nomination brought him a massive mailing list and network of religiously motivated voters around the country. From that mailing list, the Christian Coalition organized around the country, seeking to influence politics on the local and national level.

In 1996 the generational contest played out again when the Republicans nominated another aging war hero, Senator Bob Dole of Kansas, but Clinton again won the election, becoming the first Democrat to serve back-to-back terms since Franklin Roosevelt. He was aided in part by having placated conservatives with his signing of welfare reform legislation, “The Personal Responsibility and Work Opportunity Reconciliation Act of 1996,” which decreased welfare benefits, restricted eligibility, and turned over many responsibilities to states. Clinton said it would “break the cycle of dependency.” ((Carolyn Skorneck, “Final Welfare Bill Written,” Washington Post (July 30, 1996), A1.))

Clinton presided over a booming economy fueled by emergent computing technologies. Sales of personal computers skyrocketed and the internet became a mass phenomenon. Communication and commerce were never again the same. The decade saw robust innovation and entrepreneurship, and investors scrambled to find the next Microsoft or Apple, the suddenly massive computing companies. But it was the internet, “the world wide web,” that sparked a bonanza: the “dot-com boom” fueled enormous economic growth and substantial financial speculation on the next Google or Amazon.

Republicans, defeated at the polls in 1996 and 1998, looked for other ways to sink Clinton’s presidency. Political polarization seemed unprecedented and a sensation-starved, post-Watergate media demanded scandal. The Republican Congress spent millions on investigations hoping to uncover some shred of damning evidence, whether in real estate deals, White House staffing, or adultery. Rumors of sexual misconduct had always swirled around Clinton. The press, which had historically turned a blind eye to such private matters, now saturated coverage with Clinton’s potential sex scandals. Congressional investigations targeted the allegations, and Clinton, called to testify before a grand jury and in a statement to the American public, denied having “sexual relations” with Monica Lewinsky. Republicans used the testimony to allege perjury. In December 1998, the House of Representatives voted to impeach the president. It was a radical and wildly unpopular step. Two-thirds of Americans disapproved and a majority told Gallup pollsters that Republicans had abused their constitutional authority. Clinton’s approval rating, meanwhile, jumped to 78%. ((Frank Newport, “Clinton Receives Record High Job Approval Rating,” Gallup (December 24, 1998).)) In February 1999, on a vote that mostly fell upon party lines, Clinton was acquitted by the Senate.

The 2000 election pitted Vice President Albert Gore, Jr. against George W. Bush, the son of the former president and a twice-elected governor of Texas. Gore, wary of Clinton’s recent impeachment despite Clinton’s enduring approval ratings, distanced himself from the president and eight years of relative prosperity and ran as a pragmatic, moderate liberal. Bush, too, ran as a moderate, distancing himself from the cruelties of past Republican candidates by claiming to represent a “compassionate conservatism” and a new faith-based politics. Bush was an outspoken evangelical. In a presidential debate, he declared Jesus Christ his favorite political philosopher. He promised to bring church leaders into government and his campaign appealed to churches and clergy to get out the vote. Moreover, he promised to bring honor, dignity, and integrity to the Oval Office, a clear reference to Clinton. Utterly lacking the political charisma that had propelled Clinton, Gore withered under Bush’s attacks. Instead of trumpeting the Clinton presidency, Gore found himself answering the media’s questions about whether he was sufficiently an “alpha male” and whether he had “invented the internet.”

Few elections have been as close and contentious as the 2000 election, which ended in a deadlock. Gore had won the popular vote by 500,000 votes, but the Electoral College math seemed to have failed him. On election night the media called Florida for Gore, but as Bush made gains, news organizations backpedaled and declared the state for Bush—and Bush the probable president-elect. Gore conceded privately to Bush, then retracted his concession as the counts edged back toward him yet again. When the nation awoke the next day, it was unclear who had been elected president. The close Florida vote triggered an automatic recount.

Lawyers descended on Florida. The Gore campaign called for manual recounts in several counties. Local election boards, Florida Secretary of State Katherine Harris, and the Florida Supreme Court all weighed in until the United States Supreme Court stepped in and, in an unprecedented 5-4 decision in Bush v. Gore, ruled that the recount had to end. Bush was awarded Florida by a margin of 537 votes, enough to win him the state, a majority in the Electoral College, and the presidency.

In his first months in office, Bush fought to push forward enormous tax cuts skewed toward America’s highest earners and struggled with an economy burdened by the bursting of the dot-com bubble. Old fights seemed ready to be fought, and then everything changed.

 

III. September 11 and the War on Terror

On the morning of September 11, 2001, 19 operatives of the al-Qaeda terrorist organization hijacked four passenger planes on the East Coast. American Airlines Flight 11 crashed into the North Tower of the World Trade Center in New York City at 8:46 a.m. EDT. United Airlines Flight 175 crashed into the South Tower at 9:03. American Airlines Flight 77 crashed into the western façade of the Pentagon at 9:37. At 9:59, the South Tower of the World Trade Center collapsed. At 10:03, United Airlines Flight 93 crashed in a field outside of Shanksville, Pennsylvania, likely brought down by passengers who had received news of the earlier hijackings. And at 10:28, the North Tower collapsed. In less than two hours, nearly 3,000 Americans had been killed.

Six days after the September 11th attacks, the World Trade Center was still crumbling and dozens of men and women were still unaccounted for. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/3b/September_17_2001.jpg.


The attacks shocked Americans. Bush addressed the nation and assured the country that “The search is underway for those who are behind these evil acts.” At Ground Zero three days later, Bush thanked the first responders. A worker said he couldn’t hear him. “I can hear you,” Bush shouted back, “The rest of the world hears you. And the people who knocked these buildings down will hear all of us soon.”

American intelligence agencies quickly identified the radical Islamic militant group al-Qaeda, led by the wealthy Saudi Osama Bin Laden, as the perpetrators of the attack. Sheltered in Afghanistan by the Taliban, the country’s Islamic government, al-Qaeda was responsible for a 1993 bombing of the World Trade Center and a string of attacks at U.S. embassies and military bases across the world. Bin Laden’s Islamic radicalism and his anti-American aggression attracted supporters across the region and, by 2001, al-Qaeda was active in over sixty countries.

Although in his presidential campaign Bush had denounced foreign “nation-building,” he populated his administration with “neo-conservatives,” firm believers in the expansion of American democracy and American interests abroad. Bush advanced what was sometimes called the Bush Doctrine, a policy in which the United States would have the right to unilaterally and preemptively make war upon any regime or terrorist organization that posed a threat to the United States or to United States citizens. It would lead the United States into protracted conflicts in Afghanistan and Iraq and entangle the United States in nations across the world. Journalist Dexter Filkins called it a “Forever War,” a perpetual conflict waged against an amorphous and undefeatable enemy. The geopolitical realities of the twenty-first-century world were forever transformed. ((Dexter Filkins, The Forever War (New York: Vintage, 2009).))

The United States, of course, had a history in Afghanistan. When the Soviet Union invaded Afghanistan in December 1979 to quell an insurrection that threatened to topple Kabul’s communist government, the United States financed and armed anti-Soviet insurgents, the Mujahedeen. In 1981, the Reagan Administration authorized the Central Intelligence Agency (CIA) to provide the Mujahedeen with weapons and training to strengthen the insurgency. An independent wealthy young Saudi, Osama bin Laden, also fought with and funded the Mujahedeen. The insurgents began to win. Afghanistan bled the Soviet Union dry. The costs of the war, coupled with growing instability at home, convinced the Soviets to withdraw from Afghanistan in 1989. ((See, for instance, Lawrence Wright, The Looming Tower: Al Qaeda and the Road to 9/11 (New York: Knopf, 2006).))

Osama bin Laden relocated al-Qaeda to Afghanistan after the country fell to the Taliban in 1996. The United States under Bill Clinton had launched cruise missiles into Afghanistan at al-Qaeda camps in retaliation for al-Qaeda bombings on American embassies in Africa.

Then, after September 11, with a broad authorization of military force, Bush administration officials made plans for military action against al-Qaeda and the Taliban. What would become the longest war in American history began with the launching of Operation Enduring Freedom in October 2001. Air and missile strikes hit targets across Afghanistan. U.S. Special Forces joined with fighters in the anti-Taliban Northern Alliance. Major Afghan cities fell in quick succession. The capital, Kabul, fell on November 13. Bin Laden and Al-Qaeda operatives retreated into the rugged mountains along the border of Pakistan in eastern Afghanistan. The United States military settled in.

As American troops struggled to contain the Taliban in Afghanistan, the Bush administration set its sights on Iraq. After the conclusion of the Gulf War in 1991, American officials established economic sanctions, weapons inspections, and “no-fly zones.” By mid-1991, American warplanes were routinely patrolling Iraqi skies and coming under periodic fire from Iraqi missile batteries. The overall cost to the United States of maintaining the two no-fly zones over Iraq was roughly $1 billion a year. Related military activities in the region added almost another half billion to the annual bill. On the ground in Iraq, meanwhile, Iraqi authorities clashed with U.N. weapons inspectors. Iraq had suspended its program for weapons of mass destruction, but Saddam Hussein fostered ambiguity about the weapons in the minds of regional leaders to forestall any possible attacks against Iraq.

In 1998, a standoff between Hussein and the United Nations over weapons inspections led President Bill Clinton to launch punitive strikes aimed at debilitating what was thought to be a fairly developed chemical weapons program. Attacks began on December 16, 1998. More than 200 cruise missiles, fired from U.S. Navy warships and Air Force B-52 bombers, struck suspected chemical weapons storage facilities, missile batteries, and command centers. Airstrikes continued for three more days, unleashing in total 415 cruise missiles and 600 bombs against 97 targets. The number of bombs dropped was nearly double the number used in the 1991 conflict.

The United States and Iraq remained at odds throughout the 1990s and into the early 2000s, when Bush administration officials began considering “regime change.” The Bush Administration began publicly denouncing Saddam Hussein’s regime and its alleged weapons of mass destruction, and it began pushing for war in the fall of 2002. The Administration alleged that Hussein was trying to acquire uranium and had obtained aluminum tubes used for nuclear centrifuges. Public opinion was divided. George W. Bush said in October, “Facing clear evidence of peril, we cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud.” ((Thomas R. Mockaitis, The Iraq War: A Documentary and Reference Guide (Santa Barbara: ABC-Clio, 2012), 26.)) The Administration’s push for war was in full swing. Protests broke out across the country and all over the world, but majorities of Americans supported military action. On October 16, the United States Congress passed the Authorization for Use of Military Force Against Iraq Resolution, giving Bush the power to make war in Iraq. Iraq began cooperating with U.N. weapons inspectors in late 2002, but the Bush administration pressed on. On February 5, 2003, Secretary of State Colin Powell, who had risen to public prominence as an Army general during the Persian Gulf War in 1991, presented allegations of a robust Iraqi weapons program to the United Nations. Protests continued.

The first American bombs hit Baghdad on March 20, 2003. Several hundred-thousand troops moved into Iraq and Hussein’s regime quickly collapsed. Baghdad fell on April 9. On May 1, 2003, aboard the USS Abraham Lincoln, beneath a banner reading “Mission Accomplished,” George W. Bush announced that “Major combat operations in Iraq have ended.” ((Judy Keen, “Bush to Troops: Mission Accomplished,” USA Today (June 5, 2003).)) No evidence of weapons of mass destruction had been found or would be found. And combat operations had not ended, not really. The Iraqi insurgency had begun, and the United States would spend the next ten years struggling to contain it.

Despite the celebration of President Bush, combat operations in Iraq would continue for years more. In some ways, it has not ended. Although combat troops were withdrawn from Iraq by December 2011, President Obama announced the use of airstrikes against Iraqi militants in August 2014. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/5/50/USS_Abraham_Lincoln_%28CVN-72%29_Mission_Accomplished.jpg.


Efforts by various intelligence gathering agencies led to the capture of Saddam Hussein, hidden in an underground compartment near his hometown, on December 13, 2003. The new Iraqi government found him guilty of crimes against humanity and he was hanged on December 30, 2006.

 

IV. The End of the Bush Years

The War on Terror was a centerpiece in the race for the White House in 2004. The Democratic ticket was headed by Massachusetts Senator John F. Kerry, a Vietnam War hero who had entered the public consciousness with his subsequent testimony against that war. Kerry attacked Bush for the ongoing inability to contain the Iraqi insurgency or to find weapons of mass destruction, for the revelation, backed by photographic evidence, that American soldiers had abused prisoners at the Abu Ghraib prison outside of Baghdad, and for the inability to find Osama bin Laden. Moreover, many who had been captured in Iraq and Afghanistan were “detained” indefinitely at a military prison in Guantanamo Bay, Cuba. “Gitmo” became infamous for its harsh treatment, indefinite detentions, and the torture of prisoners. Bush defended the War on Terror and his allies attacked critics for failing to support the troops. Kerry, moreover, had voted for the war; he had to attack what he himself had authorized. Bush won a close but clear victory.

The second Bush term saw the continued deterioration of the wars in Iraq and Afghanistan, but Bush’s presidency would take a bigger hit from his perceived failure to respond to the domestic tragedy that followed Hurricane Katrina’s devastating strike on the Gulf Coast. Katrina had been a category 5 hurricane. It was, the New Orleans Times-Picayune read, “the storm we always feared.” ((Bruce Nolan, “Katrina: The Storm We’ve Always Feared,” New Orleans Times-Picayune (August 30, 2005).))

New Orleans suffered a direct hit, the levees broke, and the bulk of the city flooded. Thousands of refugees flocked to the Superdome, where supplies, medical treatment, and evacuation were slow to come. Individuals died in the heat. Bodies wasted away. Americans saw poor black Americans abandoned. Katrina became a symbol of a broken administrative system, a devastated coastline, and irreparable social structures that allowed escape and recovery for some and not for others. Critics charged that Bush had staffed his administration with incompetent supporters and had ignored the displaced poor and black residents of New Orleans. ((Douglas Brinkley, The Great Deluge: Hurricane Katrina, New Orleans, and the Mississippi Gulf Coast (New York: Harper Collins, 2006).))

Hurricane Katrina was one of the deadliest and most destructive hurricanes in U.S. history. It nearly destroyed New Orleans, Louisiana, as well as cities, towns, and rural areas across the Gulf Coast. It sent hundreds of thousands of refugees to nearby cities like Houston, Texas, where they temporarily resided in massive structures like the Astrodome. Photograph, September 1, 2005. Wikimedia, http://commons.wikimedia.org/wiki/File:Katrina-14461.jpg.

Immigration had become an increasingly potent political issue. The Clinton administration had overseen the implementation of several anti-immigration policies on the border, but hunger and poverty were stronger incentives than border enforcement policies. Illegal immigration continued, often at great human cost, and fanned widespread anti-immigration sentiment among many American conservatives. Many immigrants and their supporters, however, fought back. In 2006, waves of massive protests swept across the country: hundreds of thousands marched in Chicago, New York, and Los Angeles, and tens of thousands marched in smaller cities. Legal change, however, went nowhere. Moderate conservatives feared upsetting business interests’ demand for cheap, exploitable labor and alienating large voting blocs by stifling immigration, while moderate liberals feared upsetting anti-immigrant groups by pushing too hard for the liberalization of immigration laws.

Afghanistan and Iraq, meanwhile, continued to deteriorate. In 2006, the Taliban reemerged as the Afghan government proved both highly corrupt and highly incapable of providing social services or security for its citizens. Iraq only descended further into chaos.

In 2007, 27,000 additional United States troops deployed to Iraq under the command of General David Petraeus. The effort, known as “the surge,” employed more sophisticated counterinsurgency strategies and, combined with Sunni moves against the disorder, pacified many of Iraq’s cities and provided cover for the withdrawal of American forces. On December 4, 2008, the Iraqi government approved the U.S.-Iraq Status of Forces Agreement, and American combat forces withdrew from Iraqi cities by June 30, 2009. The last U.S. combat forces left Iraq on December 18, 2011. Violence and instability continued to rock the country.

Opened in 2005, this beautiful new mosque at the Islamic Center of America in Dearborn, Michigan, is the largest such religious structure in the United States. Muslims in Dearborn have faced religious and racial prejudice, but the suburb of Detroit continues to be a central meeting-place for American Muslims. Photograph July 8, 2008. Wikimedia, http://commons.wikimedia.org/wiki/File:Islamic_Center_of_America.jpg.

V. The Great Recession

The Great Recession began, as most American economic catastrophes begin, with the bursting of a speculative bubble. Throughout the 1990s and into the new millennium, home prices continued to climb, and financial services firms looked to cash in on what seemed to be a safe but lucrative investment. Especially after the dot-com bubble burst, investors searched for a secure investment rooted in clear value rather than trendy technological speculation. And what could be more secure than real estate? But mortgage companies began writing increasingly risky loans, then bundling them together and selling them over and over again, sometimes so quickly that it became difficult to determine exactly who owned what.

Decades of financial deregulation had rolled back Depression-era restraints and again enabled risky business practices to dominate the world of American finance. It was a bipartisan agenda. In the 1990s, for instance, Bill Clinton signed the Gramm-Leach-Bliley Act, repealing provisions of the 1933 Glass-Steagall Act separating commercial and investment banks, and the Commodity Futures Modernization Act, which exempted credit-default swaps–perhaps the key financial mechanism behind the crash–from regulation.

Mortgages had been so heavily leveraged that when American homeowners began to default on their loans, the whole system collapsed. Seemingly solid financial services firms disappeared almost overnight. In order to prevent the crisis from spreading, the federal government poured billions of dollars into the industry, propping up hobbled banks. Massive giveaways to bankers created shock waves of resentment throughout the rest of the country. On the right, conservative members of the Tea Party decried the cronyism of an Obama administration filled with former Wall Street executives. The same energies motivated the Occupy Wall Street movement, in which mostly young, left-leaning protesters denounced an American economy that seemed overwhelmingly tilted toward “the one percent.” ((On the Great Recession, see Joseph Stiglitz, Freefall: America, Free Markets, and the Sinking of the World Economy (New York: Norton, 2010); and Michael Lewis, The Big Short: Inside the Doomsday Machine (New York: Norton, 2010).))

The Great Recession only magnified already rising income and wealth inequalities. According to the Chief Investment Officer at JPMorgan Chase, the largest bank in the United States, “profit margins have reached levels not seen in decades,” and “reductions in wages and benefits explain the majority of the net improvement.” ((Harold Meyerson, “Corporate America’s Chokehold on Wages,” Washington Post (July 19, 2011).)) A study from the Congressional Budget Office found that since the late 1970s the after-tax income of the wealthiest 1% had grown by over 300%, while the “average” American’s had grown 35%. Economic trends have disproportionately and objectively benefited the wealthiest Americans. Still, despite some political rhetoric, American frustration has not generated anything like the social unrest of the early twentieth century. A weakened labor movement and a strong conservative base continue to stymie serious attempts at redistributing wealth. Occupy Wall Street managed to generate a fair number of headlines and shift public discussion away from budget cuts and toward inequality, but its membership amounted to only a fraction of the far more influential and money-driven Tea Party, and its presence on the public stage was fleeting.

The Great Recession, however, was not. While American banks quickly recovered and recaptured their steady profits, and the American stock market climbed again to new heights, American workers continued to lag. Job growth remained minuscule and unemployment rates remained stubbornly high. Wages froze, meanwhile, and well-paying full-time jobs that were lost were too often replaced by low-paying, part-time work. A generation of workers coming of age within the crisis, moreover, had been savaged by the economic collapse. Unemployment among young Americans hovered for years at rates nearly double the national average.

VI. The Obama Presidency

By the 2008 election, with Iraq still in chaos, Democrats were ready to embrace the anti-war position and sought a candidate who had consistently opposed military action in Iraq. Senator Barack Obama of Illinois had been a member of the Illinois state senate when Congress debated the invasion, but he had publicly denounced the war, predicting the sectarian violence that would ensue, and he remained critical of the invasion through his 2004 campaign for the U.S. Senate. He began running for president almost immediately after arriving in Washington.

A former law professor and community activist, Obama became the first black candidate ever to capture the nomination of a major political party. ((Thomas J. Sugrue, Not Even Past: Barack Obama and the Burden of Race (Princeton: Princeton University Press, 2012).)) During the election, Obama won the support of an increasingly anti-war electorate. He was already riding a wave of support when the fragile economy finally collapsed in 2007 and 2008. Bush’s policies were widely blamed, and Obama’s opponent, John McCain, was tied to those policies. Obama won a convincing victory in the fall and became the nation’s first African American president.

President Obama’s first term was dominated by domestic affairs, especially his efforts to combat the Great Recession and to pass a national healthcare law. Obama came into office as the economy continued to deteriorate. He managed the bank bailout begun under his predecessor and launched a limited economic stimulus plan to provide countercyclical government spending to spare the country from the worst of the downturn.

Despite Obama’s crushing electoral victory, national politics fractured and a conservative Republican firewall quickly arose against the Obama administration. “The Tea Party” became a catchall for a diffuse movement of fiercely conservative and politically frustrated American voters. Typically whiter, older, and richer than the average American, flush with support from wealthy backers, and clothed with the iconography of the Founding Fathers, Tea Party activists registered their deep suspicions of the federal government. ((Kate Zernike and Megan Thee-Brenan, “Poll Finds Tea Party Backers Wealthier and More Educated,” New York Times (April 14, 2010); Jill Lepore, The Whites of Their Eyes: The Tea Party’s Revolution and the Battle over American History (Princeton: Princeton University Press, 2011).)) Tea Party protests dominated the public eye in 2009 and activists steered the Republican Party far to the right, capturing primary elections all across the country.

Obama’s most substantive legislative achievement proved to be a national healthcare law, the Patient Protection and Affordable Care Act (“Obamacare”). Presidents since Theodore Roosevelt had striven to pass national healthcare reform and failed. Obama’s plan forsook liberal models of a national healthcare system and instead adopted a heretofore conservative model of subsidized private care (similar plans had been put forward by Republicans Richard Nixon, Newt Gingrich, and Obama’s 2012 opponent, Mitt Romney). Beset by conservative protests, Obama’s healthcare reform narrowly passed through Congress. It abolished “pre-existing conditions” as a cause for denying care, scrapped junk plans, provided for state-run healthcare exchanges (allowing individuals without health insurance to pool their purchasing power), offered states funds to subsidize an expansion of Medicaid, and required all Americans to provide proof of a health insurance plan that measured up to government-established standards (those who did not purchase a plan would pay a penalty tax, and those who could not afford insurance would be eligible for federal subsidies).

In 2009, President Barack Obama deployed 17,000 additional troops to Afghanistan as part of a counterinsurgency campaign that aimed to “disrupt, dismantle, and defeat” al-Qaeda and the Taliban. Meanwhile, U.S. Special Forces and CIA drones targeted al-Qaeda and Taliban leaders. In May 2011, U.S. Navy SEALs conducted a raid deep into Pakistan that led to the killing of Osama bin Laden. The United States and NATO began a phased withdrawal from Afghanistan in 2011, with an aim of removing all combat troops by 2014. Although weak militarily, the Taliban remained politically influential in south and eastern Afghanistan. Al-Qaeda remained active in Pakistan, but shifted its bases to Yemen and the Horn of Africa. As of December 2013, the war in Afghanistan had claimed the lives of 3,397 U.S. service members.

These former Taliban fighters surrendered their arms to the government of the Islamic Republic of Afghanistan during a reintegration ceremony at the provincial governor’s compound in May 2012. Wikimedia, http://commons.wikimedia.org/wiki/File:Former_Taliban_fighters_return_arms.jpg.

Climate change, the role of government, gay marriage, the legalization of marijuana, the rise of China, inequality, surveillance, a stagnant economy, and a host of other issues have confronted recent Americans with sustained urgency.

In 2012, Barack Obama won a second term by defeating Republican Mitt Romney, the former governor of Massachusetts. However, Obama’s inability to control Congress and the ascendancy of Tea Party Republicans stunted the passage of meaningful legislation. Obama was a lame duck before he ever won reelection. Cautious efforts to address climate change, for instance, went nowhere. The economy continued its half-hearted recovery. The administration proposed little to address the crisis and accomplished less. While corporate profits climbed, wages stagnated and employment sagged.

 

VII. New Horizons

Much public commentary in the early twenty-first century concerned the “millennials,” the new generation that came of age during the new millennium. Commentators, demographers, and political prognosticators continue to ask what the new generation will bring. TIME’s May 20, 2013 cover, for instance, read “Millennials Are Lazy, Entitled Narcissists Who Still Live With Their Parents: Why They’ll Save Us All.” Pollsters have focused on features that distinguish millennials from older Americans: millennials, the pollsters say, are more diverse, more liberal, less religious, and wracked by economic insecurity. “They are,” as one Pew report read, “relatively unattached to organized politics and religion, linked by social media, burdened by debt, distrustful of people, in no rush to marry—and optimistic about the future.” ((Paul Taylor, The Next America: Boomers, Millennials, and the Looming Generational Showdown (New York: PublicAffairs, 2014).))

Millennial attitudes toward homosexuality and gay marriage reflect one of the most dramatic changes in popular attitudes in recent years. After decades of advocacy, American attitudes shifted rapidly. In 2006, a majority of Americans still told Gallup pollsters that “gay or lesbian relations” were “morally wrong.” ((Gallup. Available online: http://www.gallup.com/poll/1651/gay-lesbian-rights.aspx.)) But prejudice against homosexuality continued to fall, and greater public acceptance of “coming out” opened the culture (73% of Americans in 2001 said they knew someone who was gay, lesbian, or bisexual; in 1983, only 24% did). Gay characters–and characters with depth and complexity–can be found across the cultural landscape. And while national politicians refused to advocate for it, attitudes shifted and, by the 2010s, polls registered majority support for the legalization of gay marriage. A writer for the Wall Street Journal called it “one of the fastest-moving changes in social attitudes of this generation.” ((Janet Hook, “Support for Gay Marriage Hits All-Time High,” Wall Street Journal (March 9, 2015).))

Such change was, in many respects, generational: on average, younger Americans supported gay marriage in higher numbers than older Americans. As attitudes shifted, the Obama administration moved tentatively. Refusing to push for national interventions on the gay marriage front, Obama did, however, direct a review of Defense Department policies that led to the repeal of the “Don’t Ask, Don’t Tell” policy in 2011. Gay marriage was left to the courts. Beginning in Massachusetts in 2003, state courts had slowly begun ruling against gay marriage bans. Then, in June 2015, the Supreme Court ruled 5-4 in Obergefell v. Hodges that same-sex marriage was a constitutional right. Nearly two-thirds of Americans supported the position. ((Ibid.))

Even as anti-immigrant initiatives like California’s Proposition 187 (1994) and Arizona’s SB1070 (2010) reflected the anxieties of many, younger Americans proved far more comfortable with immigration and diversity–which makes sense, given that they are the most diverse American generation in living memory. Since Lyndon Johnson’s Great Society liberalized immigration laws, the demographics of the United States have been transformed. In 2012, nearly one-quarter of all Americans were immigrants or the sons and daughters of immigrants. Half came from Latin America. The ongoing “Hispanicization” of the United States and the ever-shrinking proportion of non-Hispanic whites have been the most talked-about trends among demographic observers. By 2013, 17% of the nation was Hispanic. In 2014, Latinos surpassed non-Latino whites to become the largest ethnic group in California. In Texas, the image of a white cowboy hardly captures the demographics of a “minority-majority” state in which Hispanic Texans will soon become the largest ethnic group. In Texas’s Rio Grande Valley, for instance, home to nearly 1.5 million people, the vast majority of residents speak Spanish at home and a full three-fourths of the population is bilingual. ((U.S. Census data, 2010.)) Political commentators often wonder what political transformations these populations will bring about when they come of age and begin voting in larger numbers.

Younger Americans are also more concerned about the environment and climate change, and yet, on that front, little has changed. In the 1970s and 1980s, experts substantiated the theory of anthropogenic (human-caused) global warming. Eventually, the most influential of these panels, the UN’s Intergovernmental Panel on Climate Change (IPCC) concluded in 1995 that there was a “discernible human influence on global climate.” ((Intergovernmental Panel on Climate Change, Climate Change 2013: The Physical Science Basis (Cambridge: Cambridge University Press, 2014).)) This conclusion, though stated conservatively, was by that point essentially a scientific consensus. By 2007, the IPCC considered the evidence “unequivocal” and warned that “unmitigated climate change would, in the long term, be likely to exceed the capacity of natural, managed and human systems to adapt.” ((Intergovernmental Panel on Climate Change, Climate Change 2014: Impacts, Adaptation and Vulnerability: Global and Sectoral Aspects (Cambridge: Cambridge University Press, 2014).))

Climate change became a permanent and major topic of public discussion and policy in the twenty-first century. Fueled by popular coverage–most notably, perhaps, the documentary An Inconvenient Truth, based on Al Gore’s book and presentations of the same name–addressing climate change became a plank of the American left and a point of denial for the American right. American public opinion and political action still lagged far behind the scientific consensus on the dangers of global warming. Conservative politicians, conservative think tanks, and energy companies waged campaigns to sow doubt in the minds of Americans, who remain divided on the question, as on so many others.

Much of the resistance to addressing climate change is economic. As Americans look over their shoulder at China, many refuse to sacrifice immediate economic growth for long-term environmental security. Twenty-first-century relations with China are characterized by contradictions and interdependence. After the collapse of the Soviet Union, China reinvigorated its modernization efforts. By liberalizing and subsidizing much of its economy and drawing enormous foreign investments, China has posted enormous growth rates over the last several decades. Enormous cities rise by the day. In 2000, China had a gross domestic product around an eighth the size of the United States’. Based on growth rates and trends, analysts suggest that China’s economy will soon surpass that of the United States. American concerns about China’s political system have persisted, but money often matters more to Americans. China has become one of the country’s leading trade partners. Cultural exchange has increased, and more and more Americans visit China each year, with many settling down to work and study. Conflict between the two societies is not inevitable, but managing bilateral relations will be one of the great challenges of the next decade. It is but one of several aspects of the world confronting Americans of the twenty-first century.

 

VIII. Conclusion

The collapse of the Soviet Union brought neither global peace nor stability and the attacks of September 11, 2001 plunged the United States into interminable conflicts around the world. At home, economic recession, entrenched joblessness, and general pessimism infected American life as contentious politics and cultural divisions poisoned social harmony. And yet the stream of history changes its course. Trends shift, things change, and events turn. New generations bring with them new perspective and new ideas. Our world is not foreordained. It is the product of history, the ever-evolving culmination of a longer and broader story, of a larger history, of a raw, distinctive, American Yawp.

 

Contributors

This chapter was edited by Michael Hammond, with content contributions by Eladio Bobadilla, Andrew Chadwick, Zach Fredman, Leif Fredrickson, Michael Hammond, Richara Hayward, Joseph Locke, Mark Kukis, Shaul Mitelpunkt, Michelle Reeves, Elizabeth Skilton, Bill Speer, and Ben Wright.

 

Recommended Reading

  1. Alexander, Michelle. The New Jim Crow: Mass Incarceration in the Age of Colorblindness. New York: The New Press, 2012.
  2. Carter, Dan T. From George Wallace to Newt Gingrich: Race in the Conservative Counterrevolution, 1963-1994. Baton Rouge: Louisiana State University Press, 1996.
  3. Ehrenreich, Barbara. Nickel And Dimed: On (Not) Getting By in America. New York: Metropolitan, 2001.
  4. Hollinger, David. Postethnic America: Beyond Multiculturalism. New York: Basic Books, 1995.
  5. Hunter, James D. Culture Wars: The Struggle to Define America. New York: Basic Books, 1992.
  6. Moreton, Bethany. To Serve God and Walmart: The Making of Christian Free Enterprise. Cambridge: Harvard University Press, 2009.
  7. Osnos, Evan. Age of Ambition: Chasing Fortune, Truth and Faith in the New China. New York: Farrar, Straus and Giroux, 2014.
  8. Packer, George. The Unwinding: An Inner History of the New America. New York: Farrar, Straus and Giroux, 2013.
  9. Patterson, James T. Restless Giant: The United States from Watergate to Bush v. Gore. New York: Oxford University Press, 2005.
  10. Piketty, Thomas. Capital in the Twenty-First Century. Translated from the French by Arthur Goldhammer. Cambridge: The Belknap Press of Harvard University Press, 2013.
  11. Ricks, Thomas E. Fiasco: The American Military Adventure in Iraq. New York: Penguin, 2006.
  12. Stiglitz, Joseph. Freefall: America, Free Markets, and the Sinking of the World Economy. New York: Norton, 2010.
  13. Taylor, Paul. The Next America: Boomers, Millennials, and the Looming Generational Showdown. New York: PublicAffairs, 2014.
  14. Wilentz, Sean. The Age of Reagan: A History, 1974–2008. New York: Harper, 2008.
  15. Wright, Lawrence. The Looming Tower: Al Qaeda and the Road to 9/11. New York: Knopf, 2006.

 

Notes

29. The Triumph of the Right

Activist Phyllis Schlafly campaigns against the Equal Rights Amendment in 1978. Bettmann/Corbis.


I. Introduction

Speaking to Detroit autoworkers in October of 1980, Republican presidential candidate Ronald Reagan described what he saw as the American Dream under Democratic President Jimmy Carter. The family garage may have still held two cars, cracked Reagan, but they were “both Japanese and they’re out of gas.” ((Ronald Reagan quoted in Steve Neal, “Reagan Assails Carter On Auto layoffs,” Chicago Tribune, October 20, 1980, 5.)) The charismatic former governor of California suggested that a once-proud nation was running on empty, economically and politically outpaced by foreign competitors. But Reagan held out hope for redemption. Stressing the theme of “national decline,” he nevertheless promised to make the United States once again a glorious “city upon a hill.” ((Ronald Reagan quoted in James T. Patterson, Restless Giant: The United States From Watergate to Bush v. Gore (New York: Oxford University Press, 2005), 152.)) In November, Reagan’s vision of a dark present and a bright future triumphed.

Reagan rode the wave of a powerful political movement often referred to as the “New Right,” in contrast to the more moderate brand of conservatism prevalent after World War II. By the 1980s the New Right had evolved into the most influential wing of the Republican Party and could claim significant credit for its electoral success. The conservative ascendancy built upon the gradual unraveling of the New Deal political order during the 1960s and 1970s. ((See Chapter 28, “The Unraveling.” )) It enjoyed the guidance of skilled politicians like Reagan but drew tremendous energy from a broad range of grassroots activists. Countless ordinary citizens–newly mobilized Christian conservatives, in particular–helped the Republican Party steer the country onto a rightward course. The New Right also attracted support from “Reagan Democrats,” blue-collar voters who had lost faith in the old liberal creed. All the while, enduring conflicts over race, economic policy, gender and sexual politics, and foreign affairs fatally fractured the liberal consensus that had dominated American politics since the presidency of Franklin Roosevelt.

The rise of the right affected Americans’ everyday lives in numerous ways. The Reagan administration embraced “free market” economic theory, dispensing with the principles of income redistribution and social welfare spending that had animated the New Deal and Great Society. Conservative policymakers tilted the regulatory and legal landscape of the United States toward corporations and wealthy individuals. This project depended foremost on weakening the “rights” framework that had undergirded advancements by African Americans, Latinos and Latinas, women, lesbians and gays, and other marginalized groups.

In many ways, however, the rise of the right promised more than it delivered. Battered but intact, the social welfare programs of the New Deal and Great Society (Social Security, Medicaid, Aid to Families With Dependent Children) survived the 1980s. Despite Republican vows of fiscal discipline, both the federal government and the national debt ballooned. At the end of the decade, conservative Christians viewed popular culture as more vulgar and hostile to their values than ever before. In the near term, the New Right registered only partial victories on a range of public policies and cultural issues. Yet, from a long-term perspective conservatives achieved a subtler and more enduring transformation of American politics and society. In the words of one historian, the conservative movement successfully “changed the terms of debate and placed its opponents on the defensive.” ((Robert Self, All in the Family: The Realignment of American Democracy Since the 1960s (New York: Hill and Wang, 2012), 369. )) Liberals and their programs and their policies did not disappear, but they increasingly fought battles on terrain chosen by the New Right.

 

II. Conservative Ascendance

The “Reagan Revolution” marked the culmination of a long process of political mobilization on the American right. In the first two decades after World War II the New Deal seemed firmly embedded in American electoral politics and public policy. Even two-term Republican President Dwight D. Eisenhower declined to roll back the welfare state. To be sure, William F. Buckley tapped into a deep vein of elite conservatism in 1955 by announcing in the first issue of National Review that his magazine “stands athwart history yelling Stop.” ((William F. Buckley, Jr., “Our Mission Statement,” National Review, November 19, 1955. http://www.nationalreview.com/article/223549/our-mission-statement-william-f-buckley-jr. Accessed on June 29, 2015. )) Senator Joseph McCarthy and John Birch Society founder Robert Welch stirred anti-communist fervor. But in general, the far right lacked organizational cohesion. Following Lyndon Johnson’s resounding defeat of Republican standard-bearer Barry Goldwater in the 1964 presidential election, many observers declared American conservatism finished. New York Times columnist James Reston wrote that Goldwater had “wrecked his party for a long time to come.” ((James Reston, “What Goldwater Lost: Voters Rejected His Candidacy, Conservative Cause and the G.O.P.,” New York Times, November 4, 1964, 23. ))

The conservative insurgency occurred within both major political parties, but the New Right gradually coalesced under the Republican tent. The heightened appeal of conservatism had several causes. The expansive social and economic agenda of Johnson’s Great Society reminded anti-communists of Soviet-style central planning and inflamed fiscal conservatives worried about deficits. Race also drove the creation of the New Right. The civil rights movement, along with the Civil Rights Act and the Voting Rights Act, upended the racial hierarchy of the Jim Crow South. All of these occurred under Democratic leadership, pushing the South toward the Republican Party. In the late 1960s and early 1970s, Black Power, affirmative action, and court-ordered busing of children between schools to achieve racial balance brought “white backlash” to the North, often in cities previously known for political liberalism. To many ordinary white Americans, the urban rebellions, antiwar protests, and student uprisings of the late 1960s unleashed social chaos. At the same time, declining wages, rising prices, and growing tax burdens brought economic vulnerability to many working- and middle-class citizens who long formed the core of the New Deal coalition. Liberalism no longer seemed to offer the great mass of white Americans a roadmap to prosperity, so they searched for new political solutions.

Former Alabama governor and conservative Democrat George Wallace masterfully exploited the racial, cultural, and economic resentments of working-class whites during his presidential runs in 1968 and 1972. Wallace’s record as a staunch segregationist made him a hero in the Deep South, where he won five states as a third-party candidate in the 1968 general election. Wallace’s populist message also resonated with blue-collar voters in the industrial North who felt left behind by the rights revolution. On the campaign stump, the fiery candidate lambasted hippies, anti-war protestors, and government bureaucrats. He assailed female welfare recipients for “breeding children as a cash crop” and ridiculed “over-educated, ivory-tower” intellectuals who “don’t know how to park a bicycle straight.” ((George Wallace quoted in William Chafe, The Unfinished Journey: America Since World War II (New York: Oxford University Press, 1991), 377. )) Yet, Wallace also advanced progressive proposals for federal job training programs, a minimum wage hike, and legal protections for collective bargaining. Running as a Democrat in 1972 (and with anti-busing rhetoric as a new arrow in his quiver), Wallace captured the Michigan primary and polled second in the industrial heartland of Wisconsin, Pennsylvania, and Indiana. In May 1972 an assassin’s bullet left Wallace paralyzed and ended his campaign. Nevertheless, his amalgamation of older, New Deal-style proposals and conservative populism emblemized the rapid re-ordering of party loyalties in the late ’60s and early ’70s. Richard Nixon similarly harnessed the New Right’s sense of grievance through his rhetoric about “law and order” and the “silent majority.” ((James Patterson, Grand Expectations: The United States, 1945-1974 (New York: Oxford University Press, 1996), 735-736. )) But Nixon and his Republican successor, Gerald Ford, continued to accommodate the politics of the New Deal order. The New Right remained restive.

Christian conservatives also felt themselves under siege from liberalism. In the early 1960s, the Supreme Court decisions prohibiting teacher-led prayer (Engel v. Vitale) and Bible reading in public schools (Abington v. Schempp) led some on the right to conclude that a liberal judicial system threatened Christian values. In the following years, the counterculture’s celebration of sex and drugs, along with relaxed obscenity and pornography laws, intensified the conviction that “permissive” liberalism encouraged immorality in private life. Evangelical Protestants—Christians who professed a personal relationship with Jesus Christ, upheld the Bible as an infallible source of truth, and felt a duty to convert, or evangelize, nonbelievers—comprised the core of the so-called “religious right.” The movement also drew energy from devout Catholics. The development of the religious right was not inevitable for several reasons. First, many evangelicals had for decades eschewed politics in favor of spiritual matters; moreover, evangelicalism did not necessarily lead to conservative politics (Democrat Jimmy Carter was an evangelical). Second, working-class Roman Catholics had a long record of loyalty to the Democratic Party. Third, the alliance between evangelicals and Catholics had to overcome decades of mutual antagonism. Only the common enemy of liberalism brought the groups together.

With increasing assertiveness in the 1960s and 1970s, Christian conservatives mobilized to protect the “traditional” family. Women comprised a striking number of the religious right’s foot soldiers. In 1968 and 1969 a group of newly politicized mothers in Anaheim, California, led a sustained protest against sex education in public schools. The successful campaign of these “suburban warriors” reflected a widespread belief among Christian conservatives that permissive liberal morality, often emanating from public institutions, threatened their children. ((Lisa McGirr, Suburban Warriors: The Origins of the New American Right (Princeton, NJ: Princeton University Press, 2001), 227-231. )) A number of other issues also stirred conservative women. Catholic activist Phyllis Schlafly marshaled opposition to the Equal Rights Amendment, while evangelical pop singer Anita Bryant drew national headlines for her successful fight to repeal Miami’s gay rights ordinance in 1977. In 1979, Beverly LaHaye (whose husband Tim–an evangelical pastor in San Diego–would later co-author the wildly popular Left Behind Christian book series) founded Concerned Women for America, which linked small groups of local activists opposed to the ERA, abortion, homosexuality, and no-fault divorce.

Activists like Schlafly and LaHaye valorized motherhood as the highest calling of all women. Abortion therefore struck at the core of their female identity. More than perhaps any other issue, abortion drew different segments of the religious right—Catholics and Protestants, women and men—together. The Supreme Court’s 1973 Roe v. Wade ruling outraged many devout Catholics, including Long Island housewife and political novice Ellen McCormack. In 1976, McCormack entered the Democratic presidential primaries in an unsuccessful attempt to steer the party to a pro-life position. Roe v. Wade also intensified anti-abortion sentiment among many evangelicals (who had been less universally opposed to the procedure than their Catholic counterparts). Christian author Francis Schaeffer cultivated evangelical opposition to abortion through the 1979 documentary film Whatever Happened to the Human Race?, arguing that the “fate of the unborn is the fate of the human race.” ((Francis Schaeffer quoted in Whatever Happened to the Human Race? (Episode I), Film, directed by Franky Schaeffer, (1979, USA, Franky Schaeffer V Productions). https://www.youtube.com/watch?v=UQAyIwi5l6E. Accessed June 30, 2015. )) With abortion framed in stark, existential terms, many evangelicals felt compelled to combat it through political action.

This book cover succinctly demonstrates the mindset of many conservatives in the Reagan era: what happened to the human race? http://users.aber.ac.uk/jrl/scan0001.jpg.

Grassroots passion drove anti-abortion activism, but a set of religious and secular institutions turned the various strands of the New Right into a sophisticated movement. In 1979 Jerry Falwell—a Baptist minister and religious broadcaster from Lynchburg, Virginia—founded the Moral Majority, an explicitly political organization dedicated to advancing a “pro-life, pro-family, pro-morality, and pro-American” agenda. The Moral Majority skillfully wove together social and economic appeals to make itself a force in Republican politics. Secular, business-oriented institutions also joined the attack on liberalism, fueled by stagflation and by the federal government’s creation of new regulatory agencies like the Environmental Protection Agency and the Occupational Safety and Health Administration. Conservative business leaders bankrolled new “think tanks” like the Heritage Foundation and the Cato Institute. These organizations provided grassroots activists with ready-made policy prescriptions. Other business leaders took a more direct approach by hiring Washington lobbyists and creating Political Action Committees (PACs) to press their agendas in the halls of Congress and federal agencies. Between 1976 and 1980 the number of corporate PACs rose from under 300 to over 1200.

Grassroots activists and business leaders received unlikely support from a circle of “neoconservatives”—disillusioned intellectuals who had rejected liberalism and the Left and become Republicans. Irving Kristol, a former Marxist who went on to champion free-market capitalism as a Wall Street Journal columnist, defined a neoconservative as a “liberal who has been mugged by reality.” ((Walter Goodman, “Irving Kristol: Patron Saint of the New Right,” New York Times Magazine, December 6, 1981. http://www.nytimes.com/1981/12/06/magazine/irving-kristol-patron-saint-of-the-new-right.html. Accessed June 24, 2015. )) Neoconservative journals like Commentary and Public Interest argued that the Great Society had proven counterproductive, perpetuating the poverty and racial segregation that it aimed to cure. By the middle of the 1970s, neoconservatives felt mugged by foreign affairs as well. As ardent Cold Warriors, they argued that Nixon’s policy of détente left the United States vulnerable to the Soviet Union.

In sum, several streams of conservative political mobilization converged in the late 1970s. Each wing of the burgeoning New Right—disaffected Northern blue-collar workers, white Southerners, evangelicals and devout Catholics, business leaders, disillusioned intellectuals, and Cold War hawks—turned to the Republican Party as the most effective vehicle for their political counter-assault on liberalism and the New Deal political order. After years of mobilization, the domestic and foreign policy catastrophes of the Carter administration provided the tailwinds that brought the conservative movement to shore.

 

III. The Conservatism of the Carter Years

The election of Jimmy Carter in 1976 brought a Democrat to the White House for the first time since 1969. Large Democratic majorities in Congress provided the new president with an opportunity to move aggressively on the legislative front. With the infighting of the early 1970s behind them, many Democrats hoped the Carter administration would update and expand the New Deal. But Carter won the presidency on a wave of post-Watergate disillusionment with government that did not translate into support for liberal ideas. Events outside Carter’s control certainly helped discredit liberalism, but the president’s own policies also pushed national politics further to the right. In his 1978 State of the Union address, Carter lectured Americans that “[g]overnment cannot solve our problems…it cannot eliminate poverty, or provide a bountiful economy, or reduce inflation, or save our cities, or cure illiteracy, or provide energy.” ((Jimmy Carter, 1978 State of the Union Address, Jan. 19, 1978, Jimmy Carter Presidential Library and Museum, http://www.jimmycarterlibrary.gov/documents/speeches/su78jec.phtml, accessed June 24, 2015.)) The statement neatly captured the ideological transformation of the country. Rather than leading a resurgence of American liberalism, Carter became, as one historian put it, “the first president to govern in a post-New Deal framework.” ((Jefferson Cowie, Stayin’ Alive: The 1970s and the Last Days of the Working Class (New York: The New Press, 2010), 12.))

In its early days the Carter administration embraced several policies backed by liberals. It pushed an economic stimulus package containing $4 billion in public works, extended food stamp benefits to 2.5 million new recipients, enlarged the Earned Income Tax Credit for low-income households, and expanded the Nixon-era Comprehensive Employment and Training Act (CETA). ((Patterson, Restless Giant: The United States from Watergate to Bush v. Gore (New York: Oxford University Press, 2005), 113. )) But the White House quickly realized that Democratic control of Congress did not guarantee support for its initially left-leaning economic proposals. Many of the Democrats elected to Congress in the aftermath of Watergate were more moderate than their predecessors who had been catechized in the New Deal gospel. These conservative Democrats sometimes partnered with Congressional Republicans to oppose Carter, most notably in response to the administration’s proposal for a federal office of consumer protection.

At a deeper level, Carter’s own temperamental and philosophical conservatism hamstrung the administration. Early in his first term, Carter began to worry about the size of the federal deficit and killed a tax rebate he had proposed and Congressional Democrats had embraced. The president’s comprehensive national urban policy veered to the right by transferring many programs to state and local governments, relying on privatization, and endorsing voluntarism and self-help. Organized labor felt abandoned by Carter, who remained cool to several of their highest legislative priorities. The president offered tepid support for a national health insurance proposal and declined to lobby aggressively for a package of modest labor law reforms. The business community rallied to defeat the latter measure, in what AFL-CIO chief George Meany described as “an attack by every anti-union group in America to kill the labor movement.” ((George Meany quoted in Cowie, 293.)) In 1977 and 1978, liberal Democrats rallied behind the Humphrey-Hawkins Full Employment and Training Act, which promised to end unemployment through extensive government planning. The bill aimed not only to guarantee a job to every American but also to re-unite the interracial, working-class Democratic coalition that had been fractured by deindustrialization and affirmative action. “We must create a climate of shared interests between the needs, the hopes, and the fears of the minorities, and the needs, the hopes, and the fears of the majority,” wrote Senator Hubert Humphrey, Lyndon Johnson’s vice president and the bill’s co-sponsor. ((Hubert Humphrey quoted in Cowie, 268. )) Carter’s lack of enthusiasm for the proposal allowed conservatives from both parties to water down the bill to a purely symbolic gesture. Liberals, like labor leaders, came to regard the president as an unreliable ally.

Carter also came under fire from Republicans, especially the religious right. His administration incurred the wrath of evangelicals in 1978 when the Internal Revenue Service established new rules revoking the tax-exempt status of racially segregated, private Christian schools. The rules only strengthened a policy instituted by the Nixon administration; however, the religious right accused Carter of singling out Christian institutions. Republican activist Richard Viguerie described the IRS controversy as the “spark that ignited the religious right’s involvement in real politics.” ((Richard Viguerie, quoted in Joseph Crespino, “Civil Rights and the Religious Right,” in Bruce J. Schulman and Julian Zelizer, eds., Rightward Bound: Making America Conservative in the 1970s (Cambridge, Mass.: Harvard University Press, 2008), 91. )) Race sat just below the surface of the IRS fight. After all, many of the schools had been founded to circumvent court-ordered desegregation. But the IRS ruling allowed the New Right to rain down fire on big government interference while downplaying the practice of racial exclusion at the heart of the case.

While the IRS controversy flared, economic crises multiplied. Unemployment reached 7.8% in May 1980, up from 6% at the start of Carter’s first term. ((Patterson, Restless Giant, 148. )) Inflation (the rate at which the cost of goods and services increases) jumped from 6% in 1978 to a staggering 20% by the winter of 1980. ((Judith Stein, Pivotal Decade: How the United States Traded Factories for Finance in the Seventies (New Haven, Conn.: Yale University Press, 2010), 231. )) In another bad omen, the iconic Chrysler Corporation appeared close to bankruptcy. The administration responded to these challenges in fundamentally conservative ways. First, Carter proposed a tax cut for the upper-middle class, which Congress passed in 1978. Second, the White House embraced a long-time goal of the conservative movement by deregulating the airline and trucking industries in 1978 and 1980, respectively. Third, Carter proposed balancing the federal budget—much to the dismay of liberals, who would have preferred that he use deficit spending to finance a new New Deal. Finally, to halt inflation, Carter turned to Paul Volcker, his appointee as Chair of the Federal Reserve. Volcker raised interest rates and tightened the money supply—policies designed to reduce inflation in the long run but which increased unemployment in the short run. Liberalism was on the run.

The “energy crisis” in particular brought out the Southern Baptist moralist in Carter. On July 15, 1979, the president delivered a nationally televised speech on energy policy in which he attributed the country’s economic woes to a “crisis of confidence.” Carter lamented that “too many of us now tend to worship self-indulgence and consumption.” ((Jimmy Carter quoted in Chafe, 453. )) The president’s push to reduce energy consumption was reasonable, and the country’s initial response to the speech was favorable. Yet Carter’s emphasis on discipline and sacrifice, his spiritual diagnosis of economic hardship, sidestepped deeper questions of large-scale economic change and downplayed the harsh toll of inflation on regular Americans.

 

IV. The Election of 1980

These domestic challenges, combined with the Soviet invasion of Afghanistan and the hostage crisis in Iran, hobbled Carter heading into his 1980 reelection campaign. Many Democrats were dismayed by his policies. The president of the International Association of Machinists dismissed Carter as “the best Republican President since Herbert Hoover.” ((William Winpisinger quoted in Cowie, 261. )) Angered by the White House’s refusal to back national health insurance, Massachusetts Senator Ted Kennedy challenged Carter in the Democratic primaries. Running as the party’s liberal standard-bearer and heir to the legacy of his slain older brothers, Kennedy garnered support from key labor unions and leftwing Democrats. He won the Michigan and Pennsylvania primaries—states where Democrats had embraced George Wallace eight years earlier. Carter ultimately vanquished Kennedy, but the close primary tally betrayed the president’s vulnerability.

Carter’s opponent in the general election was Ronald Reagan, who ran as a staunch fiscal conservative and a Cold War hawk. He vowed to reduce government spending and shrink the federal bureaucracy while eliminating the departments of Energy and Education that Carter had created. As in his 1976 primary challenge to Gerald Ford, Reagan accused his opponent of failing to confront the Soviet Union. The GOP candidate vowed steep increases in military spending and hammered Carter for canceling the B-1 bomber and signing the Panama Canal and SALT II treaties. Carter responded by labeling Reagan a warmonger, but events in Afghanistan and Iran discredited Carter’s foreign policy in the eyes of many Americans.

The incumbent fared no better on domestic affairs. Unemployment remained at nearly 8%. ((Patterson, Restless Giant, 148. )) Meanwhile the Federal Reserve’s anti-inflation measures pushed interest rates to an unheard-of 18.5%. ((Patterson, Restless Giant, 148. )) Reagan seized on these bad economic trends. On the campaign trail he brought down the house by proclaiming: “A recession is when your neighbor loses his job, and a depression is when you lose your job.” Reagan would then pause before concluding, “And a recovery is when Jimmy Carter loses his job.” ((Patterson, Restless Giant, 148. )) Carter reminded voters that his opponent opposed the creation of Medicare in 1965 and warned that Reagan would slash popular programs if elected. But the anemic economy prevented Carter’s blows from landing.

Social and cultural issues presented yet another challenge for the president. Despite Carter’s background as a “born-again” Christian and Sunday school teacher, he struggled to court the religious right. Carter scandalized devout Christians by admitting to lustful thoughts during an interview with Playboy magazine in 1976, telling the reporter he had “committed adultery in my heart many times.” ((Jimmy Carter quoted in “Carter Tells of ‘Adultery in His Heart,’” Los Angeles Times, September 21, 1976, B6. )) Although Reagan was only a nominal Christian and rarely attended church, the religious right embraced him. Reverend Jerry Falwell directed the full weight of the Moral Majority behind Reagan. The organization registered an estimated 2 million new voters in 1980. Ellen McCormack, the New York Catholic who ran for president as a Democrat on an anti-abortion platform in 1976, moved over to the GOP in 1980. Reagan also cultivated the religious right by denouncing abortion and endorsing prayer in school. The IRS tax exemption issue resurfaced as well, with the 1980 Republican platform vowing to “halt the unconstitutional regulatory vendetta launched by Mr. Carter’s IRS commissioner against independent schools.” ((Crespino, 103. )) Early in the primary season, Reagan condemned the policy during a speech at South Carolina’s Bob Jones University, which had recently sued the IRS after losing its tax-exempt status because of the fundamentalist institution’s ban on interracial dating.

Jerry Falwell, the wildly popular TV evangelist, founded the Moral Majority political organization in the late 1970s. Decrying the demise of the nation’s morality, the organization gained a massive following, helping to cement the status of the New Christian Right in American politics. Photograph, date unknown. Wikimedia, http://commons.wikimedia.org/wiki/File:Jerry_Falwell_portrait.jpg.

Reagan’s campaign appealed subtly but unmistakably to the racial hostilities of white voters. The candidate held his first post-nominating convention rally at the Neshoba County Fair near Philadelphia, Mississippi, the town where three civil rights workers had been murdered in 1964. In his speech, Reagan championed the doctrine of states’ rights, which had been the rallying cry of segregationists in the 1950s and 1960s. In criticizing the welfare state, Reagan had long employed thinly veiled racial stereotypes about a “welfare queen” in Chicago who drove a Cadillac while defrauding the government or a “strapping young buck” purchasing T-bone steaks with food stamps. ((Patterson, Restless Giant, 163; Jon Nordheimer, “Reagan is Picking His Florida Spots: His Campaign Aides Aim for New G.O.P. Voters in Strategic Areas,” New York Times, February 5, 1976, 24. )) Like George Wallace before him, Reagan exploited the racial and cultural resentments of struggling white working-class voters. And like Wallace, he attracted blue-collar workers in droves.

With the wind at his back on almost every issue, Reagan only needed to blunt Carter’s characterization of him as an angry extremist. Reagan did so during their only debate by appearing calm and amiable. “Are you better off than you were four years ago?” he asked the American people at the conclusion of the debate. ((Sean Wilentz, The Age of Reagan: A History, 1974-2008 (New York: Harper Collins, 2008), 124.)) The answer was no. Reagan won the election with 51% of the popular vote to Carter’s 41%. (Independent John Anderson captured 7%.) ((Meg Jacobs and Julian Zelizer, Conservatives in Power: The Reagan Years, 1981-1989: A Brief History with Documents (Boston: Bedford/St. Martin’s, 2011), 2.)) Despite capturing only a slim majority, Reagan scored a decisive 489-49 victory in the Electoral College. ((Patterson, Restless Giant, 150.)) Republicans gained control of the Senate for the first time since 1955 by winning 12 seats. Liberal Democrats George McGovern, Frank Church, and Birch Bayh went down in defeat, as did liberal Republican Jacob Javits. The GOP picked up 33 House seats, narrowing the Democratic advantage in the lower chamber. ((Patterson, Restless Giant, 150.)) The New Right had arrived in Washington, DC.

 

V. The New Right in Power

Harkening back to Jeffersonian politics of limited government, a viewpoint that would only increase in popularity over the next three decades, Ronald Reagan launched his campaign by saying bluntly, "I believe in states' rights." Reagan secured the presidency through appealing to the growing conservatism of much of the country. Ronald Reagan and wife Nancy Reagan waving from the limousine during the Inaugural Parade in Washington, D.C. on Inauguration Day, 1981. Wikimedia, http://commons.wikimedia.org/wiki/File:The_Reagans_waving_from_the_limousine_during_the_Inaugural_Parade_1981.jpg.

In his first inaugural address Reagan proclaimed that “government is not the solution to our problem; government is the problem.” ((Ronald Reagan quoted in Jacobs and Zelizer, 20.)) In reality, Reagan focused less on eliminating government than on redirecting government to serve new ends. In line with that goal, his administration embraced “supply-side” economic theories that had recently gained popularity among the New Right. While the postwar gospel of Keynesian economics had focused on stimulating consumer demand, supply-side economics held that lower personal and corporate tax rates would encourage greater private investment and production. The resulting wealth would “trickle down” to lower-income groups through job creation and higher wages. Conservative economist Arthur Laffer predicted that lower tax rates would generate so much economic activity that federal tax revenues would actually increase. The administration touted the so-called “Laffer Curve” as justification for the tax cut plan that served as the cornerstone of Reagan’s first year in office. Keynesian critics warned that such tax cuts would fuel inflation and ultimately stifle the economy. But Republican Congressman Jack Kemp, an early supply-side advocate and co-sponsor of Reagan’s tax bill, promised that it would unleash the “creative genius that has always invigorated America.” ((Jack Kemp quoted in Jacobs and Zelizer, 21.))

The Iranian hostage crisis ended, literally, during President Reagan’s inauguration speech. By a coincidence of timing, then, the Reagan administration received credit for ending the conflict. This group photograph shows the former hostages in the hospital before being released back to the U.S. Johnson Babela, Photograph, 1981. Wikimedia, http://commons.wikimedia.org/wiki/File:DF-SN-82-06759.jpg.

The tax cut faced early skepticism from Democrats and even some Republicans. Vice President George H.W. Bush had belittled supply-side theory as “voodoo economics” during the 1980 Republican primaries. ((Wilentz, 121.)) But a combination of skill and serendipity pushed the bill over the top. Reagan aggressively and effectively lobbied individual members of Congress for support on the measure. Then on March 30, 1981, Reagan survived an assassination attempt by a mentally unstable young man named John Hinckley. Public support swelled for the hospitalized president. Congress ultimately approved a $675-billion tax cut in July 1981 with significant Democratic support. The bill reduced overall federal taxes by more than one quarter and lowered the top marginal rate from 70% to 50%, with the bottom rate dropping from 14% to 11%. It also slashed the rate on capital gains from 28% to 20%. ((Jacobs and Zelizer, 25-26. )) The next month, Reagan scored another political triumph in response to a strike called by the Professional Air Traffic Controllers Organization (PATCO). During the 1980 campaign, Reagan had wooed organized labor, describing himself as “an old union man” (he had led the Screen Actors Guild from 1947 to 1952) who still held Franklin Roosevelt in high regard. ((Ronald Reagan quoted in Steve Neal, “Reagan Assails Carter On Auto layoffs,” Chicago Tribune, October 20, 1980, 5.)) PATCO had been one of the few labor unions to endorse Reagan. Nevertheless, the president ordered the union’s striking air traffic controllers back to work and fired more than 11,000 who refused. Reagan’s actions crippled PATCO and left the American labor movement reeling. For the rest of the 1980s the economic terrain of the United States—already unfavorable to union organizing—shifted decisively in favor of employers. The unionized portion of the private-sector workforce fell from 20% in 1980 to 12% in 1990. ((Stein, 267.)) Reagan’s defeat of PATCO and his tax bill enhanced the economic power of corporations and high-income households; the conflicts confirmed that a conservative age had dawned in American workplaces and in politics.

The new administration appeared to be flying high in the fall of 1981, but other developments challenged the rosy economic forecasts emanating from the White House. As Reagan ratcheted up tension with the Soviet Union, Congress approved his request for $1.2 trillion in new military spending. ((Chafe, 474. )) Contrary to the assurances of David Stockman—the young supply-side disciple who headed the Office of Management and Budget—the combination of lower taxes and higher defense budgets caused the national debt to balloon. By the end of Reagan’s first term it equaled 53% of GDP, as opposed to 33% in 1981. ((Patterson, Restless Giant, 159.)) The increase was staggering, especially for an administration that had promised to curb spending. Meanwhile, Federal Reserve Chairman Paul Volcker continued his policy from the Carter years of combating inflation by maintaining high interest rates, which surpassed 20% in June 1981. ((Gil Troy, Morning in America: How Ronald Reagan Invented the 1980s (Princeton: Princeton University Press, 2005), 67.)) The Fed’s action increased the cost of borrowing money and stifled economic activity.

As a result, the United States experienced a severe economic recession in 1981 and 1982. Unemployment rose to nearly 11%, the highest figure since the Great Depression. ((Chafe, 476.)) Reductions in social welfare spending heightened the impact of the recession on ordinary people. Congress had followed Reagan’s lead by reducing funding for food stamps and Aid to Families with Dependent Children, eliminating the CETA program and its 300,000 jobs, and removing a half-million people from the Supplemental Social Security program for the physically disabled. ((Chafe, 474.)) The cuts exacted an especially harsh toll on low-income communities of color. The head of the NAACP declared the administration’s budget cuts had rekindled “war, pestilence, famine, and death.” ((Margaret Bush Wilson quoted in Troy, 93.)) Reagan also received bipartisan rebuke in 1981 after proposing cuts to Social Security benefits for early retirees. The Senate voted unanimously to condemn the plan, and Democrats framed it as a heartless attack on the elderly. Confronted with recession and harsh public criticism, a chastened White House worked with Democratic House Speaker Tip O’Neill in 1982 on a bill that restored $98 billion of the previous year’s tax cuts. ((Troy, 210.)) Despite compromising with the administration on taxes, Democrats railed against the so-called “Reagan Recession,” arguing that the president’s economic policies favored the most fortunate Americans. This appeal, which Democrats termed the “fairness issue,” helped them win 26 House seats in the autumn Congressional races. ((Troy, 110.)) The New Right appeared to be in trouble.

 

VI. Morning in America

Reagan nimbly adjusted to the political setbacks of 1982. Following the rejection of his Social Security proposals, Reagan appointed a bipartisan panel to consider changes to the program. In early 1983, the commission recommended a one-time delay in cost-of-living increases, a new requirement that government employees pay into the system, and a gradual increase in the retirement age from 65 to 67. The commission also proposed raising state and federal payroll taxes, with the new revenue poured into a trust fund that would transform Social Security from a pay-as-you-go system to one with significant reserves. ((Patterson, Restless Giant, 163-164. )) Congress quickly passed the recommendations into law, allowing Reagan to take credit for strengthening a program cherished by most Americans. The president also benefited from an economic rebound. Real disposable income rose 2.5% in 1983 and 5.8% the following year. ((Troy, 208. )) Unemployment dropped to 7.5% in 1984. ((Chafe, 477. )) Meanwhile, the “harsh medicine” of high interest rates helped lower inflation to 3.5%. ((Patterson, Restless Giant, 162. Many people used the term “harsh medicine” to describe Volcker’s action on interest rates, see Art Pine, “Letting Harsh Medicine Work,” Washington Post, October 14, 1979, G1. )) While campaigning for reelection in 1984, Reagan pointed to the improving economy as evidence that it was “morning again in America.” ((Patterson, Restless Giant, 189. )) His personal popularity soared. Most conservatives ignored the debt increase and tax hikes of the previous two years and rallied around the president.

The Democratic Party, on the other hand, stood at an ideological crossroads in 1984. The favorite to win the party’s nomination was Walter Mondale, who had been a staunch ally of organized labor and civil rights as a senator during the 1960s and 1970s. He later served as Jimmy Carter’s vice president. Mondale’s chief rivals were civil rights activist Jesse Jackson and Colorado Senator Gary Hart, one of the young Democrats elected to Congress in 1974 following Nixon’s downfall. Hart and other “Watergate babies” still identified as liberals but rejected their party’s faith in activist government and embraced market-based approaches to policy issues. In so doing, they conceded significant political ground to supply-siders and conservative opponents of the welfare state. Many Democrats, however, were not prepared to abandon their New Deal inheritance. The ideological tension within the party played out in the 1984 primary campaign. Jackson offered a largely progressive program but won only two states. Hart’s platform—economically moderate but socially liberal—inverted the political formula of Mondale’s New Deal-style liberalism. Throughout the primaries, Hart contrasted his “new ideas” with Mondale’s “old-fashioned” politics. Mondale eventually secured his party’s nomination but suffered a crushing defeat in the general election. Reagan captured 49 of 50 states, winning 58.8% of the popular vote. ((Patterson, Restless Giant, 189.))

Mondale’s loss seemed to confirm that the new breed of moderate Democrats better understood the mood of the American people. The future of the party belonged to post-New Deal liberals like Hart and to the constituency that supported him in the primaries: upwardly mobile white professionals and suburbanites. In February 1985, a group of centrists formed the Democratic Leadership Council (DLC) as a vehicle for distancing the party from organized labor and Keynesian economics while cultivating the business community. Jesse Jackson dismissed the DLC as “Democrats for the Leisure Class,” but the organization included many of the party’s future leaders, including Arkansas Governor Bill Clinton. ((Patterson, Restless Giant, 190-191.)) The formation of the DLC illustrated the degree to which the New Right had transformed American politics.

Reagan entered his second term with a much stronger mandate than in 1981, but the GOP makeover of Washington, DC stalled. The Democrats, who had held the House of Representatives throughout Reagan’s presidency, regained control of the Senate in 1986. Democratic opposition prevented Reagan from eliminating means-tested social welfare programs; however, Congress failed to increase benefit levels for welfare programs or raise the minimum wage, decreasing the real value of those benefits. Democrats and Republicans occasionally fashioned legislative compromises, as with the Tax Reform Act of 1986. The bill lowered the top corporate tax rate from 46% to 34% and reduced the highest marginal rate from 50% to 28%, while also simplifying the tax code and eliminating numerous loopholes. ((Troy, 210; Patterson, Restless Giant, 165.)) Both parties—as well as the White House—claimed credit for the bargain and celebrated the bill as a significant achievement. The steep cuts to the corporate and individual rates certainly benefited wealthy individuals, but the legislation made virtually no net change to federal revenues. In 1986, Reagan also signed into law the Immigration Reform and Control Act. American policymakers hoped to do two things: deal with the millions of undocumented immigrants already in the United States while simultaneously choking off future unsanctioned migration. The former goal was achieved (nearly three million undocumented workers received legal status) but the latter proved elusive.

One of Reagan’s most far-reaching victories occurred through judicial appointments. He named 368 district and federal appeals court judges during his two terms. ((Patterson, Restless Giant, 173-174.)) Observers noted that almost all of the appointees were white men. (Seven were African American, fifteen were Latino, and two were Asian American.) Reagan also appointed three Supreme Court justices: Sandra Day O’Connor, who to the dismay of the religious right turned out to be a moderate; Anthony Kennedy, a solidly conservative Catholic who occasionally sided with the court’s liberal wing; and arch-conservative Antonin Scalia. The New Right’s transformation of the judiciary had limits. In 1987, Reagan nominated Robert Bork to fill a vacancy on the Supreme Court. Bork, a federal judge and former Yale University law professor, was a staunch conservative. He had opposed the 1964 Civil Rights Act, affirmative action, and the Roe v. Wade decision. After acrimonious confirmation hearings, the Senate rejected Bork’s nomination by a vote of 58-42. ((Patterson, Restless Giant, 171. ))

 

VII. African American Life in Reagan’s America

African Americans read Bork’s nomination as another signal of the conservative movement’s hostility to their social, economic, and political aspirations. Indeed, Ronald Reagan’s America presented African Americans with a series of contradictions. Blacks achieved significant advances in politics, culture, and socio-economic status. African Americans continued a trend from the late 1960s and 1970s by gaining control of major municipal governments across the country during the 1980s. In 1983, voters in Philadelphia and Chicago elected Wilson Goode and Harold Washington, respectively, as their cities’ first black mayors. At the national level, civil rights leader Jesse Jackson became the first African American man to run for president when he campaigned for the Democratic Party’s nomination in 1984 and 1988. Propelled by chants of “Run, Jesse, Run,” Jackson achieved notable success in 1988, winning nine state primaries and finishing second with 29% of the vote. ((1988 Democratic Primaries, CQ Voting and Elections Collection, database accessed June 30, 2015.))

Jesse Jackson was only the second African American to mount a national campaign for the presidency. His work as a civil rights activist and Baptist minister garnered him a significant following in the African American community, but never enough to secure the Democratic nomination. Warren K. Leffler, “IVU w/ [i.e., interview with] Rev. Jesse Jackson,” July 1, 1983. Library of Congress, http://www.loc.gov/pictures/item/2003688127/.


The excitement created by Jackson’s campaign mirrored the acclaim received by a few prominent African Americans in media and entertainment. Comedian Eddie Murphy rose to stardom on television’s Saturday Night Live and achieved box office success with movies like 48 Hrs. and Beverly Hills Cop. In 1982, pop singer Michael Jackson released Thriller, the best-selling album of all time. Oprah Winfrey began her phenomenally successful nationally syndicated talk show in 1985. Comedian Bill Cosby’s sitcom about an African American doctor and lawyer raising their four children drew the highest ratings on television for most of the decade. The popularity of The Cosby Show revealed how class informed perceptions of race in the 1980s. Cosby’s fictional TV family represented a growing number of black middle-class professionals in the United States. Indeed, income for the top fifth of African American households increased faster than that of white households for most of the decade. Middle-class African Americans found new doors open to them in the 1980s, but poor and working-class blacks faced continued challenges. During Reagan’s last year in office the African American poverty rate stood at 31.6%, as opposed to 10.1% for whites. ((The State of Black America, 1990 (New York: National Urban League, Inc., 1990), 34.)) Black unemployment remained double that of whites throughout the decade. ((Andrew Hacker, Two Nations: Black and White, Separate, Hostile, Unequal (New York: Charles Scribner’s Sons, 1992), 102.)) By 1990, the median income for black families was $21,423, 42% below the median for white families. ((Hacker, 94.))

The Reagan administration failed to address such disparities and in many ways intensified them. New Right values threatened the legal principles and federal policies of the Great Society and the “rights revolution.” Reagan appointed conservative opponents of affirmative action to lead the Equal Employment Opportunity Commission (namely future Supreme Court Justice Clarence Thomas), the U.S. Civil Rights Commission, and the Justice Department’s Civil Rights Division. Reagan’s appointees took aim at key policy achievements of the civil rights movement. When the 1965 Voting Rights Act came up for renewal during Reagan’s first term, the Justice Department pushed the president to oppose any extension. Only the intervention of more moderate congressional Republicans saved the groundbreaking law. The administration also initiated a plan to rescind federal affirmative action rules. In 1986, a broad coalition of groups–including the NAACP, the Urban League, the AFL-CIO, and even the National Association of Manufacturers–compelled the administration to abandon the effort. Despite the conservative tenor of the country, “diversity” programs were firmly entrenched in the corporate world by the end of the decade. The New Right nonetheless reframed the debate regarding rights and discrimination. Americans increasingly embraced racial diversity as a positive value but approached the issue through an individualistic, not a social, framework.

Certain federal policies disproportionately affected racial minorities. Spending cuts enacted by Reagan and congressional Republicans shrank Aid to Families with Dependent Children, Medicaid, food stamps, school lunch, and job training programs that provided crucial support to African American households. In 1982, the National Urban League’s annual “State of Black America” report concluded that “[n]ever [since the first report in 1976]… has the state of Black America been more vulnerable. Never in that time have black economic rights been under such powerful attack.” ((Troy, 91.)) African American communities, especially in urban areas, also bore the stigma of violence and criminality. Homicide was the leading cause of death for black males between 15 and 24, occurring at a rate six times higher than for other groups. ((American Social History Project, Who Built America? Vol. Two: Since 1877 (New York: Bedford/St. Martin’s, 2000), 723.)) Although African Americans were most often the victims of violent crime, sensationalist media reports incited fears about black-on-white crime in big cities. Ironically, such fear could by itself spark violence. In December 1984 a thirty-seven-year-old white engineer, Bernard Goetz, shot and seriously wounded four black teenagers on a New York City subway car. The so-called “Subway Vigilante” suspected that the young men—armed with screwdrivers—planned to rob him. Pollsters found that 90% of white New Yorkers sympathized with Goetz. ((Patterson, Restless Giant, 172-173.)) Echoing the law-and-order rhetoric (and policies) of the 1960s and 1970s, politicians and law enforcement agencies implemented more aggressive policing of minority communities and mandated longer prison sentences for those arrested, inaugurating the explosive growth of mass incarceration that exacted a heavy toll on African American communities long into the twenty-first century.

 

VIII. Bad Times and Good Times

Working- and middle-class Americans, especially those of color, struggled to maintain economic equilibrium during the Reagan years. The growing national debt generated fresh economic pain. The federal government borrowed money to finance the debt, raising interest rates to heighten the appeal of government bonds. Foreign money poured into the United States, raising the value of the dollar and attracting an influx of goods from overseas. The imbalance between American imports and exports grew from $36 billion in 1980 to $170 billion in 1987. ((Chafe, 487. )) Foreign competition battered the already anemic manufacturing sector. The appeal of government bonds likewise drew investment away from American industry.

Continuing an ongoing trend, many steel and automobile factories in the industrial Northeast and Midwest closed or moved overseas during the 1980s. Bruce Springsteen, the bard of blue-collar America, offered eulogies to Rust Belt cities in songs like “Youngstown” and “My Hometown,” in which the narrator laments that his “foreman says these jobs are going boys/and they ain’t coming back.” ((Bruce Springsteen, “My Hometown,” Born in the U.S.A. (Columbia Records: New York, 1984).)) Competition from Japanese carmakers spurred a “Buy American” campaign but also caused flare-ups of violent xenophobia. In 1982, two autoworkers in suburban Detroit fatally bludgeoned Vincent Chin, a Chinese American draftsman they reportedly mistook for Japanese. Chin’s murder was extreme, but economic strain caused social tension throughout the industrial heartland. Meanwhile, a “farm crisis” gripped the rural United States. Expanded world production meant new competition for American farmers, while soaring interest rates caused the already sizable debt held by family farms to mushroom. Farm foreclosures skyrocketed during Reagan’s tenure. In September 1985 prominent musicians including Neil Young and Willie Nelson organized “Farm Aid,” a benefit concert at the University of Illinois’s football stadium designed to raise money for struggling farmers.

At the other end of the economic spectrum, wealthy Americans thrived thanks to the policies of the New Right. The financial industry found new ways to earn staggering profits during the Reagan years. Wall Street brokers like “junk bond king” Michael Milken reaped fortunes selling high-risk, high-yield securities. Reckless speculation helped drive the stock market steadily upward until the crash of October 19, 1987. On “Black Monday,” the market plunged 508 points, erasing 22.6% of its value. Investors lost more than $500 billion. ((Chafe, 489.)) An additional financial crisis loomed in the savings and loan industry, and Reagan’s deregulatory policies bore significant responsibility. In 1982 Reagan signed a bill increasing the amount of federal insurance available to savings and loan depositors, making those financial institutions more popular with consumers. The bill also allowed savings and loans (“S&Ls”) to engage in high-risk loans and investments for the first time. Many such deals failed catastrophically, while some S&L managers brazenly stole from their institutions. In the late 1980s, S&Ls failed with regularity, and ordinary Americans lost precious savings. The 1982 law left the government responsible for bailing out S&Ls at an eventual cost of $132 billion. ((Patterson, Restless Giant, 175.))

 

IX. Culture Wars of the 1980s

Popular culture of the 1980s offered another venue in which conservatives and liberals waged a battle of ideas. Reagan’s militarism and patriotism pervaded movies like Top Gun and the Rambo series, starring Sylvester Stallone as a Vietnam War veteran haunted by his country’s failure to pursue victory in Southeast Asia. In contrast, director Oliver Stone offered searing condemnations of the war in Platoon and Born on the Fourth of July. Television shows like Dynasty and Dallas celebrated wealth and glamour, reflecting the pride in conspicuous consumption that emanated from the White House and corporate boardrooms during the decade. At the same time, films like Wall Street and novels like Bret Easton Ellis’s Less Than Zero skewered the excesses of the rich. Yet the most significant aspect of 1980s popular culture was its lack of politics altogether. Rather, Steven Spielberg’s E.T.: The Extra-Terrestrial and his Indiana Jones adventure trilogy topped the box office. Cinematic escapism replaced the serious social examinations of 1970s film. Quintessential Hollywood leftist Jane Fonda appeared frequently on television but only to peddle exercise videos.

New forms of media changed the ways in which people experienced popular culture. In many cases, this new media contributed to the privatization of life, as people shifted focus from public spaces to their own homes. Movie theaters faced competition from the videocassette recorder (VCR), which allowed people to watch films (or exercise with Jane Fonda) in the privacy of their living rooms. Arcades gave way to home video game systems. Personal computers proliferated, a trend spearheaded by Apple Computer and its Apple II. Television viewership—once dominated by the “big three” networks of NBC, ABC, and CBS—fragmented with the rise of cable channels that catered to particular tastes. Few cable channels so captured the popular imagination as MTV, which debuted in 1981. Telegenic artists like Madonna, Prince, and Michael Jackson skillfully used MTV to boost their reputations and album sales. Conservatives condemned music videos for corrupting young people with vulgar, anti-authoritarian messages, but the medium only grew in stature. Critics of MTV targeted Madonna in particular. Her 1989 video “Like a Prayer” drew protests for what some people viewed as sexually suggestive and blasphemous scenes. The religious right increasingly perceived popular culture as hostile to Christian values.

The Apple II computer, introduced in 1977, was the first successful mass-produced microcomputer meant for home use. Rather clunky-looking to our twenty-first-century eyes, this 1984 version of the Apple II was the smallest and sleekest model yet introduced. Indeed, it revolutionized both the substance and design of personal computers. Photograph of the Apple IIc. Wikimedia, http://commons.wikimedia.org/wiki/File:Apple_iicb.jpg.


Cultural battles were even more heated in the realm of gender and sexual politics. American women pushed farther into male-dominated spheres during the 1980s. By 1984, women in the workforce outnumbered those who worked at home. ((Ruth Rosen, The World Split Open: How the Modern Women’s Movement Changed America (New York: Penguin Books, 2000), 337.)) That same year, New York Representative Geraldine Ferraro became the first woman to run on a major party’s presidential ticket when Democratic candidate Walter Mondale named her his running mate. Yet the triumph of the right placed fundamental questions about women’s rights near the center of American politics–particularly in regard to abortion. The issue increasingly divided Americans. Pro-life Democrats and pro-choice Republicans grew rare, as the National Abortion Rights Action League enforced pro-choice orthodoxy on the left and the National Right to Life Committee did the same with pro-life orthodoxy on the right. Religious conservatives took advantage of the Republican takeover of the White House and Senate in 1980 to push for new restrictions on abortion—with limited success. Senators Jesse Helms of North Carolina and Orrin Hatch of Utah introduced versions of a “Human Life Amendment” to the U.S. Constitution that defined life as beginning at conception; their efforts failed, though in 1982 Hatch’s amendment came within 18 votes of passage in the Senate. ((Self, 376-377.)) Reagan, more interested in economic issues than social ones, provided only lukewarm support for these efforts. He further outraged anti-abortion activists by appointing Sandra Day O’Connor, a supporter of abortion rights, to the Supreme Court. Despite these setbacks, anti-abortion forces succeeded in defunding some abortion providers. The 1976 Hyde Amendment prohibited the use of federal funds to pay for abortions; by 1990 almost every state had its own version of the Hyde Amendment. Yet some anti-abortion activists demanded more.
In 1988 evangelical activist Randall Terry founded Operation Rescue, an organization that targeted abortion clinics and pro-choice politicians with confrontational—and sometimes violent—tactics. Operation Rescue demonstrated that the fight over abortion would grow only more heated in the 1990s.

The emergence of a deadly new illness, Acquired Immunodeficiency Syndrome (AIDS), simultaneously devastated, stigmatized, and energized the nation’s homosexual community. When AIDS appeared in the early 1980s, most of its victims were gay men. For a time the disease was known as GRID—gay-related immune deficiency. The epidemic rekindled older pseudo-scientific ideas about the inherently diseased nature of homosexual bodies. The Reagan administration met the issue with indifference, leading liberal Congressman Henry Waxman to rage that “if the same disease had appeared among Americans of Norwegian descent…rather than among gay males, the response of both the government and the medical community would be different.” ((Self, 387-388.)) Some religious figures seemed to relish the opportunity to condemn homosexual activity; Catholic columnist Patrick Buchanan remarked that “the sexual revolution has begun to devour its children.” ((Self, 384.))

Homosexuals were left to forge their own response to the crisis. Some turned to confrontation—like New York playwright Larry Kramer. Kramer co-founded the Gay Men’s Health Crisis, which demanded a more proactive response to the epidemic. Others sought to humanize AIDS victims; this was the goal of the AIDS Memorial Quilt, a commemorative project begun in 1985. By the middle of the decade the federal government began to address the issue haltingly. Surgeon General C. Everett Koop, an evangelical Christian, called for more federal funding for AIDS-related research, much to the dismay of critics on the religious right. By 1987 government spending on AIDS-related research reached $500 million—still only 25% of what experts advocated. ((Self, 389.)) In 1987 Reagan convened a presidential commission on AIDS; the commission’s report called for anti-discrimination laws to protect AIDS victims and for more federal spending on AIDS research. The shift encouraged activists. Nevertheless, on issues of abortion and gay rights—as with the push for racial equality—activists spent the 1980s preserving the status quo rather than building on previous gains. This amounted to a significant victory for the New Right.

The AIDS epidemic hit the gay and African American communities particularly hard in the 1980s, prompting awareness campaigns by celebrities like Patti LaBelle. Poster, c. 1980s. Wikimedia, http://commons.wikimedia.org/wiki/File:%22Don%27t_listen_to_rumors_about_AIDS,_get_the_facts!%22_Patti_LaBelle.A025218.jpg.


 

X. The New Right Abroad

If the conservative movement recovered lost ground on the field of gender and sexual politics, it captured the battlefield of American foreign policy in the 1980s–for a time, at least. Ronald Reagan entered office a committed Cold Warrior. He held the Soviet Union in contempt, denouncing it in a 1983 speech as an “evil empire.” ((Wilentz, 163.)) And he never doubted that the Soviet Union would end up “on the ash heap of history,” as he said in a 1982 speech to the British Parliament. ((Lou Cannon, “President Calls for ‘Crusade’: Reagan Proposes Plan to Counter Soviet Challenge,” Washington Post, June 9, 1982, A1.)) Indeed, Reagan believed it was the duty of the United States to speed the Soviet Union to its inevitable demise. His “Reagan Doctrine” declared that the United States would supply aid to anti-communist forces everywhere in the world. ((Conservative newspaper columnist Charles Krauthammer coined the phrase. See Wilentz, 157.)) To give this doctrine force, Reagan oversaw an enormous expansion of the defense budget. Federal spending on defense rose from $171 billion in 1981 to $229 billion in 1985, the highest level since the Vietnam War. ((Patterson, Restless Giant, 205.)) He described this as a policy of “peace through strength,” a phrase that appealed to Americans who, during the 1970s, feared that the United States was losing its status as the world’s most powerful nation. Yet the irony is that Reagan, for all his militarism, helped bring the Cold War to an end. He achieved it not through nuclear weapons but through negotiation, a tactic he had once scorned.

Reagan’s election came at a time when many Americans feared their country was in an irreversible decline. American forces withdrew in disarray from South Vietnam in 1975. In 1978, despite protests from conservatives, the Senate ratified treaties that promised to return control of the Panama Canal to Panama. Pro-American dictators were toppled in Iran and Nicaragua in 1979. The Soviet Union invaded Afghanistan that same year, leading conservatives to warn about American weakness in the face of Soviet expansion. Such warnings were commonplace in the 1970s. “Team B,” a group of intellectuals commissioned by the CIA to examine Soviet capabilities, released a report in 1976 stating that “all evidence points to an undeviating Soviet commitment to…global Soviet hegemony.” (( “Intelligence Community Experiment in Competitive Analysis: Soviet Strategic Objectives: An Alternative View,” Report of Team B, National Security Archive, George Washington University, http://www2.gwu.edu/~nsarchiv/NSAEBB/NSAEBB139/nitze10.pdf, accessed June 30, 2015.)) The Committee on the Present Danger, an organization of conservative foreign policy experts, issued similar statements. When Reagan warned, as he did in 1976, that “this nation has become Number Two in a world where it is dangerous—if not fatal—to be second best,” he was speaking to these fears of decline. ((Laura Kalman, Right Star Rising: A New Politics, 1974-1980 (New York: W.W. Norton, 2010), 166-167.))

Margaret Thatcher and Ronald Reagan, leaders of two of the world’s most powerful countries, formed an alliance that benefited both throughout their tenures in office. Photograph of Margaret Thatcher with Ronald Reagan at Camp David, December 22, 1984. Wikimedia, http://commons.wikimedia.org/wiki/File:Thatcher_Reagan_Camp_David_sofa_1984.jpg.


The Reagan administration made Latin America a showcase for its newly assertive policies. Jimmy Carter had sought to promote human rights in the region, but Reagan and his advisers scrapped this approach and instead focused on fighting communism—a term they applied to all Latin American left-wing movements. Reagan justified American intervention by pointing out Latin America’s proximity to the United States: “San Salvador [in El Salvador] is closer to Houston, Texas, than Houston is to Washington, DC,” he said in one speech, adding, “Central America is America.” And so when hardline communists with ties to Cuba seized power in the Caribbean nation of Grenada in October 1983, Reagan dispatched the United States Marines to the island. Dubbed “Operation Urgent Fury,” the Grenada invasion overthrew the leftist government after less than a week of fighting. Despite the relatively minor nature of the mission, its success gave victory-hungry Americans something to cheer about after the military debacles of the previous two decades.

Operation Urgent Fury, which the U.S. invasion of Grenada came to be called, was broadly supported by the U.S. public, even though it was a violation of international law. This support was in large part due to incorrect intelligence disseminated by the U.S. government. This photograph shows the deployment of U.S. Army Rangers into Grenada. Photograph, October 25, 1983. Wikimedia, http://commons.wikimedia.org/wiki/File:US_Army_Rangers_parachute_into_Grenada_during_Operation_Urgent_Fury.jpg.


Grenada was the only time Reagan deployed the American military in Latin America, but the United States also influenced the region by supporting rightwing, anti-communist movements there. From 1981 to 1990, the United States gave more than $4 billion to the government of El Salvador in a largely futile effort to defeat the guerillas of the Farabundo Martí National Liberation Front (FMLN). ((Ronald Reagan, “Address to the Nation on United States Policy in Central America,” May 9, 1984. http://www.reagan.utexas.edu/archives/speeches/1984/50984h.htm. Accessed June 30, 2015. )) Salvadoran security forces equipped with American weapons committed numerous atrocities, including the slaughter of almost 1,000 civilians at the village of El Mozote in December 1981. The United States also supported the contras, a right-wing insurgency fighting the leftist Sandinista government in Nicaragua. Reagan, overlooking the contras’ brutal tactics, hailed them as the “moral equivalent of the Founding Fathers.” ((Ronald Reagan, “Remarks at the Annual Dinner of the Conservative Political Action Conference,” March 1, 1985. http://www.presidency.ucsb.edu/ws/?pid=38274. Accessed June 30, 2015.))

The Reagan administration took a more cautious approach in the Middle East, where its policy was determined by a mix of anti-communism and hostility to the Islamic government of Iran. When Iraq invaded Iran in 1980, the United States supplied Iraqi dictator Saddam Hussein with military intelligence and business credits—even after it became clear that Iraqi forces were using chemical weapons. Reagan’s greatest setback in the Middle East began in 1982, when, shortly after Israel invaded Lebanon, he dispatched Marines to the Lebanese city of Beirut to serve as a peacekeeping force. On October 23, 1983, a suicide bomber killed 241 American service members, most of them Marines, stationed in Beirut. Congressional pressure and anger from the American public forced Reagan to recall the Marines from Lebanon in March 1984. Reagan’s decision demonstrated that, for all his talk of restoring American power, he took a pragmatic approach to foreign policy. He was unwilling to risk another Vietnam by committing American troops to Lebanon.

Though Reagan’s policies toward Central America and the Middle East aroused protest, it was his policy on nuclear weapons that generated the most controversy. Initially Reagan followed the examples of presidents Nixon, Ford, and Carter by pursuing arms limitation talks with the Soviet Union. American officials participated in the Intermediate-range Nuclear Force Talks (INF) that began in 1981 and Strategic Arms Reduction Talks (START) in 1982. But the breakdown of these talks in 1983 led Reagan to proceed with plans to place Pershing II nuclear missiles in Western Europe to counter Soviet SS-20 missiles in Eastern Europe. Reagan went a step further in March 1983, when he announced plans for a “Strategic Defense Initiative,” a space-based system that could shoot down incoming Soviet missiles. Critics derided the program as a “Star Wars” fantasy, and even Reagan’s advisors harbored doubts. “We don’t have the technology to do this,” Secretary of State George Shultz told aides. ((Frances Fitzgerald, Way Out There in the Blue: Reagan, Star Wars, and the End of the Cold War (New York: Simon & Schuster, 2000), 205. )) These aggressive policies fed a growing “nuclear freeze” movement throughout the world. In the United States, organizations like the Committee for a Sane Nuclear Policy organized protests that culminated in a June 1982 rally that drew almost a million people to New York City’s Central Park.

President Reagan proposed space- and ground-based systems to protect the United States from nuclear missiles in his 1983 Strategic Defense Initiative (SDI). Scientists argued it was unrealistic or impossible with contemporary technology, and it was lambasted in the media as “Star Wars.” Indeed, as this artist’s representation of SDI shows, it was rather ridiculous. Created October 18, 1984. Wikimedia, http://commons.wikimedia.org/wiki/File:Space_Laser_Satellite_Defense_System_Concept.jpg.


Protests in the streets were echoed by resistance in Congress. Congressional Democrats opposed Reagan’s policies on the merits; congressional Republicans, though they supported Reagan’s anti-communism, were wary of the administration’s fondness for circumventing Congress. In 1982 the House voted 411-0 to approve the Boland Amendment, which barred the United States from supplying funds to overthrow Nicaragua’s Sandinista government. A second Boland Amendment in 1984 prohibited any funding for the anti-Sandinista contra movement. The Reagan administration’s determination to flout these amendments led to a scandal that almost destroyed Reagan’s presidency. Robert McFarlane, the president’s National Security Advisor, and Oliver North, a member of the National Security Council, raised money to support the contras by selling American missiles to Iran and funneling the money to Nicaragua. When their scheme was revealed in 1986, it was hugely embarrassing for Reagan. The president’s underlings had not only violated the Boland Amendments but had also, by selling arms to Iran, made a mockery of Reagan’s declaration that “America will never make concessions to the terrorists.” But while the Iran-Contra affair generated comparisons to the Watergate scandal, investigators were never able to prove that Reagan knew about the operation. Without such a “smoking gun,” talk of impeaching Reagan remained simply talk.

Though the Iran-Contra scandal tarnished the Reagan administration’s image, it did not derail Reagan’s most significant achievement: easing tensions with the Soviet Union. This would have seemed impossible in Reagan’s first term, when the president exchanged harsh words with a rapid succession of Soviet leaders—Leonid Brezhnev, Yuri Andropov, and Konstantin Chernenko. In 1985, however, the aged Chernenko’s unsurprising death handed leadership of the Soviet Union to Mikhail Gorbachev. Gorbachev, a true believer in socialism, nonetheless realized that the Soviet Union desperately needed to reform its stolid Stalinism. He instituted a program of perestroika, which referred to the restructuring of the Soviet system, and of glasnost, which meant greater transparency in government. Gorbachev also reached out to Reagan in hopes of negotiating an end to the arms race that was bankrupting the Soviet Union. Reagan and Gorbachev met in Geneva, Switzerland in 1985 and Reykjavik, Iceland in 1986. The summits failed to produce any concrete agreements—largely thanks to Reagan’s refusal to limit the Strategic Defense Initiative. But the two leaders developed a relationship unprecedented in the history of US-Soviet relations. This trust made possible the Intermediate-Range Nuclear Forces Treaty of 1987, which committed both sides to a sharp reduction in their nuclear arsenals.

By the late 1980s the Soviet empire was crumbling. Some credit must go to Reagan, who successfully combined anti-communist rhetoric (such as his 1987 speech at the Berlin Wall, where he declared, “General Secretary Gorbachev, if you seek peace…tear down this wall!”) with a willingness to negotiate with Soviet leadership. ((Lou Cannon, “Reagan Challenges Soviets To Dismantle Berlin Wall: Aides Disappointed at Crowd’s Lukewarm Reception,” Washington Post, June 13, 1987, A1. )) But the most significant causes of collapse lay within the Soviet empire itself. Soviet-allied governments in Eastern Europe tottered under pressure from dissident organizations like Poland’s Solidarity and East Germany’s Neues Forum. Some of these countries, such as Poland, were also pressured from within by the Roman Catholic Church, which had turned toward active anti-communism under Pope John Paul II. When Gorbachev made it clear that he would not send the Soviet military to prop up these regimes, they collapsed one by one in 1989—in Poland, Hungary, Czechoslovakia, Romania, Bulgaria, and East Germany. Within the Soviet Union, Gorbachev’s proposed reforms unraveled the decaying Soviet system rather than bringing stability. By 1991 the Soviet Union itself had vanished, dissolving into a “Commonwealth of Independent States.”

 

XI. Conclusion

Reagan left office in January 1989 with the Cold War waning and the economy booming. Unemployment had dipped to 5% by 1988. ((Patterson, Restless Giant, 163.)) Between 1981 and 1986, gas prices fell from $1.38 per gallon to 95¢. ((Patterson, Restless Giant, 163. )) The stock market recovered from the crash, and the Dow Jones Industrial Average—which stood at 950 in 1981—reached 2,239 by the end of Reagan’s second term. ((Patterson, Restless Giant. )) Yet the economic gains of the decade were unequally distributed. The top fifth of households enjoyed rising incomes while the rest stagnated or declined. ((Jacobs and Zelizer, 32. )) In constant dollars, annual CEO pay rose from $3 million in 1980 to roughly $12 million during Reagan’s last year in the White House. ((Patterson, Restless Giant, 186. )) Between 1985 and 1989 the number of Americans living in poverty remained steady at 33 million. ((Patterson, Restless Giant, 164. )) Real per capita money income grew at only 2% per year, a rate roughly equal to that of the Carter years. ((Patterson, Restless Giant, 166. )) The American economy saw more jobs created than lost during the 1980s, but half of the jobs eliminated were in high-paying industries. ((Chafe, 488. )) Furthermore, half of the new jobs failed to pay wages above the poverty line. The economic divide was most acute for African Americans and Latinos, one-third of whom qualified as poor. Trickle-down economics, it seemed, rarely trickled down.

The triumph of the right proved incomplete. The number of government employees actually increased under Reagan. With more than 80% of the federal budget committed to defense, entitlement programs, and interest on the national debt, the right’s goal of deficit elimination foundered for lack of substantial areas to cut. ((Jacobs and Zelizer, 31. )) Between 1980 and 1989 the national debt rose from $914 billion to $2.7 trillion. ((Patterson, Restless Giant, 158.)) Despite steep tax cuts for corporations and the wealthy, the overall tax burden of the American public remained basically unchanged. Moreover, regressive taxes on payroll and certain goods actually increased the tax burden on low- and middle-income Americans. Finally, Reagan slowed but failed to vanquish the five-decade legacy of economic liberalism. Most New Deal and Great Society programs proved durable. Government still offered its neediest citizens a safety net, if a continually shrinking one.

Yet the discourse of American politics had irrevocably changed. The preeminence of conservative political ideas grew ever more pronounced, even when Democrats controlled Congress or the White House. Indeed, the Democratic Party adapted its own message in response to the conservative mood of the country, accepting many of the Republicans’ Reagan-era ideas and innovations. The United States was on a rightward path.

 

Contributors

This chapter was edited by Richard Anderson and William J. Schultz, with content contributions by Richard Anderson, Laila Ballout, Marsha Barrett, Seth Bartee, Eladio Bobadilla, Kyle Burke, Andrew Chadwick, Jennifer Donnally, Leif Fredrickson, Kori Graves, Karissa A. Haugeberg, Jonathan Hunt, Stephen Koeth, Colin Reynolds, William J. Schultz, and Daniel Spillman.

 

Recommended Reading

  1. Carter, Dan T. The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics. Baton Rouge: Louisiana State University Press, 1995.
  2. Crespino, Joseph. In Search of Another Country: Mississippi and the Conservative Counterrevolution. Princeton: Princeton University Press, 2007.
  3. Critchlow, Donald. The Conservative Ascendancy: How the GOP Right Made Political History. Cambridge, MA: Harvard University Press, 2007.
  4. Dallek, Matthew. The Right Moment: Ronald Reagan’s First Victory and the Decisive Turning Point in American Politics. New York: Free Press, 2000.
  5. Hunter, James D. Culture Wars: The Struggle to Define America. New York: Basic Books, 1992.
  6. Kruse, Kevin M. White Flight: Atlanta and the Making of Modern Conservatism. Princeton: Princeton University Press, 2005.
  7. Lassiter, Matthew D. The Silent Majority: Suburban Politics in the Sunbelt South. Princeton: Princeton University Press, 2006.
  8. Patterson, James T. Restless Giant: The United States from Watergate to Bush v. Gore. New York: Oxford University Press, 2005.
  9. Rodgers, Daniel T. Age of Fracture. Cambridge: The Belknap Press of Harvard University Press, 2011.
  10. Schoenwald, Jonathan. A Time for Choosing: The Rise of Modern American Conservatism. New York: Oxford University Press, 2001.
  11. Self, Robert O. All in the Family: The Realignment of American Democracy Since the 1960s. New York: Hill and Wang, 2012.
  12. Troy, Gil. Morning in America: How Ronald Reagan Invented the 1980s. Princeton: Princeton University Press, 2005.
  13. Westad, Odd Arne. The Global Cold War: Third World Interventions and the Making of Our Times. New York: Cambridge University Press, 2005.
  14. Wilentz, Sean. The Age of Reagan: A History, 1974–2008. New York: Harper, 2008.
  15. Williams, Daniel K. God’s Own Party: The Making of the Christian Right. New York: Oxford University Press, 2007.

 

Notes

28. The Unraveling

Abandoned Packard Automotive Plant in Detroit, Michigan. Via Wikimedia.


*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

On December 6, 1969, an estimated 300,000 people converged on the Altamont Motor Speedway in Northern California for a massive free concert headlined by the Rolling Stones and featuring some of the era’s other great rock acts. ((Acts included Santana, Jefferson Airplane, Crosby, Stills, Nash & Young, and the Flying Burrito Brothers. The Grateful Dead were scheduled but refused to play.)) Only four months earlier, Woodstock had shown the world the power of peace and love and American youth. Altamont was supposed to be “Woodstock West.” ((Bruce J. Schulman, The Seventies: The Great Shift in American Culture, Society, and Politics (Cambridge, Mass.: Da Capo Books, 2002), 18))

But Altamont was a disorganized disaster. Inadequate sanitation, a horrid sound system, and tainted drugs strained concertgoers. To save money, the Hell’s Angels biker gang was paid $500 in beer to serve as the show’s “security team.” The crowd grew progressively angrier throughout the day. Fights broke out. Tensions rose. The Angels, drunk and high, armed themselves with sawed-off pool cues and indiscriminately beat concertgoers who tried to come on the stage. The Grateful Dead refused to play. Finally, the Stones came on stage. ((Allen J. Matusow, The Unraveling of America: A History of Liberalism in the 1960s, updated ed., (Athens, Ga.: University of Georgia Press, 2009), 304-05.))

The crowd’s anger was palpable. Fights continued near the stage. Mick Jagger stopped in the middle of playing “Sympathy for the Devil” to try to calm the crowd: “Everybody be cool now, c’mon,” he pleaded. Then, a few songs later, in the middle of “Under My Thumb,” eighteen-year-old Meredith Hunter approached the stage and was beaten back. Angry and high on methamphetamine, Hunter brandished a pistol, charged again, and was stabbed and killed by an Angel. His lifeless body was stomped into the ground. The Stones just kept playing. ((Owen Gleiberman, “Altamont at 45: The Most Dangerous Rock Concert,” BBC, December 5, 2014, http://www.bbc.com/culture/story/20141205-did-altamont-end-the-60s.))

If the more famous Woodstock music festival captured the idyll of the sixties youth culture, Altamont revealed its dark side. There, drugs and music and youth were associated not with peace and love but with anger, violence, and death. Likewise, while many Americans in the 1970s continued to celebrate the political and cultural achievements of the previous decade, a more anxious, conservative mood grew across the nation. For some, the United States had not gone nearly far enough to promote greater social equality; for others, the nation had gone too far, unfairly trampling the rights of one group to promote the selfish needs of another. Onto these brewing dissatisfactions the 1970s dumped the divisive remnants of a failed war, the country’s greatest political scandal, and an intractable economic crisis. It seemed as if the nation was ready to unravel.

 

II. The Strain of Vietnam

Frank Wolfe, Vietnam War protestors at the March on the Pentagon, Lyndon B. Johnson Library via Wikimedia, http://commons.wikimedia.org/wiki/File:Vietnam_War_protestors_at_the_March_on_the_Pentagon.jpg.


Perhaps no single issue contributed more to public disillusionment than the Vietnam War. As the war deteriorated, the Johnson administration escalated American involvement by deploying hundreds of thousands of troops to prevent the communist takeover of the South. Stalemate, body counts, hazy war aims, and the draft catalyzed an antiwar movement and triggered protests throughout the U.S. and Europe. With no end in sight, protesters burned their draft cards, refused to pay their income taxes, occupied government buildings, and delayed trains loaded with war materials. By 1967, antiwar demonstrations were drawing hundreds of thousands. In one protest, hundreds were arrested after surrounding the Pentagon. ((Jeff Leen, “The Vietnam Protests: When Worlds Collided,” Washington Post, September 27, 1999, http://www.washingtonpost.com/wp-srv/local/2000/vietnam092799.htm.))

Vietnam was the first “living room war.” ((Michael J. Arlen, Living-Room War (New York: Viking Press, 1969).)) Television, print media, and open access to the battlefield provided unprecedented coverage of the conflict’s brutality. Americans confronted grisly images of casualties and atrocities. In 1965, CBS Evening News aired a segment in which United States Marines burned the South Vietnamese village of Cam Ne with little apparent regard for the lives of its occupants, who had been accused of aiding Viet Cong guerrillas. President Johnson berated the head of CBS, yelling over the phone, “[Y]our boys just shat on the American flag.” ((Tom Engelhardt, The End of Victory Culture: Cold War America and the Disillusioning of a Generation, revised edition (Amherst: University of Massachusetts Press, 2007), 190.))

While the U.S. government imposed no formal censorship on the press during Vietnam, the White House and military nevertheless used press briefings and interviews to paint a deceptive image of the war. The U.S. was winning the war, officials claimed. They cited numbers of enemies killed, villages secured, and South Vietnamese troops trained. However, American journalists in Vietnam quickly realized the hollowness of such claims (the press referred to the afternoon press briefings in Saigon as “the Five O’Clock Follies”). ((Mitchel P. Roth, Historical Dictionary of War Journalism (Westport, Conn.: Greenwood Press, 1997), 105.)) Editors frequently toned down their reporters’ pessimism, often citing conflicting information received from their own sources, who were typically government officials. But the evidence of a stalemate mounted.

Stories like CBS’s Cam Ne piece exposed a “credibility gap,” the yawning chasm between the claims of official sources and the increasingly evident reality on the ground in Vietnam. ((David L. Anderson, The Columbia Guide to the Vietnam War (New York: Columbia University Press, 2002), 109.)) Nothing did more to expose this gap than the 1968 Tet Offensive. In January, communist forces launched a coordinated attack on more than one hundred American and South Vietnamese sites throughout South Vietnam, including the American embassy in Saigon. While U.S. forces repulsed the attack and inflicted heavy casualties on the Viet Cong, Tet demonstrated that, despite the repeated claims of administration officials, the enemy could still strike at will anywhere in the country, even after years of war. Subsequent stories and images eroded public trust even further. In 1969, investigative reporter Seymour Hersh revealed that U.S. troops had massacred and raped hundreds of civilians in the village of My Lai. ((Guenter Lewy, America in Vietnam (New York: Oxford University Press, 1978), 325-26.)) Three years later, Americans cringed at Nick Ut’s wrenching photograph of a naked Vietnamese child fleeing an American napalm attack. More and more American voices came out against the war.

Reeling from the war’s growing unpopularity, on March 31, 1968, President Johnson announced on national television that he would not seek reelection. ((Lyndon B. Johnson, “Address to the Nation Announcing Steps to Limit the War in Vietnam and Reporting His Decision Not to Seek Reelection,” March 31, 1968, Lyndon Baines Johnson Library, http://www.lbjlib.utexas.edu/johnson/archives.hom/speeches.hom/680331.asp.)) Eugene McCarthy and Robert F. Kennedy unsuccessfully battled against Johnson’s vice president, Hubert Humphrey, for the Democratic Party nomination (Kennedy was assassinated in June). At the Democratic Party’s national convention in Chicago, local police brutally assaulted protestors on national television.

For many Americans, the violent clashes outside the convention hall reinforced their belief that civil society was coming unraveled. Republican challenger Richard Nixon played on these fears, running on a platform of “law and order” and a vague plan to end the war. Well aware of domestic pressure to wind down the war, Nixon sought, on the one hand, to appease antiwar sentiment by promising to phase out the draft, train South Vietnamese forces to assume more responsibility for the war effort, and gradually withdraw American troops. Nixon and his advisors called it “Vietnamization.” ((Lewy, America in Vietnam, 164-69. Henry Kissinger, Ending the Vietnam War: A History of America’s Involvement in and Extrication from the Vietnam War (New York: Simon & Schuster, 2003), 81-82.)) At the same time, Nixon appealed to the so-called “silent majority” of Americans who still supported the war (and opposed the antiwar movement) by calling for an “honorable” end to U.S. involvement—what he later called “peace with honor.” ((Richard Nixon, “Address to the Nation Announcing Conclusion of an Agreement on Ending the War and Restoring Peace in Vietnam,” January 23, 1973, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=3808.)) He narrowly edged Humphrey in the fall’s election.

“Tragedy at Kent,” May 15, 1970, Life Magazine, http://life.tumblr.com/post/50507601384/on-this-day-in-life-may-15-1970-tragedy-at.


Public assurances of American withdrawal, however, masked a dramatic escalation of conflict. Looking to incentivize peace talks, Nixon pursued a “madman strategy” of attacking communist supply lines across Laos and Cambodia, hoping to convince the North Vietnamese that he would do anything to stop the war. ((Richard Nixon, quoted in Walter Isaacson, Kissinger: A Biography (New York: Simon & Schuster, 2005), 163-64.)) Conducted without public knowledge or Congressional approval, the bombings failed to spur the peace process, and talks stalled before the American-imposed November 1969 deadline. News of the attacks renewed antiwar demonstrations. Police and National Guard troops killed six students in separate protests at Jackson State University in Mississippi, and, more famously, Kent State University in Ohio in 1970.

Another three years passed—and another 20,000 American troops died—before an agreement was reached. ((Jussi Hanhimäki, The Flawed Architect: Henry Kissinger and American Foreign Policy (New York: Oxford University Press, 2004), 257.)) After Nixon threatened to withdraw all aid and guaranteed to enforce a treaty militarily, the North and South Vietnamese governments signed the Paris Peace Accords in January of 1973, marking the official end of U.S. force commitment to the Vietnam War. Peace was tenuous, and when war resumed North Vietnamese troops quickly overwhelmed Southern forces. By 1975, despite nearly a decade of direct American military engagement, Vietnam was united under a communist government.

The Vietnam War profoundly influenced domestic politics. Moreover, it poisoned many Americans’ perceptions of their government and its role in the world. And yet, while the antiwar demonstrations attracted considerable media attention and stand as a hallmark of the sixties counterculture so popularly remembered today, many Americans nevertheless continued to regard the war as just. Wary of the rapid social changes that reshaped American society in the 1960s and worried that antiwar protests threatened an already tenuous civil order, a growing number of Americans criticized the protests and moved closer to a resurgent American conservatism that brewed throughout the 1970s.

 

III. Racial, Social, and Cultural Anxieties

Los Angeles police hustle rioter into car, August 13, 1965, Wikimedia, http://commons.wikimedia.org/wiki/File:Wattsriots-policearrest-loc.jpg.


The civil rights movement looked dramatically different at the end of the 1960s than it had at the beginning. The movement had never been monolithic, but by the decade’s end prominent, competing ideologies had fractured it. The rise of the Black Power movement challenged the integrationist dreams of many older activists, the assassinations of Malcolm X and Martin Luther King Jr. fueled disillusionment, and many alienated activists recoiled from liberal reformers.

The political evolution of the civil rights movement was reflected in American culture. The lines of race and class and gender ruptured American “mass” culture. As the monolith of popular American culture shattered—a monolith pilloried in the fifties and sixties as exclusively white, male-dominated, conservative, and stifling—Americans retreated into ever smaller, segmented subcultures. Marketers now targeted particular products to ever smaller pieces of the population, including previously neglected groups such as African Americans. ((Lizabeth Cohen, A Consumer’s Republic: The Politics of Mass Consumption in Postwar America (New York: Vintage Books, 2004).)) Subcultures often revolved around certain musical styles, whether pop, disco, hard rock, punk rock, country, or hip-hop. Styles of dress and physical appearance likewise aligned with cultures of choice.

If the popular rock acts of the sixties appealed to a new counterculture, the seventies witnessed the resurgence of cultural forms that appealed to a white working class confronting the social and political upheavals of the 1960s. Country hits such as Merle Haggard’s “Okie from Muskogee” evoked simpler times and places where people “still wave Old Glory down at the courthouse” and they “don’t let our hair grow long and shaggy like the hippies out in San Francisco.” A popular television sitcom, All in the Family, became an unexpected hit among “middle America.” The show’s main character, Archie Bunker, was designed to mock reactionary middle-aged white men, but audiences embraced him. “Isn’t anyone interested in upholding standards?” he lamented in an episode dealing with housing integration. “Our world is coming crumbling down. The coons are coming!” ((Quotes, “Lionel Moves into the Neighborhood,” All in the Family, season 1, episode 8 (1971), http://www.tvrage.com/all-in-the-family/episodes/5587.))

CBS Television, All in the Family Cast 1973, Wikimedia, http://commons.wikimedia.org/wiki/File:All_In_the_Family_cast_1973.JPG.


As Bunker knew, African Americans were becoming much more visible in American culture. While black cultural forms had been prominent throughout American history, they assumed new popular forms in the 1970s. Disco offered a new, optimistic, racially integrated pop music. Behind the scenes, African American religious styles became an outsized influence on pop music. Musicians like Aretha Franklin, Andraé Crouch, and “fifth Beatle” Billy Preston brought their background in church performance to their own recordings as well as to the work of white artists like the Rolling Stones, with whom they collaborated. And by the end of the decade African American musical artists had introduced American society to one of the most significant musical innovations in decades: the Sugarhill Gang’s 1979 record, “Rapper’s Delight.” A lengthy paean to black machismo, it became the first rap single to reach the top 40. ((Jim Dawson and Steve Propes, 45 RPM: The History, Heroes and Villains of a Pop Music Revolution (San Francisco, Ca.: Backbeat Books, 2003), 120.))

Just as rap represented a hyper-masculine black cultural form, Hollywood popularized its white equivalent. Films such as 1971’s Dirty Harry captured a darker side of the national mood. Clint Eastwood’s titular character exacted violent justice on clear villains, working within the sort of brutally simplistic ethical standard that appealed to Americans anxious about a perceived breakdown in “law and order.” (“The film’s moral position is fascist,” said critic Roger Ebert, who nevertheless gave it three out of four stars. ((Roger Ebert, “Review of Dirty Harry,” January 1, 1971, http://www.rogerebert.com/reviews/dirty-harry-1971. )))

Perhaps the strongest element fueling American anxiety over “law and order” was the increasingly visible violence that marked the waning civil rights movement. No longer confined to the anti-black terrorism that struck the southern civil rights movement in the 1950s and 1960s, publicly visible violence now broke out among blacks in urban riots and among whites protesting new civil rights programs. In the mid-1970s, for instance, protests over the use of busing to overcome residential segregation and truly integrate public schools in Boston washed the city in racial violence. Stanley Forman’s Pulitzer Prize-winning photo, “The Soiling of Old Glory,” famously captured one black teenager, Ted Landsmark, being attacked by a mob of anti-busing protesters, one of whom wielded an American flag. ((Ronald P. Formisano, Boston Against Busing: Race, Class, and Ethnicity in the 1960s and 1970s (Chapel Hill, North Carolina: University of North Carolina Press, 1991).))

Urban riots, though, rather than anti-integration violence, tainted many white Americans’ perception of the civil rights movement and urban life in general. Civil unrest broke out across the country, but the riots in Watts/Los Angeles (1965), Newark (1967), and Detroit (1967) were most shocking. In each, a physical altercation between white police officers and African Americans spiraled into days of chaos and destruction. Tens of thousands participated in urban riots. Many looted and destroyed white-owned businesses. There were dozens of deaths, tens of millions of dollars in property damage, and an exodus of white capital that only further isolated urban poverty. ((Michael W. Flamm, Law and Order: Street Crime, Civil Unrest, and the Crisis of Liberalism in the 1960s (New York: Columbia University Press, 2005), 58-59, 85-93.))

In 1967, President Johnson appointed the Kerner Commission to investigate the causes of America’s riots. Their report became an unexpected bestseller. ((Thomas J. Sugrue, Sweet Land of Liberty: The Forgotten Struggle for Civil Rights in the North (New York: Random House, 2008), 348. )) The Commission cited black frustration with the hopelessness of poverty as the underlying cause of urban unrest. As the head of the black National Business League testified, “It is to be more than naïve—indeed, it is a little short of sheer madness—for anyone to expect the very poorest of the American poor to remain docile and content in their poverty when television constantly and eternally dangles the opulence of our affluent society before their hungry eyes.” ((Lizabeth Cohen, A Consumer’s Republic: The Politics of Mass Consumption in Postwar America (New York: Vintage Books, 2004), 373. )) A Newark rioter who looted several boxes of shirts and shoes put it more simply: “They tell us about that pie in the sky but that pie in the sky is too damn high.” ((Cohen, A Consumer’s Republic, 376. )) But white conservatives blasted the conclusion that white racism and economic hopelessness were to blame for the violence. African Americans wantonly destroying private property, they said, was not a symptom of America’s intractable racial inequalities, but the logical outcome of a liberal culture of permissiveness that tolerated—even encouraged—nihilistic civil disobedience. Many white moderates and liberals, meanwhile, saw the explosive violence as a sign African Americans had rejected the nonviolence of the earlier civil rights movement.

The unrest of the late sixties did, in fact, reflect a real and growing disillusionment among African Americans with the fate of the civil rights crusade. In the still-moldering ashes of Jim Crow, African Americans in Watts and other communities across the country bore the burdens of lifetimes of legally sanctioned discrimination in housing, employment, and credit. Segregation too often survived the legal dismantling of Jim Crow. The perseverance into the present day of stark racial and economic segregation in nearly all American cities destroyed any simple distinction between southern “de jure” segregation and non-southern “de facto” segregation. The inner cities became traps that too few could escape.

Political achievements such as the 1964 Civil Rights Act and 1965 Voting Rights Act were indispensable legal preconditions for social and political equality, but, for most, the movement’s long (and now often forgotten) goal of economic justice proved as elusive as ever. “I worked to get these people the right to eat cheeseburgers,” Martin Luther King Jr. supposedly said to Bayard Rustin as they toured the devastation in Watts some years earlier, “and now I’ve got to do something… to help them get the money to buy it.” ((Martin Luther King, quoted in David J. Garrow, Bearing the Cross: Martin Luther King Jr. and the Southern Christian Leadership Conference (New York: William Morrow, 1986), 439. )) What good was the right to enter a store without money for purchases?

 

IV. The Crisis of 1968

To Americans in 1968, the country seemed to be unraveling. Martin Luther King Jr. was killed on April 4, 1968. He had been in Memphis to support striking sanitation workers. (Prophetically, he had reflected on his own mortality in a rally the night before. Confident that the civil rights movement would succeed without him, he brushed away fears of death: “I’ve been to the mountaintop,” he said, “and I’ve seen the promised land.”) The greatest leader of the American civil rights movement was lost. Riots broke out in over one hundred American cities. Two months later, on June 6, Robert F. Kennedy was killed while campaigning in California. He had represented the last best hope of liberal idealists. Anger and disillusionment washed over the country.

As the Vietnam War descended ever deeper into a brutal stalemate and the Tet Offensive exposed the lies of the Johnson administration, students shut down college campuses and government facilities. Protests enveloped the nation.

Protesters converged on the Democratic National Convention in Chicago at the end of August 1968, when a bitterly fractured Democratic Party gathered to assemble a passable platform and nominate a broadly acceptable presidential candidate. Demonstrators planned massive protests in Chicago’s public spaces. Initial protests were peaceful, but the situation quickly soured as police issued stern threats and young people began to taunt and goad officials. Many of the assembled students had protest and sit-in experience only in the relative safe havens of college campuses and were unprepared for Mayor Richard Daley’s aggressive and heavily armed police force and for National Guard troops in full riot gear. Attendees recounted vicious beatings at the hands of police and Guardsmen, but many young people—convinced that much public sympathy could be won via images of brutality against unarmed protesters—continued stoking the violence. Clashes spilled from the parks into city streets, and eventually the smell of tear gas penetrated upper floors of the opulent hotels hosting Democratic delegates. Chicago’s brutality overshadowed the convention and culminated in an internationally televised, violent standoff in front of the Hilton Hotel. “The whole world is watching,” the protesters chanted. The Chicago riots encapsulated the growing sense that chaos now governed American life.

For the idealists of the 1960s, the violence of 1968 represented the death of a dream. Disorder and chaos overshadowed hope and progress. And for conservatives, it was confirmation of all of their fears and hesitations. Americans of 1968 turned their backs on hope. They wanted peace. They wanted stability. They wanted “law and order.”

 

V. The Rise and Fall of Richard Nixon

Richard Nixon campaigns in Philadelphia during the 1968 presidential election. National Archives via Wikimedia


Beleaguered by an unpopular war, inflation, and domestic unrest, President Johnson opted against reelection in March of 1968—an unprecedented move in modern American politics. The forthcoming presidential election was shaped by Vietnam and the aforementioned unrest as much as the campaigns of Democratic nominee Vice President Hubert Humphrey, Republican Richard Nixon, and third-party challenger George Wallace, the infamous segregationist governor of Alabama. The Democratic Party was in disarray in the spring of 1968, when senators Eugene McCarthy and Robert Kennedy challenged Johnson’s nomination and the president responded with his shocking announcement. Nixon’s candidacy was aided further by riots that broke out across the country after the assassination of Martin Luther King, Jr., and the shock and dismay experienced after the slaying of Robert Kennedy in June. The Republican nominee’s campaign was defined by shrewd management of his public appearances and a pledge to restore peace and prosperity to what he called “the silent center; the millions of people in the middle of the political spectrum.” This campaign for “the Silent Majority” was carefully calibrated to attract suburban Americans by linking liberals with violence and protest and rioting. Many embraced Nixon’s message; a September 1968 poll found that 80 percent of Americans believed public order had “broken down.”

Meanwhile, Humphrey struggled to distance himself from Johnson and maintain working-class support in northern cities, where voters were drawn to Wallace’s appeals for law and order and a rejection of civil rights. The vice president had a final surge in northern cities with the aid of union support, but it was not enough to best Nixon’s campaign. The final tally was close: Nixon won 43.3 percent of the popular vote (31,783,783), narrowly besting Humphrey’s 42.7 percent (31,266,006). Wallace, meanwhile, carried five states in the Deep South, and his 13.5 percent (9,906,473) of the popular vote constituted an impressive showing for a third-party candidate. The Electoral College vote was more decisive for Nixon; he earned 302 electoral votes, while Humphrey and Wallace received only 191 and 45 votes, respectively. Although Republicans won a few seats, Democrats retained control of both the House and Senate and made Nixon the first president in 120 years to enter office with the opposition party controlling both houses.

Once installed in the White House, Richard Nixon focused his energies on American foreign policy, publicly announcing the “Nixon Doctrine” in 1969. On the one hand, Nixon asserted the supremacy of American democratic capitalism and conceded that the U.S. would continue supporting its allies financially. However, he denounced previous administrations’ willingness to commit American forces to third-world conflicts and warned other states to assume responsibility for their own defense. He was turning America away from the policy of active, anti-communist containment, and toward a new strategy of “détente.” ((Richard M. Nixon, “Address to the Nation on the War in Vietnam,” November 3, 1969, American Experience, http://www.pbs.org/wgbh/americanexperience/features/primary-resources/nixon-vietnam/.))

Promoted by national security advisor and eventual Secretary of State Henry Kissinger, détente sought to stabilize the international system by thawing relations with Cold War rivals and bilaterally freezing arms levels. Taking advantage of tensions between the People’s Republic of China (PRC) and the Soviet Union, Nixon pursued closer relations with both in order to de-escalate tensions and strengthen the United States’ position relative to each. The strategy seemed to work. Nixon became the first American president to visit communist China (1972) and the first since Franklin Roosevelt to visit the Soviet Union (1972). Direct diplomacy and cultural exchange programs with both countries grew and culminated with the formal normalization of U.S.-Chinese relations and the signing of two U.S.-Soviet arms agreements: the anti-ballistic missile (ABM) treaty and the Strategic Arms Limitation Treaty (SALT I). By 1973, after almost thirty years of Cold War tension, peaceful coexistence suddenly seemed possible.

Soon, though, a fragile calm gave way once again to Cold War instability. In November 1973, Nixon appeared on television to inform Americans that energy had become “a serious national problem” and that the United States was “heading toward the most acute shortages of energy since World War II.” ((Richard Nixon, “Address to the Nation about Policies to Deal with Energy Shortages,” November 7, 1973, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=4034. )) The previous month, Arab members of the Organization of Petroleum Exporting Countries (OPEC), a cartel of the world’s leading oil producers, had embargoed oil exports to the United States in retaliation for American intervention in the Middle East. The embargo launched the first U.S. energy crisis. By the end of 1973, the global price of oil had quadrupled. ((Office of the Historian, “Oil Embargo, 1973-1974,” U.S. Department of State, https://history.state.gov/milestones/1969-1976/oil-embargo.)) Drivers waited in line for hours to fill up their cars. Individual gas stations ran out of gas. American motorists worried that oil could run out at any moment. A Pennsylvania man died when his emergency stash of gasoline ignited in his trunk and backseat. (( “Gas Explodes in Man’s Car,” Uniontown Morning Herald, December 5, 1973, p. 12.)) OPEC rescinded its embargo in 1974, but the economic damage had been done. The crisis extended into the late 1970s.

Like the Vietnam War, the oil crisis showed that small countries could still hurt the United States. At a time of anxiety about the nation’s future, Vietnam and the energy crisis accelerated Americans’ disenchantment with the United States’ role in the world and the efficacy and quality of its leaders. Furthermore, government scandals in the 1970s and early 80s sapped trust in America’s public institutions. In 1971, the Nixon administration tried unsuccessfully to sue the New York Times and the Washington Post to prevent the publication of the Pentagon Papers, a confidential and damning history of U.S. involvement in Vietnam commissioned by the Defense Department and later leaked. The Papers showed how presidents from Truman to Johnson repeatedly deceived the public on the war’s scope and direction. ((Larry H. Addington, America’s War in Vietnam: A Short Narrative History (Bloomington: Indiana University Press, 2000), 140-41. )) Nixon faced a rising tide of congressional opposition to the war, and Congress asserted unprecedented oversight of American war spending. In 1973, it passed the War Powers Resolution, which dramatically reduced the president’s ability to wage war without congressional consent.

However, no scandal did more to unravel public trust than Watergate. On June 17, 1972, five men were arrested inside the offices of the Democratic National Committee (DNC) in the Watergate Complex in downtown Washington, D.C. After being tipped off by a security guard, police found the men attempting to install sophisticated bugging equipment. One of those arrested was a former CIA employee then working as a security aide for the Nixon administration’s Committee to Reelect the President (lampooned as “CREEP”).

While there is no direct evidence that Nixon ordered the Watergate break-in, he had been recorded in conversation with his Chief of Staff requesting that the DNC chairman be illegally wiretapped to obtain the names of the committee’s financial supporters. The names could then be given to the Justice Department and the IRS to conduct spurious investigations into their personal affairs. Nixon was also recorded ordering his Chief of Staff to break into the offices of the Brookings Institution and take files relating to the war in Vietnam, saying, “Goddamnit, get in and get those files. Blow the safe and get it.” ((Bruce J. Schulman, The Seventies: The Great Shift in American Culture, Society, and Politics (Cambridge, Mass.: Da Capo Books, 2002), 44.))

Whether or not the president ordered the Watergate break-in, the White House launched a massive cover-up. Administration officials ordered the CIA to halt the FBI investigation and paid hush money to the burglars and White House aides. Nixon distanced himself from the incident publicly and went on to win a landslide election victory in November 1972. But, thanks largely to two persistent journalists at the Washington Post, Bob Woodward and Carl Bernstein, information continued to surface that tied the burglaries ever closer to the CIA, the FBI, and the White House. The Senate held televised hearings. Citing “executive privilege,” Nixon refused to comply with orders to produce tapes from the White House’s secret recording system. In July 1974, the House Judiciary Committee approved articles of impeachment. Nixon resigned before the full House could vote on impeachment. He became the first and only American president to resign from office. ((“Executive Privilege,” in John J. Patrick, Richard M. Pious, and Donald A. Ritchie, The Oxford Guide to the United States Government (New York: Oxford University Press, 2001), p. 227. Schulman, The Seventies, 44-48. ))

Vice President Gerald Ford was sworn in as his successor and a month later granted Nixon a full presidential pardon. Nixon disappeared from public life without ever publicly apologizing, accepting responsibility, or facing charges.

 

VI. Deindustrialization and the Rise of the Sunbelt

Abandoned Youngstown factory, via Flickr user stu_spivack.


American workers had made substantial material gains throughout the 1940s and 1950s. During the so-called “Great Compression,” Americans of all classes benefited from postwar prosperity. Segregation and discrimination perpetuated racial and gender inequalities, but unemployment continually fell and a highly progressive tax system and powerful unions lowered general income inequality as working-class standards of living nearly doubled between 1947 and 1973.

But general prosperity masked deeper vulnerabilities. Perhaps no case better illustrates the decline of American industry and the creation of an intractable “urban crisis” than Detroit. As the automobile industry expanded and especially as the United States transitioned to a wartime economy during World War II, Detroit boomed. When auto manufacturers like Ford and General Motors converted their assembly lines to build machines for the American war effort, observers dubbed the city the “arsenal of democracy.”

After the war, however, automobile firms began closing urban factories and moving to outlying suburbs. Several factors fueled the process. Some cities partly deindustrialized themselves. Municipal governments in San Francisco, St. Louis, and Philadelphia banished light industry to make room for high-rise apartments and office buildings. Mechanization also contributed to the decline of American labor. A manager at a newly automated Ford engine plant in postwar Cleveland captured the interconnections between these concerns when he glibly noted to United Automobile Workers (UAW) president Walter Reuther, “you are going to have trouble collecting union dues from all of these machines.” ((Sugrue, Origins of the Urban Crisis, 132.)) More importantly, however, manufacturing firms sought to lower labor costs by automating, downsizing, and relocating to areas with “business friendly” policies like low tax rates, anti-union “right-to-work” laws, and low wages.

Detroit began to bleed industrial jobs. Between 1950 and 1958, Chrysler, which actually kept more jobs in Detroit than either Ford or General Motors, cut its Detroit production workforce in half. In the years between 1953 and 1960, East Detroit lost ten plants and over seventy-one thousand jobs. ((Sugrue, Origins of the Urban Crisis, 136, 149.)) Because Detroit was a single-industry city, decisions made by the “Big Three” automakers reverberated across the city’s industrial landscape. When auto companies mechanized or moved their operations, ancillary suppliers like machine tool companies were cut out of the supply chain and likewise forced to cut their own workforces. Between 1947 and 1977, the number of manufacturing firms in the city dropped from over three thousand to fewer than two thousand. The labor force was gutted. Manufacturing jobs fell from 338,400 to 153,000 over the same three decades. ((Sugrue, Origins of the Urban Crisis, 144.))

Industrial restructuring hurt all workers, but deindustrialization fell heaviest on the city’s African Americans. Although many middle class blacks managed to move out of the city’s ghettoes, by 1960, 19.7 percent of black autoworkers in Detroit were unemployed, compared to just 5.8 percent of whites. ((Sugrue, Origins of the Urban Crisis, 144.)) Overt discrimination in housing and employment had for decades confined blacks to segregated neighborhoods where they were forced to pay exorbitant rents for slum housing. Subject to residential intimidation and cut off from traditional sources of credit, few blacks could afford to follow industry as it left the city for the suburbs and other parts of the country, especially the South. Segregation and discrimination kept them stuck where there were fewer and fewer jobs. Over time, Detroit devolved into a mass of unemployment, crime, and crippled municipal resources. When riots rocked Detroit in 1967, 25 to 30 percent of blacks between the ages of eighteen and twenty-four were unemployed. ((Sugrue, Origins of the Urban Crisis, 261. ))

Deindustrialization in Detroit and elsewhere also went hand in hand with the long assault on unionization that began in the aftermath of World War II. Lacking the political support they had enjoyed during the New Deal years, unions such as the Congress of Industrial Organizations (CIO) and the United Auto Workers (UAW) shifted tactics and accepted labor-management accords in which cooperation, not agitation, was the strategic objective.

This accord held mixed results for workers. On the one hand, management encouraged employee loyalty through privatized welfare systems that offered workers health benefits and pensions. Grievance arbitration and collective bargaining also provided workers official channels through which to criticize policies and push for better conditions. At the same time, bureaucracy and corruption increasingly weighed down unions and alienated them from workers and the general public. Union management came to hold primary influence in what was ostensibly a “pluralistic” power relationship. Workers—though still willing to protest—by necessity pursued a more moderate agenda compared to the union workers of the 1930s and 40s. Conservative politicians meanwhile seized on popular suspicions of “Big Labor,” stepping up their criticism of union leadership and positioning themselves as workers’ true ally.

While conservative critiques of union centralization did much to undermine the labor movement, labor’s decline also coincided with ideological changes within American liberalism. Labor and its political concerns undergirded Roosevelt’s New Deal coalition, but by the 1960s, many liberals had forsaken working class politics. More and more saw poverty as stemming not from structural flaws in the national economy, but from the failure of individuals to take full advantage of the American system. Roosevelt’s New Deal might have attempted to rectify unemployment with government jobs, but Johnson’s Great Society and its imitators funded government-sponsored job training, even in places without available jobs. Union leaders in the 1950s and 1960s typically supported such programs and philosophies.

Internal racism also weakened the labor movement. While national CIO leaders encouraged black unionization in the 1930s, white workers on the ground often opposed the integrated shop. In Detroit and elsewhere after World War II, white workers participated in “hate strikes” where they walked off the job rather than work with African Americans. White workers similarly opposed residential integration, fearing, among other things, that black newcomers would lower property values. ((Jefferson Cowie and Nick Salvatore, “The Long Exception: Rethinking the Place of the New Deal in American History,” International Labor and Working-Class History, vol. 74 (Fall 2008), 1-32, esp. 9.))

By the mid-1970s, widely shared postwar prosperity leveled off and began to retreat. Growing international competition, technological inefficiency, and declining productivity gains stunted working- and middle-class wages. As the country entered recession, wages decreased and the pay gap between workers and management began its long widening. At the same time, dramatic increases in mass incarceration coincided with the deregulation of prison labor to allow more private companies access to cheaper inmate labor, a process that, whatever its aggregate impact, hurt local communities where free-labor jobs were moved into prisons. The tax code became less progressive and labor lost its foothold in the marketplace. Unions represented a third of the workforce in the 1950s, but only one in ten workers belonged to one as of 2015. ((Quoctrung Bui, “50 Years of Shrinking Union Membership in One Map,” February 23, 2015, NPR, http://www.npr.org/sections/money/2015/02/23/385843576/50-years-of-shrinking-union-membership-in-one-map.))

Geography dictated much of labor’s fall, as American firms fled pro-labor states in the 1970s and 1980s. Some went overseas in the wake of new trade treaties to exploit low-wage foreign workers, but others turned to anti-union states in the South and West stretching from Virginia to Texas to southern California. Factories shuttered in the North and Midwest, leading commentators by the 1980s to dub America’s former industrial heartland the “Rust Belt.” With this, they contrasted the prosperous and dynamic “Sun Belt.”

Coined by journalist Kevin Phillips in 1969, the “Sun Belt” refers to the swath of southern and western states that saw unprecedented economic, industrial, and demographic growth after World War II. ((Kevin P. Phillips, The Emerging Republican Majority (New Rochelle, N. Y.: Arlington House, 1969), 17.)) During the New Deal, President Franklin D. Roosevelt declared the American South “the nation’s No. 1 economic problem” and injected massive federal subsidies, investments, and military spending into the region. During the Cold War, Sun Belt politicians lobbied hard for military installations and government contracts for their states. ((Bruce J. Schulman, From Cotton Belt to Sunbelt: Federal Policy, Economic Development, & the Transformation of the South, 1938-1980, 3rd printing (Durham, N. C.: Duke University Press, 2007), 3.))

Meanwhile, southern states’ hostility toward organized labor beckoned corporate leaders. The Taft-Hartley Act of 1947 facilitated southern states’ frontal assault on unions. Thereafter, cheap, nonunionized labor, low wages, and lax regulations pulled northern industries away from the Rust Belt. Skilled northern workers followed the new jobs southward and westward, lured by cheap housing and a warm climate slowly made more tolerable by modern air conditioning.

The South attracted business but struggled to share the profits. Middle class whites grew prosperous, but often these were recent transplants, not native southerners. As the cotton economy shed farmers and laborers, poor white and black southerners found themselves mostly excluded from the fruits of the Sun Belt. Public investments were scarce. White southern politicians channeled federal funding away from primary and secondary public education and toward high-tech industry and university-level research. The Sun Belt inverted Rust Belt realities: the South and West had growing numbers of high-skill, high-wage jobs but lacked the social and educational infrastructure needed to train native poor and middle-class workers for those jobs.

Regardless, more jobs meant more people, and by 1972, southern and western Sun Belt states had more electoral votes than the Northeast and Midwest. This gap continues to grow. ((William H. Frey, “The Electoral College Moves to the Sun Belt,” research brief, Brookings Institution, May 2005. )) Though the region’s economic and political ascendance was a product of massive federal spending, New Right politicians who constructed an identity centered on “small government” found their most loyal support in the Sun Belt. These business-friendly politicians successfully synthesized conservative Protestantism and free-market ideology, creating a potent new political force. Housewives organized reading groups in their homes, and from those reading groups sprouted new organized political activities. Prosperous and mobile, old and new suburbanites gravitated towards an individualistic vision of free enterprise espoused by the Republican Party. Some, especially those most vocally anti-communist, joined groups like the Young Americans for Freedom and the John Birch Society. Less radical suburban voters, however, still gravitated towards the more moderate brand of conservatism promoted by Richard Nixon.

 

VII. The Politics of Love, Sex, and Gender


Warren K. Leffler, Demonstrators opposed to the ERA in front of the White House, 1977 via Library of Congress.

The sexual revolution continued into the 1970s. Many Americans—feminists, gay men, lesbians, and straight couples—challenged strict gender roles and rejected the rigidity of the nuclear family. Cohabitation without marriage spiked, straight couples married later (if at all), and divorce levels climbed. Sexuality, decoupled from marriage and procreation, became for many not only a source of personal fulfillment, but a worthy political cause.

At the turn of the decade, sexuality was considered a private matter yet rigidly regulated by federal, state, and local law. Statutes typically defined legitimate sexual expression within the confines of patriarchal, procreative marriage. Interracial marriage, for instance, was illegal in many states until 1967 and remained largely taboo long after. Same-sex intercourse and cross-dressing were criminalized in most states, and gay men, lesbians, and transgender people were vulnerable to violent police enforcement as well as discrimination in housing and employment.

Two landmark legal rulings in 1973 established the battle lines for the “sex wars” of the 1970s. First, the Supreme Court’s 7-2 ruling in Roe v. Wade (1973) struck down a Texas law that prohibited abortion in all cases when a mother’s life was not in danger. The Court’s decision built upon precedent from a 1965 ruling that, in striking down a Connecticut law prohibiting married couples from using birth control, recognized a constitutional “right to privacy.” ((Griswold v. Connecticut, 381 U.S. 479, June 7, 1965.)) In Roe, the Court reasoned that “this right of privacy . . . is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.” ((Roe v. Wade, 410 U.S. 113, January 22, 1973.)) The Court held that states could not interfere with a woman’s right to an abortion during the first trimester of pregnancy and could only fully prohibit abortions during the third trimester.

Other Supreme Court rulings, however, found that sexual privacy could be sacrificed for the sake of “public” good. Miller v. California (1973), a case over the unsolicited mailing of sexually explicit advertisements for illustrated “adult” books, held that the first amendment did not protect “obscene” material, defined by the Court as anything with sexual appeal that lacked “serious literary, artistic, political, or scientific value.” ((Miller v. California, 413 U.S. 15, June 21, 1973.)) The ruling expanded states’ abilities to pass laws prohibiting materials like hardcore pornography. However, uneven enforcement allowed pornographic theaters and sex shops to proliferate despite whatever laws states had on the books. Americans debated whether these represented the pinnacle of sexual liberation or, as poet and lesbian feminist Rita Mae Brown suggested, “the ultimate conclusion of sexist logic.” ((Rita Mae Brown, quoted in David Allyn, Make Love, Not War – The Sexual Revolution: An Unfettered History (New York, Routledge, 2001), 239.))

Of more tangible concern for most women, though, was the right to equal employment access. Thanks partly to the work of black feminists like Pauli Murray, Title VII of the 1964 Civil Rights Act banned employment discrimination based on sex, in addition to race, color, religion, and national origin. “If sex is not included,” she argued in a memorandum sent to members of Congress, “the civil rights bill would be including only half of the Negroes.” ((Nancy MacLean, Freedom Is Not Enough: The Opening of the American Workplace (Cambridge, Mass.: Harvard University Press), 121.)) Like most laws, Title VII’s full impact came about slowly, as women across the nation cited it to litigate and pressure employers to offer them the same opportunities as men. For one, employers in the late sixties and seventies still viewed certain occupations as inherently feminine or masculine. The National Organization for Women (NOW) organized airline workers against a major company’s sexist ad campaign that showed female flight attendants wearing buttons that read, “I’m Debbie, Fly Me” or “I’m Cheryl, Fly Me.” Actual female flight attendants were required to wear similar buttons. ((MacLean, Freedom Is Not Enough, 129.)) Other women sued to gain access to traditionally male jobs like factory work. Protests prompted the Equal Employment Opportunity Commission (EEOC) to issue a more robust set of protections between 1968 and 1971. Though advancement came haltingly and partially, women used these protections to move eventually into traditional male occupations, politics, and corporate management.

The battle for sexual freedom was not just about the right to get into places, though. It was also about the right to get out of them—specifically, unhappy households and marriages. Between 1959 and 1979, the American divorce rate more than doubled. By the early 1980s, nearly half of all American marriages ended in divorce. ((Arland Thornton, William G. Axinn, and Yu Xie, Marriage and Cohabitation (Chicago: University of Chicago Press, 2007), 57.)) The stigma attached to divorce evaporated and a growing sense of sexual and personal freedom motivated individuals to leave abusive or unfulfilling marriages. Legal changes also promoted higher divorce rates. Before 1969, most states required one spouse to prove that the other was guilty of a specific offense, such as adultery. The difficulty of getting a divorce under this system encouraged widespread lying in divorce courts. Even couples desiring an amicable split were sometimes forced to claim that one spouse had cheated on the other even if neither (or both) had. Other couples temporarily relocated to states with more lenient divorce laws, such as Nevada. ((Glenda Riley, Divorce: An American Tradition (New York: Oxford University Press, 1991), 135-39.)) Widespread recognition of such practices prompted reforms. In 1969, California adopted the first no-fault divorce law. By the end of the 1970s, almost every state had adopted some form of no-fault divorce. The new laws allowed for divorce on the basis of “irreconcilable differences,” even if only one party felt that he or she could not stay in the marriage. ((Riley, Divorce, 161-65. Mary Ann Glendon, The Transformation of Family Law: State, Law, and Family in the United States and Western Europe (Chicago: University of Chicago Press, 1989), 188-89.))

As straight couples enjoyed eased bonds of matrimony, gay men and women negotiated a harsh world that stigmatized homosexuality as a mental illness or depraved immoral act. Building upon postwar efforts by gay rights organizations to bring homosexuality into the mainstream of American culture, young gay activists of the late sixties and seventies began to challenge what they saw as the conservative gradualism of the “homophile” movement. Inspired by the burgeoning radicalism of the Black Power movement, the New Left protests of the Vietnam War, and the counterculture movement for sexual freedom, gay and lesbian activists agitated for a broader set of sexual rights that emphasized an assertive notion of “liberation” rooted not in mainstream assimilation, but in pride of sexual difference.

Perhaps no single incident did more to galvanize gay and lesbian activism than the 1969 uprising at the Stonewall Inn in New York City’s Greenwich Village. Police regularly raided gay bars and hangouts. But when police raided the Stonewall in June 1969, the bar patrons protested and sparked a multi-day street battle that catalyzed a national movement for gay liberation. Seemingly overnight, calls for homophile respectability were replaced with chants of “Gay Power!” ((David Carter, Stonewall: The Riots That Sparked the Gay Revolution (New York: St. Martin’s Press, 2004), 147.))


The window under the Stonewall sign reads: “We homosexuals plead with our people to please help maintain peaceful and quiet conduct on the streets of the Village–Mattachine.” Stonewall Inn 1969, via Wikimedia.

In the following years, gay Americans gained unparalleled access to private and public spaces. Gay activists increasingly attacked cultural norms that demanded they keep their sexuality hidden. Citing statistics that sexual secrecy contributed to stigma and suicide, gay activists urged people to “come out” and embrace their sexuality. A step towards the “normalization” of homosexuality occurred in 1973, when the American Psychiatric Association stopped classifying homosexuality as a mental illness. Pressure mounted on politicians. In 1982, Wisconsin became the first state to ban discrimination based on sexual orientation. More than eighty cities and nine states followed suit over the following decade. But progress proceeded unevenly, and gay Americans continued to suffer hardships from a hostile culture.

Like all social movements, though, the sexual revolution was not free of division. Transgender people were often banned from participating in Gay Pride rallies and lesbian feminist conferences. They, in turn, mobilized to fight the high incidence of rape, abuse, and murder of transgender people. A 1971 newsletter denounced the notion that transgender people were mentally ill, highlighted the particular injustices they faced in and out of the gay community, and declared, “All power to Trans Liberation.” ((Trans Liberation Newsletter, in Susan Stryker, Transgender History (Berkeley, Ca.: Seal Press, 2008), 96-97. ))

As events in the 1970s broadened sexual freedoms and promoted greater gender equality, so too did they generate sustained and organized opposition. Evangelical Christians and other moral conservatives, for instance, mobilized to reverse gay victories. In 1977, activists in Dade County, Florida, used the slogan “Save Our Children” to overturn an ordinance banning discrimination based on sexual orientation. ((William N. Eskridge, Dishonorable Passions: Sodomy Laws in America, 1861-2003 (New York: Viking, 2008), 209-12. )) A leader of the ascendant religious right, Jerry Falwell, said in 1980 that, “It is now time to take a stand on certain moral issues …. We must stand against the Equal Rights Amendment, the feminist revolution, and the homosexual revolution. We must have a revival in this country.” ((Jerry Falwell, Listen, America! (Garden City, N. Y.: Doubleday), 19. ))

Much to Falwell’s delight, conservative Americans did, in fact, stand against and defeat the Equal Rights Amendment (ERA) in their most stunning social victory of the 1970s. Versions of the Amendment—which declared, “Equality of rights under the law shall not be denied or abridged by the United States or any state on account of sex”—had been introduced in Congress every year since 1923. It finally passed amid the upheavals of the sixties and seventies and went to the states for ratification in March 1972. ((Donald Critchlow, Phyllis Schlafly and Grassroots Conservatism: A Woman’s Crusade (Princeton, N. J.: Princeton University Press, 2005), 213-16. )) With high approval ratings, the ERA seemed destined to pass swiftly through state legislatures and become the Twenty-Seventh Amendment. Hawaii ratified the Amendment the same day it cleared Congress. Within a year, thirty states had done so. But then the Amendment stalled. It took years for more states to pass it. In 1977, Indiana became the thirty-fifth and final state to ratify. ((Critchlow, Phyllis Schlafly and Grassroots Conservatism, 218-19. Joel Krieger, ed., The Oxford Companion to the Politics of the World, 2nd ed. (New York: Oxford University Press, 2001), 256.))

By 1977, anti-ERA forces had successfully turned the political tide against the Amendment. At a time when many women shared Betty Friedan’s frustration that society seemed to confine women to the role of homemaker, Phyllis Schlafly’s STOP ERA organization (“Stop Taking Our Privileges”) trumpeted the value and lived advantages of being a homemaker and mother. ((Critchlow, Phyllis Schlafly and Grassroots Conservatism, 219. )) Marshaling the support of evangelical Christians and other religious conservatives, Schlafly worked tirelessly to stifle the ERA. She lobbied legislators and organized counter-rallies to ensure that Americans heard “from the millions of happily married women who believe in the laws which protect the family and require the husband to support his wife and children.” ((Phyllis Schlafly, quoted in Christine Stansell, The Feminist Promise: 1792 to the Present (New York: The Modern Library, 2010), 340. )) The Amendment needed only three more states for ratification. It never got them. In 1982, the time limit for ratification expired—and along with it, the Amendment. ((Critchlow, Phyllis Schlafly and Grassroots Conservatism, 281.))

The failed battle for the ERA exposed the limits of the feminist crusade. It also illustrated the women’s movement’s incapacity to fully represent the views of half the country’s population, a population riven by class differences, racial disparities, and cultural and religious divisions.

 

VIII. The Misery Index

Pumpkins carved in the likeness of President Jimmy Carter in Polk County, Florida, October 1980, State Library and Archives of Florida via Flickr, https://www.flickr.com/photos/floridamemory/10554725056/in/photolist-7YvoG-5T4LFb-9L84Cp-dkZJrj-2ZWj62-2PZWZy-2PVsYB-dpHgKe-h5FJ3d-63diMF-dUscrS-8Xrcgm-2PVujx-BtVQR-2ikusL-8SqgFG-9YhFMv-9Y2e3M-8Xrcm5-6raVQ1-8ELVDa-868hpW-5ZqsiS-9BY7r8-nvHn8g-5suU6Y-5suW69-3gEeUd-vTudu-cPabXu-4xsepv-2PVuEZ-2PVtzR-cPaaTj-yeSxD-m3veAn-nvHDjq-8ESf1U-7nPjSi-2PVws4-nNaapU-868inq-8vhd9Z-9tjaTo-f9Ardt-4FYKJ8-cQk1Co-f8deRv-5nQJeN-dUmA4a.


Although Nixon eluded prosecution, Watergate continued to weigh on voters’ minds. It netted big congressional gains for Democrats in the 1974 mid-term elections, and Ford’s pardon damaged his chances in 1976. Former one-term Georgia governor Jimmy Carter, a navy veteran of the nuclear submarine program and a peanut farmer who represented the rising generation of younger, racially liberal “New South” Democrats, captured the Democratic nomination. Identified with neither his party’s liberal nor its conservative wing, Carter made an appeal that was more personal and moral than political. He ran on no great political issues, letting his background as a hardworking, honest, Southern Baptist navy man ingratiate him with voters around the country, especially in his native South, where support for Democrats had wavered in the wake of the civil rights movement. Carter’s wholesome image stood in direct contrast to the memory of Nixon and, by association, of Ford, the man who had pardoned him. Carter sealed his party’s nomination in June and won a close victory in November. ((Sean Wilentz, The Age of Reagan: A History, 1974-2008 (New York: Harper Collins, 2008), 69-72.))

When Carter took the oath of office on January 20, 1977, however, he became president of a nation in the midst of economic turmoil. Oil shocks, inflation, stagnant growth, unemployment, and sinking wages weighed down the nation’s economy. Some of these problems were traceable to the end of World War II, when American leaders erected a complex system of trade policies to help rebuild the shattered economies of Western Europe and Asia. After the war, American diplomats and politicians used trade relationships to win influence and allies around the globe. They saw the economic health of their allies, particularly West Germany and Japan, as a crucial bulwark against the expansion of communism. Americans encouraged these nations to develop vibrant export-oriented economies and tolerated restrictions on U.S. imports.

The 1979 energy crisis prompted a panic among consumers who remembered the 1973 oil shortage, leading many Americans to buy oil in huge quantities. Long lines and high gas prices characterized 1979, and oil prices remained quite high until the mid-1980s. Warren K. Leffler, “Gasoline lines,” June 15, 1979. Library of Congress, http://www.loc.gov/pictures/item/2003677600/.


This came at great cost to the United States. As the American economy stalled, Japan and West Germany soared and became major forces in the global production of autos, steel, machine tools, and electrical products. By 1970, the United States began to run massive trade deficits. The value of American exports dropped while the prices of its imports skyrocketed. Coupled with the huge cost of the Vietnam War and the rise of oil-producing states in the Middle East, growing trade deficits sapped the United States’ dominant position in the global economy.

American leaders didn’t know how to respond. After a series of negotiations with leaders from France, Great Britain, West Germany, and Japan in 1970 and 1971, the Nixon administration allowed these rising industrial nations to continue flouting the principles of free trade. They maintained trade barriers that sheltered their domestic markets from foreign competition while at the same time exporting growing amounts of goods to the United States, which no longer maintained so comprehensive a tariff system. By 1974, in response to U.S. complaints and their own domestic economic problems, many of these industrial nations overhauled their protectionist practices but developed even subtler methods (such as state subsidies for key industries) to nurture their economies.

The result was that Carter, like Ford before him, presided over a hitherto unimagined economic dilemma: the simultaneous onset of inflation and economic stagnation, a combination popularized as “stagflation.” ((Wilentz, Age of Reagan, 75.)) Neither Ford nor Carter had the means or ambition to protect American jobs and goods from foreign competition. As firms and financial institutions invested, sold goods, and manufactured in new rising economies like Mexico, Taiwan, Japan, Brazil, and elsewhere, American politicians allowed them to sell their often less costly products in the United States.

As American officials institutionalized this new unfettered global trade, many American manufacturers perceived only one viable path to sustained profitability: moving overseas, often by establishing foreign subsidiaries or partnering with foreign firms. Investment capital, especially in manufacturing, fled the United States in search of opportunities abroad and hastened the decline in the productivity of American industry.

During the 1976 presidential campaign, Carter had touted the “misery index,” the simple addition of the unemployment rate to the inflation rate, as an indictment of Gerald Ford and Republican rule. But Carter failed to slow the unraveling of the American economy, and the stubborn and confounding rise of both unemployment and inflation damaged his presidency.
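The index is just that sum. A minimal sketch of the arithmetic (the rates below are hypothetical figures chosen only for illustration, not official statistics):

```python
def misery_index(unemployment_rate: float, inflation_rate: float) -> float:
    """The 'misery index': the unemployment rate plus the inflation rate, in percent."""
    return unemployment_rate + inflation_rate

# Hypothetical rates, in percent, for illustration only
print(misery_index(7.5, 13.5))  # prints 21.0
```

Because both components rose together in the late 1970s, the index Carter had wielded against Ford climbed steadily against Carter himself.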

Just as Carter failed to offer or enact policies to stem the unraveling of the American economy, his idealistic vision of human rights-based foreign policy crumbled. He had not made human rights a central theme in his campaign, but in May 1977 he declared his wish to move away from a foreign policy in which “inordinate fear of communism” caused American leaders to “adopt the flawed and erroneous principles and tactics of our adversaries.” Carter proposed instead “a policy based on constant decency in its values and on optimism in our historical vision.” ((Jimmy Carter, “University of Notre Dame – Address at the Commencement Exercises at the University,” May 22, 1977, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=7552.))

Carter’s human rights policy achieved real victories: the U.S. either reduced or eliminated aid to American-supported right-wing dictators guilty of extreme human rights abuses in places like South Korea, Argentina, and the Philippines. In September 1977, Carter negotiated the return of the Panama Canal to Panama, which cost him enormous political capital in the United States. ((Wilentz, Age of Reagan, 100-02.)) A year later, in September 1978, Carter brokered an agreement between Israeli Prime Minister Menachem Begin and Egyptian President Anwar Sadat that paved the way for a formal peace treaty. The Camp David Accords–named for the president’s rural Maryland retreat, where thirteen days of secret negotiations were held–represented the first time an Arab state had recognized Israel, and the first time Israel promised the Palestinians self-government. The Accords had limits, both for Israel and the Palestinians, but they represented a major foreign policy coup for Carter. ((Harvey Sicherman, Palestinian Autonomy, Self-Government, & Peace (Boulder, Co.: Westview Press, 1993), 35.))

And yet Carter’s dreams of a human rights-based foreign policy crumbled before the Cold War and the realities of American politics. The United States continued to provide military and financial support for dictatorial regimes vital to American interests, such as the oil-rich state of Iran. When President Carter and First Lady Rosalynn Carter visited Tehran in January 1978, the president praised the nation’s dictatorial ruler, Shah Mohammad Reza Pahlavi, and remarked on the “respect and the admiration and love” Iranians had for their leader. ((Jimmy Carter, “Tehran, Iran Toasts of the President and the Shah at a State Dinner,” December 31, 1977, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=7080.)) After revolutionaries deposed the Shah in January 1979, they stormed the American embassy in Tehran that November and took fifty-two Americans hostage. Americans not only endured another oil crisis as Iran’s oil fields shut down; for 444 days, they watched the nation’s news programs remind them of the hostages and of America’s new global impotence. Carter could not win their release, and a failed rescue mission ended in the deaths of eight American servicemen. Already beset by a punishing economy, Carter saw his popularity plummet.

Carter’s efforts to ease the Cold War by achieving a new nuclear arms control agreement disintegrated under domestic opposition from conservative Cold War hawks such as Ronald Reagan, who accused Carter of weakness. A month after the Soviets invaded Afghanistan in December 1979, a beleaguered Carter committed the United States to defending its “interests” in the Middle East against Soviet incursions, declaring that “an assault [would] be repelled by any means necessary, including military force.” The “Carter Doctrine” not only signaled Carter’s ambivalent commitment to deescalation and human rights, it testified to his increasingly desperate presidency. ((Jimmy Carter, “The State of the Union Address,” January 23, 1980, American Presidency Project, http://www.presidency.ucsb.edu/ws/?pid=33079.))

The collapse of American manufacturing, the stubborn rise of inflation, the sudden impotence of American foreign policy, and a culture ever more divided: the sense of unraveling pervaded the nation. “I want to talk to you right now about a fundamental threat to American democracy,” Jimmy Carter said in a televised address on July 15, 1979. “The threat is nearly invisible in ordinary ways. It is a crisis of confidence. It is a crisis that strikes at the very heart and soul and spirit of our national will.”

 

IX. Conclusion

Though American politics moved right after Lyndon Johnson’s administration, Nixon’s 1968 election was no conservative counterrevolution. American politics and society remained in flux throughout the 1970s. American politicians on the right and the left pursued relatively moderate courses compared to those in the preceding and succeeding decades. But a groundswell of anxieties and angers brewed beneath the surface. The world’s greatest military power had floundered in Vietnam and an American president stood flustered by Middle Eastern revolutionaries. The cultural clashes from the sixties persisted and accelerated. While cities burned, a more liberal sexuality permeated American culture. The economy crashed, leaving America’s cities prone before poverty and crime and its working class gutted by deindustrialization and globalization. American weakness was everywhere. And so, by 1980, many Americans—especially white middle- and upper-class Americans—felt a nostalgic desire for simpler times and simpler answers to the frustratingly complex geopolitical, social, and economic problems crippling the nation. The appeal of Carter’s soft drawl and Christian humility had signaled this yearning, but his utter failure to stop the unraveling of American power and confidence opened the way for a new movement, one with new personalities and a new conservatism—one that promised to undo the damage and restore the United States to its own nostalgic image of itself.

 

Contributors

This chapter was edited by Edwin Breeden, with content contributions by Seth Anziska, Jeremiah Bauer, Edwin Breeden, Kyle Burke, Alexandra Evans, Sean Fear, Anne Gray Fischer, Destin Jenkins, Matthew Kahn, Suzanne Kahn, Brooke Lamperd, Katherine McGarr, Matthew Pressman, Adam Parsons, Emily Prifogle, John Rosenberg, Brandy Thomas Wells, and Naomi R. Williams.

 

Recommended Reading

  1. Carter, Dan T. The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics. Baton Rouge: Louisiana State University Press, 1995.
  2. Cowie, Jefferson R. Stayin’ Alive: The 1970s and the Last Days of the Working Class. New York: New Press, 2010.
  3. Flamm, Michael W. Law and Order: Street Crime, Civil Unrest, and the Crisis of Liberalism in the 1960s. New York: Columbia University Press, 2005.
  4. Formisano, Ronald P. Boston Against Busing: Race, Class, and Ethnicity in the 1960s and 1970s. Chapel Hill: University of North Carolina Press, 1991.
  5. Harvey, David. The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Cambridge: Blackwell, 1989.
  6. Jenkins, Philip. Decade of Nightmares: The End of the Sixties and the Making of Eighties America. New York: Oxford University Press, 2008.
  7. Lassiter, Matthew D. The Silent Majority: Suburban Politics in the Sunbelt South. Princeton: Princeton University Press, 2006.
  8. Matusow, Allen J. The Unraveling of America: A History of Liberalism in the 1960s. New York: Harper & Row, 1984.
  9. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  10. Perlstein, Rick. Nixonland: The Rise of a President and the Fracturing of America. New York: Scribner, 2008.
  11. Rodgers, Daniel T. Age of Fracture. Cambridge: The Belknap Press of Harvard University Press, 2011.
  12. Roth, Benita. Separate Roads to Feminism: Black, Chicana, and White Feminist Movements in America’s Second Wave. New York: Cambridge University Press, 2004.
  13. Schulman, Bruce J. The Seventies: The Great Shift in American Culture, Society, and Politics. New York: Free Press, 2001.
  14. Stein, Judith. Pivotal Decade: How the United States Traded Factories for Finance in the 1970s. New Haven: Yale University Press, 2010.
  15. Zaretsky, Natasha. No Direction Home: The American Family and the Fear of National Decline. Chapel Hill: University of North Carolina Press, 2007.

 

Notes

27. The Sixties

"Participants, some carrying American flags, marching in the civil rights march from Selma to Montgomery, Alabama in 1965," via Library of Congress.


*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

Perhaps no decade is so immortalized in American memory as the 1960s. Couched in the colorful rhetoric of peace and love, complemented by stirring images of the civil rights movement, and fondly remembered for its music, art, and activism, for many the decade brought hopes for a more inclusive, forward-thinking nation. But the decade was also plagued by strife, tragedy, and chaos. It was the decade of the Vietnam War, of inner-city riots, and of assassinations that seemed to symbolize the crushing of a new generation’s idealism. A decade of struggle and disillusionment rocked by social, cultural, and political upheaval, the 1960s are remembered because so much changed, and because so much did not.

 

II. Kennedy and Cuba

The decade’s political landscape began with a watershed presidential election. Americans were captivated by the 1960 race between Republican Vice President Richard Nixon and Democratic Senator John F. Kennedy, two candidates who pledged to move the nation forward and invigorate an economy experiencing the worst recession since the Great Depression. Kennedy promised to use federal programs to strengthen the economy and address pockets of longstanding poverty, while Nixon called for a reliance on private enterprise and reduction of government spending. Both candidates faced criticism as well; Nixon had to defend Dwight Eisenhower’s domestic policies, while Kennedy, who was attempting to become the first Catholic president, had to counteract questions about his faith and convince voters that he was experienced enough to lead.

One of the most notable events of the Nixon-Kennedy presidential campaign was their televised debate in September, the first of its kind between major presidential candidates. The debate focused on domestic policy and provided Kennedy with an important moment to present himself as a composed, knowledgeable statesman. In contrast, Nixon, an experienced debater who faced higher expectations, looked sweaty and defensive. Radio listeners famously thought the two men performed equally well, but the TV audience was much more impressed by Kennedy, giving him an advantage in subsequent debates. Ultimately, the election was extraordinarily close; in the largest voter turnout in American history up to that point, Kennedy bested Nixon by less than one percentage point (34,227,096 to 34,107,646 votes). Although Kennedy’s lead in electoral votes was more comfortable at 303 to 219, the Democratic Party’s victory did not translate into gains in Congress, where Democrats lost a few seats in both houses. As a result, Kennedy entered office in 1961 without the mandate necessary to achieve the ambitious agenda he would refer to as the New Frontier.

The United States entered the 1960s unaccustomed to stark foreign policy failures, having emerged from World War II as a global superpower before waging a Cold War against the Soviet Union in the 1950s. In the new decade, unsuccessful conflicts in Cuba and Vietnam would yield embarrassment, fear, and tragedy, stunning a nation used to triumph and altering the way many thought of America’s role in international affairs.

On January 8, 1959, Fidel Castro and his revolutionary army initiated a new era of Cuban history. Having ousted the corrupt Cuban president Fulgencio Batista, who had fled Havana on New Year’s Eve, Castro and his rebel forces made their way triumphantly through the capital city’s streets. The United States, which had long propped up Batista’s corrupt regime, had withdrawn support and, initially, expressed sympathy for Castro’s new government, which was immediately granted diplomatic recognition. But President Dwight Eisenhower and members of his administration were wary. The new Cuban government soon instituted leftist economic policies centered on agrarian reform, land redistribution, and the nationalization of private enterprises. Cuba’s wealthy and middle-class citizens fled the island in droves. Many settled in Miami, Florida, and other American cities.

The relationship between Cuba and the United States deteriorated rapidly. On October 19, 1960, the United States instituted a near-total trade embargo to economically isolate the Cuban regime, and in January 1961 the two nations broke off formal diplomatic relations. The Central Intelligence Agency, acting under the mistaken belief that the Castro government lacked popular support and that Cuban citizens would revolt if given the opportunity, began to recruit members of the exile community to participate in an invasion of the island. On April 17, 1961, an invasion force consisting primarily of Cuban émigrés landed on Girón Beach at the Bay of Pigs. Cuban soldiers and civilians quickly overwhelmed the exiles, many of whom were taken prisoner. The Cuban government’s success at thwarting the Bay of Pigs invasion did much to legitimize the new regime and was a tremendous embarrassment for the Kennedy administration.

As the political relationship between Cuba and the United States disintegrated, the Castro government became more closely aligned with the Soviet Union. This strengthening of ties set the stage for the Cuban Missile Crisis, perhaps the most dramatic foreign policy crisis in the history of the United States. In 1962, in response to the US’s long-time maintenance of a nuclear arsenal in Turkey and at the invitation of the Cuban government, the Soviet Union deployed nuclear missiles in Cuba. On October 14, 1962, American spy planes detected the construction of missile launch sites, and on October 22, President Kennedy addressed the American people to alert them to this threat. Over the course of the next several days, the world watched in horror as the United States and the Soviet Union hovered on the brink of nuclear war. Finally, on October 28, the Soviet Union agreed to remove its missiles from Cuba in exchange for a U.S. agreement to remove its missiles from Turkey and a formal pledge that the United States would not invade Cuba, and the crisis was resolved peacefully.

The Cuban Missile Crisis was a time of great fear throughout America. Women in this photograph urged President Kennedy to be cautious of instigating war. Phil Stanziola, “800 women strikers for peace on 47 St near the UN Bldg,” 1962. Library of Congress, http://www.loc.gov/pictures/item/2001696167/.


Though the Cuban Missile Crisis temporarily halted the flow of Cuban refugees into the United States, emigration began again in earnest in the mid-1960s. In 1965, the Johnson administration and the Castro government brokered a deal that facilitated the reunion of families that had been separated by earlier waves of migration, opening the door for thousands to leave the island. In 1966 President Lyndon B. Johnson signed the Cuban Adjustment Act, a law allowing Cuban refugees to become permanent residents. Over the course of the 1960s, hundreds of thousands of Cubans left their homeland and built new lives in America.

 

III. The Civil Rights Movement Continues

So much of the energy and character of “the sixties” emerged from the civil rights movement, which won its greatest victories in the early years of the decade. The movement itself was changing. Many of the civil rights activists pushing for school desegregation in the 1950s were middle-class and middle-aged. In the 1960s, a new student movement arose whose members wanted swifter changes in the segregated South. Confrontational protests, marches, boycotts, and sit-ins accelerated. ((For the major events of the civil rights movement, see Taylor Branch, Parting the Waters: America in the King Years, 1954–1963 (New York: Simon & Schuster, 1988); Pillar of Fire: America in the King Years, 1963–1965 (New York: Simon & Schuster, 1998); and At Canaan’s Edge: America in the King Years, 1965-68 (New York: Simon & Schuster, 2007).))

The tone of the modern U.S. civil rights movement changed at a North Carolina department store in 1960, when four African American students participated in a “sit-in” at a whites-only lunch counter. The 1960 Greensboro sit-ins were typical of the tactic: activists sat at segregated lunch counters in an act of defiance, refusing to leave until they were served and willing to be ridiculed, attacked, and arrested if they were not. The protest drew resistance but forced the desegregation of the Woolworth’s lunch counter, and it prompted copycat demonstrations across the South. The sit-ins offered evidence that student-led direct action could enact social change and established the civil rights movement’s direction in the forthcoming years. ((Branch, Parting.))

The following year, civil rights advocates attempted a bolder variation of the “sit-in” when they participated in the Freedom Rides. Activists organized interstate bus rides following a Supreme Court decision outlawing segregation on public buses and trains. The rides intended to test the court’s ruling, which many southern states had ignored. An interracial group of Freedom Riders boarded buses in Washington, D.C., with the intention of sitting in integrated patterns on the buses as they traveled through the Deep South. On the initial rides in May 1961, the riders encountered fierce resistance in Alabama. Angry mobs composed of KKK members attacked riders in Birmingham, burning one of the buses and beating the activists who escaped. Although the first riders abandoned their trip and flew to their destination, New Orleans, civil rights activists remained undeterred. Additional Freedom Rides launched through the summer and generated national attention amid additional violent resistance. Ultimately, the Interstate Commerce Commission enforced integrated interstate buses and trains in November 1961. ((Raymond Arsenault, Freedom Riders: 1961 and the Struggle for Racial Justice (New York: Oxford University Press, 2006).))

In the fall of 1961, civil rights activists descended on Albany, a small city in southwest Georgia. A place known for entrenched segregation and racial violence, Albany seemed an unlikely place for black Americans to rally and demand civil rights gains. The activists there, however, formed the Albany Movement, a coalition of civil rights organizers that included members of the Student Nonviolent Coordinating Committee (SNCC, or, “snick”), the Southern Christian Leadership Conference (SCLC), and the NAACP. But in Albany the movement was stymied by police chief Laurie Pritchett, who launched mass arrests but refused to engage in police brutality and bailed out leading officials to avoid negative media attention. It was a peculiar scene, and a lesson for southern activists. ((Clayborne Carson, In Struggle: SNCC and the Black Awakening of the 1960s (Cambridge: Harvard University Press, 1980); Adam Fairclough, To Redeem the Soul of America: The Southern Christian Leadership Conference & Martin Luther King (Athens: University of Georgia Press, 1987).))

The Albany Movement included elements of a Christian commitment to social justice in its platform, with activists stating that all people were “of equal worth” in God’s family and that “no man may discriminate against or exploit another.” In many instances in the 1960s, black Christianity propelled civil rights advocates to action and demonstrated the significance of religion to the broader civil rights movement. King’s rise to prominence underscored the role that African American religious figures played in the 1960s civil rights movement. Protestors sang hymns and spirituals as they marched. Preachers rallied the people with messages of justice and hope. Churches hosted meetings, prayer vigils, and conferences on nonviolent resistance. The moral thrust of the movement strengthened African American activists while also confronting white society by framing segregation as a moral evil. ((David L. Chappell, A Stone of Hope: Prophetic Religion and the Death of Jim Crow (Chapel Hill: University of North Carolina Press, 2005).))

As the civil rights movement garnered more followers and more attention, white resistance stiffened. In October 1962, James Meredith became the first African American student to enroll at the University of Mississippi. Meredith’s enrollment sparked riots on the Oxford campus, prompting President John F. Kennedy to send in U.S. Marshals and National Guardsmen to maintain order. On an evening known infamously as the Battle of Ole Miss, segregationists clashed with troops in the middle of campus, resulting in two deaths and hundreds of injuries. The violence that erupted despite federal intervention served as a reminder of the strength of white resistance to the civil rights movement, particularly in the realm of education. ((Branch, Parting.))

James Meredith, accompanied by U.S. Marshals, walks to class at the University of Mississippi in 1962. Meredith was the first African-American student admitted to the still segregated Ole Miss. Marion S. Trikosko, “Integration at Ole Miss[issippi] Univ[ersity],” 1962. Library of Congress, http://www.loc.gov/pictures/item/2003688159/.


The following year, 1963, was perhaps the decade’s most eventful year for civil rights. In April and May, the SCLC organized the Birmingham Campaign, a broad campaign of direct action aiming to topple segregation in Alabama’s largest city. Activists used business boycotts, sit-ins, and peaceful marches as part of the campaign. SCLC leader Martin Luther King Jr. was jailed, prompting his famous handwritten “Letter from Birmingham Jail,” which urged not only his nonviolent approach but active confrontation to directly challenge injustice. The campaign further added to King’s national reputation and featured powerful photographs and video footage of white police officers using fire hoses and attack dogs on young African American protesters. It also yielded an agreement to desegregate public accommodations in the city: activists in Birmingham scored a victory for civil rights and drew international praise for their nonviolent approach in the face of police-sanctioned violence and bombings. ((Branch, Parting.))

 

Images of police brutality against peaceful Civil Rights demonstrators shocked many Americans and helped increase support for the movement. Photograph. http://www.legacy.com/UserContent/ns/Photos/Fire%20hoses%20used%20against%20civil%20rights%20protesters%20in%20Birmingham%201963.jpg.


White resistance intensified. While much of the rhetoric surrounding the 1960s focused on a younger, more liberal generation’s progressive ideas, conservatism maintained a strong presence on the American political scene. Few political figures in the decade embodied the working-class, conservative views held by millions of white Americans quite like George Wallace. Wallace’s vocal stance on segregation was immortalized in his 1963 inaugural address as Alabama governor with the phrase: “Segregation now, segregation tomorrow, segregation forever!” Just as the civil rights movement began to gain unprecedented strength, Wallace became the champion of the many white southerners opposed to the movement. Consequently, Wallace was one of the best examples of the very real opposition civil rights activists faced in the late twentieth century. ((Dan T. Carter, The Politics of Rage: George Wallace, the Origins of the New Conservatism, and the Transformation of American Politics (Baton Rouge: Louisiana State University Press, 2000).))

As governor, Wallace loudly supported segregation. His efforts were symbolic, but they earned him national recognition as a political figure willing to fight for what many southerners saw as their traditional way of life. In June 1963, just five months after becoming governor, Wallace staged his famous “Stand in the Schoolhouse Door,” blocking the entrance of a classroom building to protest integration at the University of Alabama. President Kennedy addressed the nation that evening, criticizing Wallace and calling for a comprehensive civil rights bill. A day later, civil rights leader Medgar Evers was assassinated at his home in Jackson, Mississippi.

Alabama governor George Wallace stands defiantly at the door of the University of Alabama, blocking the attempted integration of the school. Wallace was perhaps the most notoriously pro-segregation politician of the 1960s, proudly proclaiming in his 1963 inaugural address “segregation now, segregation tomorrow, segregation forever.” Warren K. Leffler, “[Governor George Wallace attempting to block integration at the University of Alabama],” June 11, 1963. Library of Congress, http://www.loc.gov/pictures/item/2003688161/.


That summer, civil rights leaders organized the August 1963 March on Washington. The march called for, among other things, civil rights legislation, school integration, an end to discrimination by public and private employers, job training for the unemployed, and a raise in the minimum wage. On the steps of the Lincoln Memorial, King delivered his famous “I Have a Dream” speech, an internationally renowned call for civil rights that raised the movement’s profile to new heights and put unprecedented pressure on politicians to pass meaningful civil rights legislation. ((Branch, Parting.))

 

White activists increasingly joined African Americans in the Civil Rights Movement during the 1960s. This photograph shows Martin Luther King, Jr., and other black civil rights leaders arm-in-arm with leaders of the Jewish community. Photograph, August 28, 1963. Wikimedia, http://commons.wikimedia.org/wiki/File:March_on_washington_Aug_28_1963.jpg.


Kennedy offered support for a civil rights bill, but, unable to push past southern resistance and unwilling to expend too much political capital, the bill stalled in Congress. Then, on November 22, 1963, President Kennedy was assassinated in Dallas. The nation’s youthful, popular president was gone. Vice President Lyndon Johnson lacked Kennedy’s youth, his charisma, his popularity, and his aristocratic upbringing, but no one knew Washington better and no one before or since fought harder and more successfully to pass meaningful civil rights legislation. Raised in poverty in the Texas Hill Country, Johnson scratched and clawed his way up the political ladder. He was both ruthlessly ambitious and keenly conscious of poverty and injustice. He idolized Franklin Roosevelt, for instance, whose New Deal had brought improvements for the impoverished Central Texans Johnson grew up with.

It was President Lyndon Johnson, then, an old white southerner with a thick Texas drawl, who embraced the civil rights movement. He took Kennedy’s stalled civil rights bill, ensured that it would have teeth, and navigated it through Congress. The following summer he signed the Civil Rights Act of 1964, widely considered to be among the most important pieces of civil rights legislation in American history. The comprehensive act barred segregation in public accommodations and outlawed discrimination based on race, ethnicity, gender, and national or religious origin.

Lyndon B. Johnson sits with Civil Rights Leaders in the White House. One of Johnson’s greatest legacies would be his staunch support of civil rights legislation. Photograph, January 18, 1964. Wikimedia, http://commons.wikimedia.org/wiki/File:Lyndon_Johnson_meeting_with_civil_rights_leaders.jpg.


Lyndon B. Johnson was not afraid to use whatever means necessary to get his legislation passed. Johnson was notoriously crude, rude, and irreverent, making the massive amount of legislation he got passed even more incredible. Yoichi R. Okamoto, Photograph of Lyndon B. Johnson pressuring Senator Richard Russell, December 17, 1963. Wikimedia, http://en.wikipedia.org/wiki/File:Lyndon_Johnson_and_Richard_Russell.jpg.


The civil rights movement created space for political leaders to pass legislation, and the movement continued pushing forward. Direct action continued through the summer of 1964, as student-run organizations like SNCC and CORE (the Congress of Racial Equality) organized Freedom Summer in Mississippi, a drive to register African American voters in a state with an ugly history of discrimination. Freedom Summer campaigners set up schools for African American children and endured intimidation tactics. Even with progress, violent resistance against civil rights continued, particularly in regions with longstanding traditions of segregation. ((Branch, Pillar.))

In March 1965, activists, with the support of prominent civil rights leaders, attempted to march from Selma to Montgomery, Alabama, on behalf of local African American voting rights. In a narrative that had become familiar, “Bloody Sunday” featured peaceful protesters attacked by white law enforcement with batons and tear gas. After being turned away violently a second time, marchers finally made the fifty-four-mile trek to the state capitol later in the month. Coverage of the first march prompted President Johnson to present the bill that became the Voting Rights Act of 1965, an act that abolished voting discrimination in federal, state, and local elections. In two consecutive years, landmark pieces of legislation had assaulted de jure segregation and disenfranchisement. ((Branch, Canaan’s Edge.))

Five leaders of the Civil Rights Movement. From left: Bayard Rustin, Andrew Young, N.Y. Congressman William Ryan, James Farmer, and John Lewis in 1965. Stanley Wolfson, Photograph, 1965. Library of Congress, http://www.loc.gov/pictures/item/98515229/.


 

IV. Lyndon Johnson’s Great Society

On a May morning in 1964, President Johnson laid out a sweeping vision for a package of domestic reforms known as the Great Society. Speaking before that year’s graduates of the University of Michigan, Johnson called for “an end to poverty and racial injustice” and challenged both the graduates and American people to “enrich and elevate our national life, and to advance the quality of our American civilization.” At its heart, he promised, the Great Society would uplift racially and economically disfranchised Americans, too long denied access to federal guarantees of equal democratic and economic opportunity, while simultaneously raising all Americans’ standards and quality of life. ((Lyndon Baines Johnson, “Remarks at the University of Michigan,” May 22, 1964, Public Papers of the Presidents of the United States: Lyndon B. Johnson, 1964, (Washington, D.C.: Government Printing Office, 1965), 704.))

The Great Society’s legislation was breathtaking in scope, and many of its programs and agencies are still with us today. Most importantly, the Civil Rights Act of 1964 and the Voting Rights Act of 1965 codified federal support for many of the civil rights movement’s goals by prohibiting job discrimination, abolishing the segregation of public accommodations, and providing vigorous federal oversight of southern states’ primary and general election laws in order to guarantee minority access to the ballot. Ninety years after Reconstruction, these measures effectively ended Jim Crow.

In addition to civil rights, the Great Society took on a range of quality of life concerns that seemed suddenly solvable in a society of such affluence. It established the first federal Food Stamp Program. Medicare and Medicaid would ensure access to quality medical care for the aged and poor. In 1965, the Elementary and Secondary Education Act was the first sustained and significant federal investment in public education, totaling more than $1 billion. Significant funds were poured into colleges and universities. The Great Society also established the National Endowment for the Arts and the National Endowment for the Humanities, federal investments in arts and letters that fund American cultural expression to this day.

While these programs persisted and even thrived, in the years immediately following this flurry of legislative activity, the national conversation surrounding Johnson’s domestic agenda largely focused on the $3 billion spent on War on Poverty programming within the Great Society’s Economic Opportunity Act of 1964. No EOA program was more controversial than Community Action, considered the cornerstone antipoverty program. Johnson’s antipoverty planners felt the key to uplifting disfranchised and impoverished Americans was involving poor and marginalized citizens in the actual administration of poverty programs, what they called “maximum feasible participation.” Community Action Programs would give disfranchised Americans a seat at the table in planning and executing federally funded programs that were meant to benefit them—a significant sea change in the nation’s efforts to confront poverty, which had historically relied upon local political and business elites or charitable organizations for administration. ((See, for instance, Wesley G. Phelps, A People’s War on Poverty: Urban Politics and Grassroots Activists in Houston (Athens: University of Georgia Press, 2014).))

In fact, Johnson himself had never conceived of poor Americans running their own poverty programs. While the president’s rhetoric offered a stirring vision of the future, he had singularly old-school notions for how his poverty policies would work. In contrast to “maximum feasible participation,” the President imagined a second New Deal: local elite-run public works camps that would instill masculine virtues in unemployed young men. Community Action almost entirely bypassed local administrations and sought to build grassroots civil rights and community advocacy organizations, many of which had originated in the broader civil rights movement. Despite widespread support for most Great Society programs, the War on Poverty increasingly became the focal point of domestic criticisms from the left and right. On the left, frustrated liberals recognized the president’s resistance to empowering minority poor and also assailed the growing war in Vietnam, the cost of which undercut domestic poverty spending. As racial unrest and violence swept across urban centers, critics from the right lambasted federal spending for “unworthy” and even criminal citizens.

Johnson had secured a series of meaningful civil rights laws, but then things began to stall. Days after the passage of the Voting Rights Act, race riots broke out in the Watts District of Los Angeles. Rioting in Watts stemmed from local African American frustrations with residential segregation, police brutality, and racial profiling. Waves of riots would rock American cities every summer thereafter. Particularly destructive riots occurred in 1967—two summers later—in Newark and Detroit. Each resulted in deaths, injuries, arrests, and millions of dollars in property damage. In spite of black achievements, inner-city problems persisted for many African Americans. The phenomenon of “white flight”—when whites in metropolitan areas fled city centers for the suburbs—often resulted in “re-segregated” residential patterns. Limited access to economic and social opportunities in urban areas bred discord. In addition to reminding the nation that the civil rights movement was a complex, ongoing event without a concrete endpoint, the unrest in northern cities reinforced the notion that the struggle did not occur solely in the South. Many Americans also viewed the riots as an indictment of the Great Society, President Johnson’s sweeping agenda of domestic programs that sought to remedy inner-city ills by offering better access to education, jobs, medical care, housing, and other forms of social welfare. The civil rights movement was never the same. ((Ibid.))

Despite the fact that the Civil Rights and Voting Rights Acts and the War on Poverty provoked conservative resistance and were crucial catalysts for the rise of Republicans in the South and West, subsequent presidents and Congresses have left largely intact the bulk of the Great Society. Many of its programs, such as Medicare and Medicaid, food stamps, federal spending for arts and literature, and Head Start, are considered by many to be effective forms of government action. Even Community Action programs, so fraught during their few short years of activity, inspired and empowered a new generation of minority and poverty community activists who had never before felt, as one put it, “this government is with us.” ((Guian A. McKee, “‘This Government is With Us’: Lyndon Johnson and the Grassroots War on Poverty,” in Annelise Orleck and Lisa Gayle Hazirjian, editors, The War on Poverty: A New Grassroots History, 1964-1980 (Athens: University of Georgia Press, 2011).))

 

V. The Origins of the Vietnam War

American involvement in the Vietnam War began during the postwar period of decolonization. The Soviet Union backed many nationalist movements across the globe, but the United States feared the expansion of communist influence and pledged to confront any revolutions aligned against western capitalism. The “domino theory”—the idea that if a country fell to communism, then neighboring states would soon follow—governed American foreign policy. After the communist takeover of China in 1949, the United States financially supported the French military’s effort to retain control over its colonies in Vietnam, Cambodia, and Laos.

Between 1946 and 1954, France fought a counterinsurgency campaign against the nationalist Viet Minh forces led by Ho Chi Minh. The United States assisted the French war effort with funds, arms, and advisors, but it was not enough. On the eve of the Geneva Peace Conference in 1954, Viet Minh forces defeated the French army at Dien Bien Phu. The conference temporarily divided Vietnam into two separate states until United Nations-monitored elections occurred. But the U.S. feared a communist electoral victory and so blocked the elections. The temporary partition became permanent. The U.S. established the Republic of Vietnam, or South Vietnam, with the US-backed Ngo Dinh Diem as prime minister. Although he was a nationalist, Diem, who had lived in the United States, was a committed anti-communist.

Diem’s government, however, and its Army of the Republic of Vietnam (ARVN) could not contain the communist insurgency seeking the reunification of Vietnam. The Americans provided weapons and support, but, despite a clear numerical and technological advantage, South Vietnam’s forces stumbled before insurgent Vietcong (VC) units. Diem, a corrupt leader propped up by the American government with little domestic support, was assassinated in 1963. A merry-go-round of military dictators followed as the situation in South Vietnam continued to deteriorate. The American public, though, remained largely unaware of Vietnam in the early 1960s, even as President John F. Kennedy deployed some sixteen thousand military advisers to help South Vietnam suppress a domestic communist insurgency. ((Michael P. Sullivan, The Vietnam War: A Study in the Making of American Foreign Policy (Lexington: University Press of Kentucky, 1985), 58.))

This all changed in 1964. On August 2, the USS Maddox reported incoming fire from North Vietnamese ships in the Gulf of Tonkin. Although the details of the incident are controversial, the Johnson administration seized on it as a pretext for escalating American involvement in Vietnam. Congress passed the Gulf of Tonkin Resolution, granting President Johnson the authority to deploy the American military to defend South Vietnam. U.S. Marines landed in Vietnam in March 1965, and the American ground war began.

American forces under General William Westmoreland were tasked with defending South Vietnam against the insurgent Vietcong (VC) and the regular North Vietnamese Army (NVA). But no matter how many troops the Americans sent, or how many bombs they dropped, they could not win. This was a different kind of war. Progress was not measured by cities won or territory taken, but by body counts and kill-ratios. Although American officials like Westmoreland and Secretary of Defense Robert McNamara claimed a communist defeat was on the horizon, by 1968 half-a-million American troops were stationed in Vietnam, nearly 20,000 had been killed, and the war was still no closer to being won. Protests, which would provide the backdrop for the American counterculture, erupted across the country.

 

VI. Culture and Activism

Epitomizing the folk music and protest culture of 1960s youth, Joan Baez and Bob Dylan are pictured here singing together at the March on Washington in 1963. Photograph, Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/33/Joan_Baez_Bob_Dylan.jpg.


The 1960s wrought enormous cultural change. The United States that entered the decade looked and sounded little like the one that left it. Rebellion rocked the supposedly hidebound conservatism of the 1950s as the youth counterculture became mainstream. Native Americans, Chicanos, women, and environmentalists participated in movements demonstrating that “rights” activism could be applied to ethnicity, gender, and nature. Even established religious institutions such as the Catholic Church underwent transformations reflecting an emerging emphasis on freedom and tolerance. In each instance, the decade brought about substantial progress with a reminder that the activism in each cultural realm remained fluid and unfinished.

Much of the counterculture was filtered through popular culture and consumption. The fifties consumer culture still saturated the country and advertisers continued to appeal to teenagers and the expanding youth market. During the 1960s, though, advertisers looked to a growing counterculture to sell their products. Popular culture and popular advertising in the 1950s had promoted an ethos of “fitting in” and buying products to conform. The new countercultural ethos touted individuality and rebellion. Some advertisers were subtle; ads for Volkswagens acknowledged the flaws and strange look of their cars. One ad read, “Presenting America’s slowest fastback,” which “won’t go over 72 mph even though the speedometer shows a wildly optimistic top speed of 90.” Another stated, “And if you run out of gas, it’s easy to push.” By marketing the car’s flaws and reframing them as positive qualities, the advertisers commercialized young people’s resistance to commercialism. And it positioned the VW as a car for those who didn’t mind standing out in a crowd. A more obviously countercultural ad for the VW Bug showed two cars: one black and one painted multi-color in the hippie style; the contrasting captions read, “We do our thing,” and “You do yours.”

The Volkswagen Beetle became an icon of 1960s culture and a paradigm of a new advertising age. This tongue-in-cheek advertisement attracted laughs and attention from the public and business world. http://www.videosurrey.com/wp-content/uploads/2013/03/beetle-coccinelle-volkswagen-vw-publicite-vintage-03.jpg.


Companies marketed their products as countercultural in and of themselves. One of the more obvious examples was a 1968 ad from Columbia Records, a hugely successful record label since the 1920s. The ad pictured a group of stock rebellious characters—a shaggy-haired white hippie, a buttoned up Beat, two biker types, and a black jazz man sporting an afro—in a jail cell. The counterculture had been busted, the ad states, but “the man can’t bust our music.” Merely buying records from Columbia was an act of rebellion, one that brought the buyer closer to the counterculture figures portrayed in the ad. ((Thomas Frank, The Conquest of Cool: Business Culture, Counterculture, and the Rise of Hip Consumerism (Chicago: University of Chicago Press, 1998), 7.))

But it wasn’t just advertising: the culture was changing and changing rapidly. Conservative cultural norms were falling everywhere. The dominant style of women’s fashion in the 1950s, for instance, was the poodle skirt and the sweater, tight-waisted and buttoned up. The 1960s ushered in an era of much less restrictive clothing. Capri pants became popular casual wear. Skirts became shorter. When Mary Quant invented the miniskirt in 1964, she said it was a garment “in which you could move, in which you could run and jump.” ((Brenda Polan and Roger Tredre, The Great Fashion Designers (New York: Berg, 2009), 103-104.)) By the late 1960s, the hippies’ more androgynous look became trendy. Such trends bespoke the new popular ethos of the 1960s: freedom, rebellion, and individuality.

Fashion can tell us a lot about a generation’s values and world view. Miniskirts – one of the most radical and popular fashions of the 1960s – demonstrated the new sexual openness of young women during this era of free love. Photograph of young woman in Eugene, Oregon, 1966. Wikimedia, http://commons.wikimedia.org/wiki/File:1960s_fashions_(1709303069).jpg.


In a decade plagued by social and political instability, the American counterculture also turned to psychedelic drugs as a remedy for alienation. For young, middle-class whites, society had become stagnant and bureaucratic. The New Left, for instance, arose on college campuses frustrated with the lifeless bureaucracies that they believed strangled true freedom. LSD began its life as a drug used primarily in psychological research before trickling down into college campuses and out into society at large. The counterculture’s notion that American stagnation could be remedied by a spiritual-psychedelic experience drew heavily from psychologists and sociologists. (By 1966, enough incidents had been connected to LSD to spur a Senate hearing on the drug and newspapers were reporting that hundreds of LSD users had been admitted to psychiatric wards.)

The counterculture conquered popular culture. Rock ‘n’ roll, liberalized sexuality, an embrace of diversity, recreational drug use, unalloyed idealism, and pure earnestness marked a new generation. Criticized by conservatives as culturally dangerous and by leftists as empty narcissism, the youth culture nevertheless dominated headlines and steered American culture. Perhaps 100,000 youth descended on San Francisco for the utopian promise of 1967’s Summer of Love. 1969’s Woodstock festival became shorthand for the new youth culture and its mixture of politics, protest, and personal fulfillment. While the ascendance of the hippies would be both exaggerated and short-lived, and while Vietnam and Richard Nixon shattered much of its idealism, the counterculture’s liberated social norms and its embrace of personal fulfillment still define much of American culture.

 

VII. Beyond Civil Rights

Despite substantial legislative achievements, frustrations with the slow pace of change and with the limits of the civil rights movement rose. Tensions continued to mount in cities and the tone of the civil rights movement changed yet again. Activists became less conciliatory in their calls for progress. Many embraced the more militant message of the burgeoning Black Power Movement and the late Malcolm X, a Nation of Islam (NOI) minister who had encouraged African Americans to pursue freedom, equality, and justice by “any means necessary.” Prior to his death, Malcolm X and the NOI emerged as the radical alternative to the racially integrated, largely Protestant approach of the Martin Luther King, Jr.-led civil rights movement. Malcolm advocated armed resistance in defense of the safety and well-being of black Americans, stating, “I don’t call it violence when it’s self-defense, I call it intelligence.” For his part, King and leaders from more mainstream organizations like the NAACP and the Urban League criticized both Malcolm X and the NOI for what they perceived to be racial demagoguery. King believed Malcolm’s speeches were a “great disservice” to black Americans, claiming that they lamented the problems of African Americans without offering solutions. The differences between Dr. King and Malcolm X represented a core ideological tension that would inhabit black political thought throughout the 1960s and 1970s. ((Manning Marable, Malcolm X: A Life of Reinvention (New York: Penguin, 2011).))

Like Booker T. Washington and W.E.B. Du Bois before them, Martin Luther King, Jr., and Malcolm X represented two styles of racial uplift while maintaining the same general goal of ending racial discrimination. How they would get to that goal is where the men diverged. Marion S. Trikosko, “[Martin Luther King and Malcolm X waiting for press conference],” March 26, 1964. Library of Congress, http://www.loc.gov/pictures/item/92522562/.


By the late 1960s, the Student Nonviolent Coordinating Committee (SNCC), led by figures such as Stokely Carmichael, had expelled its white members and shunned the interracial effort in the rural South, focusing instead on injustices in northern urban areas. After President Johnson refused to take up the cause of the black delegates in the Mississippi Freedom Democratic Party at the 1964 Democratic National Convention, SNCC activists became frustrated with institutional tactics and turned away from the organization’s founding principle of nonviolence over the course of the next year. This evolving, more aggressive movement called for African Americans to play a dominant role in cultivating black institutions and articulating black interests rather than relying on interracial, moderate approaches. At a June 1966 civil rights march, Carmichael told the crowd, “What we gonna start saying now is black power!” ((Peniel E. Joseph, editor, The Black Power Movement: Rethinking the Civil Rights-Black Power Era (New York: Routledge, 2013), 2.)) The slogan not only resonated with audiences, it also stood in direct contrast to King’s “Freedom Now!” campaign. The political slogan of black power could encompass many meanings, but at its core stood for the self-determination of blacks in political, economic, and social organizations.

 

The Black Panther Party used radical and incendiary tactics to bring attention to the continued oppression of blacks in America. Read the bottom paragraph on this rally poster carefully. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/e/e7/Black_Panther_DC_Rally_Revolutionary_People's_Constitutional_Convention_1970.jpg.


While Carmichael asserted that “black power means black people coming together to form a political force,” to many it also meant violence. ((Gordon Parks “Whip of Black Power,” LIFE (May 19, 1967), 82.)) In 1966, Huey Newton and Bobby Seale formed the Black Panther Party in Oakland, California. The Black Panthers became the standard-bearers for direct action and self-defense, using the concept of “decolonization” in their drive to liberate black communities from white power structures. The revolutionary organization also sought reparations and exemptions for black men from the military draft. Citing police brutality and racist governmental policies, the Panthers aligned themselves with the “other people of color in the world” against whom America was fighting abroad. Although it was perhaps best known for its open display of weapons, military-style dress, and black nationalist beliefs, the Party’s 10-Point Plan also included demands for employment, housing, and education. The Black Panthers worked in local communities to run “survival programs” that provided food, clothing, medical treatment, and drug rehabilitation. They focused on modes of resistance that empowered black activists on their own terms. ((Joshua Bloom and Waldo E. Martin, Jr., Black against Empire: The History and Politics of the Black Panther Party (Berkeley: University of California Press, 2012).))

But African Americans weren’t the only Americans struggling to assert themselves in the 1960s. The successes of the civil rights movement and growing grassroots activism inspired countless new movements. In the summer of 1961, for instance, frustrated Native American university students founded the National Indian Youth Council (NIYC) to draw attention to the plight of indigenous Americans. In the Pacific Northwest, the Council advocated for tribal fishermen to retain immunity from conservation laws on reservations and in 1964 held a series of “fish-ins”: activists and celebrities cast nets and waited for the police to arrest them. ((In 1974, fishing rights activists and tribal leaders reached a legal victory in United States v. Washington, otherwise known as the Boldt Decision, which declared that Native Americans were entitled to up to 50 percent of the fish caught in the “usual and accustomed places,” as stated in 1850s treaties.)) NIYC’s militant rhetoric and use of direct action marked the beginning of what was called the Red Power movement, an intertribal movement designed to draw attention to Native issues and protest discrimination. The American Indian Movement (AIM) and other activists staged dramatic demonstrations. In November 1969, dozens began a year-and-a-half occupation of the abandoned Alcatraz Island, in San Francisco Bay. In 1973, hundreds occupied the town of Wounded Knee, South Dakota, site of an infamous 1890 Indian massacre, for several months. ((Paul Chaat Smith and Robert Allen Warrior, Like a Hurricane: The Indian Movement from Alcatraz to Wounded Knee (New York: The New Press, 1997).))

While the Pan-Indian movement of the 1960s faded, a sign remains of the Native American occupation of Alcatraz Island in the San Francisco Bay. Photograph, July 18, 2006. Wikimedia.

Meanwhile, the Chicano movement in the 1960s emerged out of the broader Mexican American civil rights movement of the post-World War II era. While “Chicano” was initially considered a derogatory term for Mexican immigrants, activists in the 1960s reclaimed the term and used it as a catalyst to campaign for political and social change among Mexican Americans. The Chicano movement confronted discrimination in schools, politics, agriculture, and other formal and informal institutions. Organizations like the Mexican American Political Association (MAPA) and the Mexican American Legal Defense and Educational Fund (MALDEF) buoyed the Chicano movement and patterned themselves after similar influential groups in the African American civil rights movement. ((See, for instance, Juan Gómez-Quiñones and Irene Vásquez, Making Aztlán: Ideology and Culture of the Chicana and Chicano Movement, 1966-1977 (Albuquerque: University of New Mexico Press, 2014).))

Cesar Chavez became the most well-known figure of the Chicano movement, using nonviolent tactics to campaign for workers’ rights in the grape fields of California. Chavez and activist Dolores Huerta founded the National Farm Workers Association, which eventually merged with the Agricultural Workers Organizing Committee to become the United Farm Workers of America (UFWA). The UFWA fused the causes of Chicano and Filipino activists protesting the subpar working conditions of farm laborers in California’s fields. In addition to embarking on a hunger strike and a boycott of table grapes, Chavez led a 300-mile march in March and April of 1966 from Delano, California to the state capital of Sacramento. The pro-labor campaign garnered the national spotlight and the support of prominent political figures such as Robert Kennedy. Today, Chavez’s birthday (March 31) is observed as a state holiday in California, Colorado, and Texas.

The United Farm Workers Union became a strong force for bettering the working conditions of laborers in California and Florida agriculture. Cesar Chavez (center) and UFW supporters attend an outdoor Mass on the capitol steps in Sacramento, Calif., before the start of a labor protest march, date unknown. Huffington Post.

Rodolfo “Corky” Gonzales was another activist whose calls for Chicano self-determination resonated long past the 1960s. A former boxer and Denver native, Gonzales founded the Crusade for Justice in 1966, an organization that would establish the first annual Chicano Liberation Day at the National Chicano Youth Conference by decade’s end. The conference also yielded the Plan Espiritual de Aztlán, a Chicano nationalist manifesto that reflected Gonzales’ vision of Chicanos as a unified, historically grounded, all-encompassing group fighting against discrimination in the United States. By 1970, the Texas-based La Raza Unida political party had a strong foundation for promoting Chicano nationalism and continuing the campaign for Mexican American civil rights. ((Armando Navarro, Mexican American Youth Organization: Avant-Garde of the Movement in Texas (Austin: University of Texas Press, 1995); Ignacio M. Garcia, United We Win: The Rise and Fall of La Raza Unida Party (Tucson: University of Arizona Mexican American Studies Research Center, 1989).))

The 1966 Rio Grande Valley Farm Workers March (“La Marcha”). August 27, 1966. Via the University of Texas-San Antonio Libraries’ Special Collections (MS 360: E-0012-187-D-16).

The feminist movement also made great strides in the 1960s. Women were active in both the civil rights movement and the labor movement, but their increasing awareness of gender inequality did not find a receptive audience among male leaders in those movements. In the 1960s, then, many of these women began to form a movement of their own. Soon the country experienced a groundswell of feminist consciousness.

An older generation of women who preferred to work within state institutions figured prominently in the early part of the decade. When John F. Kennedy established the President’s Commission on the Status of Women in 1961, former first lady Eleanor Roosevelt headed the effort. The Commission’s Invitation to Action was released in 1963. Finding discriminatory provisions in the law and practices of industrial, labor, and governmental organizations, the Commission advocated for “changes, many of them long overdue, in the conditions of women’s opportunity in the United States.” Change was necessary in areas of employment practices, federal tax and benefit policies affecting women’s income, labor laws, and services for women as wives, mothers, and workers. This call for action, if heeded, would ameliorate the types of discrimination primarily experienced by middle-class and elite white working women, all of whom were used to advocating through institutional structures like government agencies and unions. ((Flora Davis, Moving the Mountain: The Women’s Movement in America since 1960  (Champaign: University of Illinois, 1999); Cynthia Ellen Harrison, On Account of Sex: The Politics of Women’s Issues, 1945–1968 (Berkeley: University of California Press, 1988).))

Betty Friedan’s The Feminine Mystique hit bookshelves the same year the Commission released its report. Friedan had been active in the union movement and was by this time a mother in the new suburban landscape of postwar America. In her book, Friedan named the “problem that has no name,” and in doing so helped many white middle-class American women come to see their dissatisfaction as housewives not as something “wrong with [their] marriage, or [themselves],” but instead as a social problem experienced by millions of American women. Friedan observed that there was a “discrepancy between the reality of our lives as women and the image to which we were trying to conform, the image I call the feminine mystique.” No longer would women allow society to blame the “problem that has no name” on a loss of femininity, too much education, or too much female independence and equality with men. ((Betty Friedan, The Feminine Mystique (New York: Norton, 1963), 50.))

The 1960s also saw a different group of women pushing for change in government policy. Welfare mothers began to form local advocacy groups in addition to the National Welfare Rights Organization founded in 1966. Mostly African American, these activists fought for greater benefits and more control over welfare policy and implementation. Women like Johnnie Tillmon successfully advocated for larger grants for school clothes and household equipment in addition to gaining due process and fair administrative hearings prior to termination of welfare entitlements.

Yet another mode of feminist activism was the formation of consciousness-raising groups. These groups met in women’s homes and at women’s centers, providing a safe environment for women to discuss everything from experiences of gender discrimination to pregnancy, from relationships with men and women to self-image. The goal of consciousness-raising was to increase self-awareness and validate the experiences of women. Groups framed such individual experiences as examples of society-wide sexism, and claimed that “the personal is political.” ((Carol Hanisch, “The Personal is Political,” in Shulamith Firestone and Anne Koedt, editors, Notes from the Second Year: Women’s Liberation (New York: Radical Feminism, 1970).)) Consciousness-raising groups created a wealth of personal stories that feminists could use in other forms of activism and crafted networks of women whom activists could mobilize to support protests.

The end of the decade was marked by the Women’s Strike for Equality, celebrating the fiftieth anniversary of women’s right to vote. Sponsored by NOW (the National Organization for Women), the 1970 protest focused on employment discrimination, political equality, abortion, free childcare, and equality in marriage. All of these issues foreshadowed the backlash against feminist goals in the 1970s. Not only would feminism face opposition from other women who valued the traditional homemaker role to which feminists objected, but the feminist movement would also fracture internally as minority women challenged white feminists’ racism and lesbians vied for more prominence within feminist organizations.

The women’s movement stagnated after gaining the vote in 1920, but by the 1960s it was back in full force. Inspired by the Civil Rights Movement and fed up with gender discrimination, women took to the streets to demand their rights as American citizens. Warren K. Leffler, “Women’s lib[eration] march from Farrugut Sq[uare] to Layfette [i.e., Lafayette] P[ar]k,” August 26, 1970. Library of Congress, http://www.loc.gov/pictures/item/2003673992/.

American environmentalism’s significant gains during the 1960s emerged in part from Americans’ recreational use of nature. Postwar Americans backpacked, went to the beach, fished, and joined birding organizations in greater numbers than ever before. These experiences, along with increased formal education, made Americans more aware of threats to the environment and, consequently, to themselves.  Many of these threats increased in the post-war years as developers bulldozed open space for suburbs and new hazards from industrial and nuclear pollutants loomed over all organisms.

 

By the time that biologist Rachel Carson published her landmark book, Silent Spring, in 1962, a nascent environmentalism had emerged in America. Silent Spring stood out as an unparalleled argument for the interconnectedness of ecological and human health. Pesticides, Carson argued, posed a threat to human health, and their overuse threatened the ecosystems that supported food production. Carson’s argument was compelling to many Americans, including President Kennedy, and was virulently opposed by chemical industries that suggested the book was the product of an emotional woman, not a scientist. ((Rachel Carson, Silent Spring (New York: Houghton Mifflin, 1962); Linda Lear, Rachel Carson: Witness for Nature (New York: Henry Holt and Company, 1997).))

After Silent Spring, the social and intellectual currents of environmentalism continued to expand rapidly, culminating in the largest demonstration in history, Earth Day, on April 22, 1970, and in a decade of lawmaking that significantly restructured American government. Even before the massive gathering for Earth Day, lawmakers from the local to federal level had pushed for and achieved regulations to clean up the air and water. President Richard Nixon signed the National Environmental Policy Act into law in 1970, requiring environmental impact statements for any project directed or funded by the federal government. He also created the Environmental Protection Agency, the first agency charged with studying, regulating, and disseminating knowledge about the environment. A raft of laws followed that were designed to offer increased protection for air, water, endangered species, and natural areas.

The decade’s activism worked all across society. It even affected the Catholic Church. The Second Vatican Council, called by Pope John XXIII to modernize the church and bring it in closer dialogue with the non-Catholic world, operated from 1962 to 1965, when it proclaimed multiple reforms, including the “vernacular mass” (or, mass in local languages, rather than in Latin) and a greater role for laypeople, and especially women, in the Church. Many Catholic churches adopted more informal, contemporary styles. Many conservative Catholics recoiled at what they perceived as rapid and dangerous changes, but Vatican II’s reforms in many ways created the modern Catholic Church.

Losing membership and influence throughout the world, leaders of the Catholic Church met from 1962 to 1965 to institute new measures to modernize and open the church. This ecumenical council would become known as the Second Vatican Council or Vatican II. Photograph of the grand procession of the Council Fathers at St. Peter’s Basilica, October 11, 1962. Wikimedia.

 

VIII. Conclusion

In 1969, Americans hailed the moon landing as a profound victory in the “space race” against the Soviet Union that fulfilled the promise of the late John F. Kennedy, who had declared in 1961 that the U.S. would put a man on the moon by the end of the decade. But while Neil Armstrong said his steps marked “one giant leap for mankind,” and Americans marveled at the achievement, the brief moment of wonder only punctuated years of turmoil. The Vietnam War disillusioned a generation, riots rocked cities, protests hit campuses, and assassinations robbed the nation of many of its leaders. The forward-thinking spirit of a complex decade had waned. Uncertainty loomed.

 

Contributors

This chapter was edited by Samuel Abramson, with content contributions by Samuel Abramson, Marsha Barrett, Brent Cebul, Michell Chresfield, William Cossen, Jenifer Dodd, Michael Falcone, Leif Fredrickson, Jean-Paul de Guzman, Jordan Hill, William Kelly, Lucie Kyrova, Maria Montalvo, Emily Prifogle, Ansley Quiros, Tanya Roth, and Robert Thompson.

 

Recommended Reading

  1. Branch, Taylor. Parting the Waters: America in the King Years, 1954–1963. New York: Simon & Schuster, 1988.
  2. Branch, Taylor. Pillar of Fire: America in the King Years, 1963–65. New York: Simon & Schuster, 1998.
  3. Brick, Howard. The Age of Contradictions: American Thought and Culture in the 1960s. Ithaca: Cornell University Press, 2000.
  4. Carson, Clayborne. In Struggle: SNCC and the Black Awakening of the 1960s. Cambridge: Harvard University Press, 1981.
  5. Dallek, Robert. Flawed Giant: Lyndon Johnson and His Times, 1961–1973. New York: Oxford University Press, 1998.
  6. Gitlin, Todd. The Sixties: Years of Hope, Days of Rage. New York: Bantam Books, 1987.
  7. Hall, Jacquelyn Dowd. “The Long Civil Rights Movement and the Political Uses of the Past.” Journal of American History 91 (March 2005): 1233–1263.
  8. Isserman, Maurice. If I Had a Hammer: The Death of the Old Left and the Birth of the New Left. Champaign: University of Illinois Press, 1987.
  9. Johnson, Troy R. The American Indian Occupation of Alcatraz Island: Red Power and Self-Determination. Lincoln: University of Nebraska Press, 2008.
  10. Joseph, Peniel. Waiting ‘til the Midnight Hour: A Narrative History of Black Power in America. New York: Holt, 2006.
  11. Kazin, Michael, and Maurice Isserman. America Divided: The Civil War of the 1960s. New York: Oxford University Press, 2007.
  12. McGirr, Lisa. Suburban Warriors: The Origins of the New American Right. Princeton: Princeton University Press, 2001.
  13. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  14. Perlstein, Rick. Before the Storm: Barry Goldwater and the Unmaking of the American Consensus. New York: Hill & Wang, 2001.
  15. Sugrue, Thomas. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Princeton: Princeton University Press, 2005.

 

Notes

26. The Affluent Society

Migrant Farm Workers, 1959, Michael Rougier—Time & Life Pictures/Getty Images. Via Life.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

In 1958, Harvard economist and public intellectual John Kenneth Galbraith published The Affluent Society. Galbraith’s celebrated book examined America’s new post-World War II consumer economy and political culture. The book, which popularized phrases such as “conventional wisdom,” noted the unparalleled riches of American economic growth but criticized the underlying structures of an economy dedicated to increasing production and the consumption of goods. Galbraith argued that the United States’ economy, based on an almost hedonistic consumption of luxury products, would inevitably lead to economic inequality as private sector interests enriched themselves at the expense of the American public. Galbraith warned that an economy where “wants are increasingly created by the process by which they are satisfied” was unsound, unsustainable, and, ultimately, immoral. “The Affluent Society,” he said, was anything but. ((John Kenneth Galbraith, The Affluent Society (New York: Houghton Mifflin, 1958), 129.))

The contradictions that Galbraith noted mark the decade of the 1950s. While economists and scholars continue to debate the merits of Galbraith’s warnings and predictions, his analysis was so insightful that the title of his book has come to serve as a ready label for postwar American society. In the almost two decades after the end of World War II, the American economy witnessed massive and sustained growth that reshaped American culture through the abundance of consumer goods. Standards of living climbed to unparalleled heights. All income levels shared in the gains, and inequality plummeted in what some economists have called “the Great Compression.” ((See, for example, Claudia Goldin and Robert A. Margo, “The Great Compression: The Wage Structure in the United States at Mid-Century,” Quarterly Journal of Economics 107 (February, 1992), 1-34.))

And yet, as Galbraith noted, the Affluent Society had fundamental flaws. The new consumer economy that lifted millions of Americans into its burgeoning middle class also produced inequality. Women struggled to claim equal rights as full participants in American society. The poor struggled to win access to good schools, good healthcare, and good jobs. The same suburbs that gave middle-class Americans new space left cities withering in spirals of poverty and crime. The Jim Crow South tenaciously defended segregation, and American blacks and other minorities suffered discrimination all across the country.

The contradictions of the Affluent Society defined the decade: unrivaled prosperity with crippling poverty, expanded opportunity with entrenched discrimination, and new lifestyles with stifling conformity.

 

II. The Rise of the Suburbs

Levittown, early 1950s, via Flickr user markgregory.

While the electric streetcar of the late-nineteenth century facilitated the outward movement of the well-to-do, the seeds of a suburban nation were planted in the mid-twentieth century. At the height of the Great Depression, in 1932, some 250,000 households lost their property to foreclosure. A year later, half of all U.S. mortgages were in default. The foreclosure rate stood at more than 1,000 per day. In response, FDR’s New Deal created the Home Owners Loan Corporation (HOLC), which began purchasing and refinancing existing mortgages at risk of default. HOLC introduced the amortized mortgage, allowing borrowers to pay back interest and principal over twenty to thirty years instead of the then-standard five-year mortgage that carried large balloon payments at the end of the contract. Though homeowners paid more for their homes under this new system, homeownership was opened to the multitudes, who could now gain residential stability, lower monthly mortgage payments, and accrue equity and wealth as property values rose over time. ((Price Fishback, Jonathan Rose, and Kenneth Snowden, Well Worth Saving: How the New Deal Safeguarded Home Ownership (Chicago: University of Chicago Press, 2013).))

Additionally, the Federal Housing Administration (FHA), another New Deal organization, increased access to homeownership by insuring mortgages and protecting lenders from financial loss in the event of a default. Though only slightly more than a third of homes had an FHA-backed mortgage by 1964, FHA-backed loans had a ripple effect, with private lenders granting more and more home loans even on mortgages without FHA backing. Though started in the midst of the Great Depression, the effects of government programs and subsidies like HOLC and the FHA were fully felt in the postwar economy, and they fueled the growth of homeownership and the rise of the suburbs.

Though domestic spending programs like HOLC and FHA helped create the outlines of the new consumer economy, United States involvement and the Allied victory in World War II pushed the country out of depression and into a sustained economic boom. Wartime spending exploded and, after the war, sustained spending fueled further growth. Government expenditures provided loans to veterans, subsidized corporate research and development, and built the Interstate Highway System. In the decades after World War II, business boomed, unionization peaked, wages rose, and sustained growth buoyed a new consumer economy. The Servicemen’s Readjustment Act (The G.I. Bill), passed in 1944, offered low-interest home loans, a stipend to attend college, loans to start a business, and unemployment benefits.

The rapid growth of homeownership and the rise of suburban communities helped drive the postwar economic boom. Builders created sprawling neighborhoods of single-family homes on the outskirts of American cities. William Levitt built the first Levittown, the prototypical suburban community, in 1946 on Long Island, New York. Purchasing large acreage, “subdividing” lots, and contracting crews to build countless homes at economies of scale, Levitt offered affordable suburban housing to veterans and their families. Levitt became the prophet of the new suburbs. His developments heralded a massive internal migration. The country’s suburban share of the population rose from 19.5% in 1940 to 30.7% by 1960. Homeownership rates rose from 44% in 1940 to almost 62% in 1960. Between 1940 and 1950, suburban communities of greater than 10,000 people grew 22.1%, and planned communities grew at an astonishing rate of 126.1%. ((Leo Schnore, “The Growth of Metropolitan Suburbs,” American Sociological Review 22 (April, 1957), 169.)) As historian Lizabeth Cohen notes, these new suburbs “mushroomed in territorial size and the populations they harbored.” ((Lizabeth Cohen, A Consumers’ Republic: The Politics of Mass Consumption in Postwar America (New York: Random House, 2002), 202.)) Between 1950 and 1970, America’s suburban population nearly doubled to 74 million, with 83 percent of all population growth occurring in suburban places. ((Elaine Tyler May, Homeward Bound: American Families in the Cold War Era (New York, Basic Books, 1999), 152.))

The postwar construction boom fed into countless industries. As manufacturers converted back to consumer goods after the war, and as the suburbs developed, appliance and automobile sales rose dramatically. Flush with rising wages and wartime savings, homeowners also used newly created installment plans to buy new consumer goods at once instead of saving for years to make major purchases. The mass-distribution of credit cards, first issued in 1950, further increased homeowners’ access to credit. Fueled by credit and no longer stymied by the Depression or wartime restrictions, consumers bought countless washers, dryers, refrigerators, freezers, and, suddenly, televisions. The percentage of Americans that owned at least one television increased from 12% in 1950 to more than 87% in 1960. This new suburban economy also led to increased demand for automobiles. The percentage of American families owning cars increased from 54% in 1948 to 74% in 1959. Motor fuel consumption rose from some 22 million gallons in 1945 to around 59 million gallons in 1958. ((Leo Fishman, The American Economy (Princeton: D. Van Nostrand, 1962), 560.))

While the car had been around for decades by the 1950s, car culture really took off as a national fad during the decade. Arthur C. Base, August 1950 issue of Science and Mechanics. Wikimedia.

The rise of the suburbs transformed America’s countryside. Suburban growth claimed millions of acres of rural space and turned agrarian communities into suburban landscapes. Suburban development wrenched more and more agricultural workers off the land and often pushed them into the very cities suburbanites were busy fleeing.

The process of suburbanization drove the movement of Americans and turned the wheels of the new consumer economy. Seen from a macroeconomic level, the postwar economic boom turned America into a land of economic abundance. For advantaged buyers, loans had never been easier to attain, consumer goods had never been more accessible, and well-paying jobs had never been more abundant. And yet, beneath the aggregate numbers, patterns of racial disparity, sexual discrimination, and economic inequality persevered, undermining many of the assumptions of an Affluent Society.

In 1939 real estate appraisers arrived in sunny Pasadena, California. Armed with elaborate questionnaires to evaluate the city’s building conditions, the appraisers were well-versed in the policies of the Home Owners Loan Corporation (HOLC). In one neighborhood, the majority of structures were rated in “fair” repair and it was noted that there was a lack of “construction hazards or flood threats.” However, appraisers concluded that the area “is detrimentally affected by 10 owner occupant Negro families.” While “the Negroes are said to be of the better class,” the appraisers concluded, “it seems inevitable that ownership and property values will drift to lower levels.” ((David Kushner, Levittown: Two Families, One Tycoon, and the Fight for Civil Rights in America’s Legendary Suburb (New York: Bloomsbury, 2009), 17.))

While suburbanization and the new consumer economy produced unprecedented wealth and affluence, the fruits of this economic and spatial abundance did not reach all Americans equally. The new economic structures and suburban spaces of the postwar period produced perhaps as much inequality as affluence. Wealth created by the booming economy filtered through social structures with built-in privileges and prejudices. Just when many middle and lower class white American families began their journey of upward mobility by moving to the suburbs with the help of government spending and government programs such as the FHA and the GI Bill, many African Americans and other racial minorities found themselves systematically shut out.

A look at the relationship between federal organizations such as the HOLC and FHA and private banks, lenders, and real estate agents tells the story of standardized policies that produced a segregated housing market. At the core of HOLC appraisal techniques, which private parties also adopted, was the pernicious insistence that mixed-race and minority dominated neighborhoods were credit risks. In partnership with local lenders and real estate agents, HOLC created Residential Security Maps to identify high and low risk-lending areas. People familiar with the local real estate market filled out uniform surveys on each neighborhood. Relying on this information, HOLC assigned every neighborhood a letter grade from A to D and a corresponding color code. The least secure, highest risk neighborhoods for loans received a D grade and the color red. Banks refused to loan money in these “redlined” areas. ((Thomas Sugrue, The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit (Princeton, NJ: Princeton University Press, 2005).))

Black communities in cities like Detroit, Chicago, Brooklyn, and Atlanta experienced “redlining,” the process by which banks and other organizations demarcated minority neighborhoods on a map with a red line. Doing so made visible the areas they believed were unfit for their services, denying black residents loans, housing, groceries, and other necessities of modern life. Redlined Map of Greater Atlanta. National Archives.

1938 Brooklyn Redline map. UrbanOasis.org via National Archives (NARA II RG 195 Entry 39 Folder “Brooklyn (Kings Co.)” Box 58). Edited by ASommer, PlaNYourCity.

Phrases like “subversive racial elements” and “racial hazards” pervade the redlined area description files of surveyors and HOLC officials. Los Angeles’ Echo Park neighborhood, for instance, had concentrations of Japanese and African Americans and a “sprinkling of Russians and Mexicans.” The HOLC security map and survey noted that the neighborhood’s “adverse racial influences which are noticeably increasing inevitably presage lower values, rentals and a rapid decrease in residential desirability.” ((Becky M. Nicolaides, My Blue Heaven: Life and Politics in the Working-Class Suburbs of Los Angeles, 1920-1965 (Chicago: University of Chicago Press, 2002), 193.))

While the HOLC was a fairly short-lived New Deal agency, the influence of its security maps lived on in the Federal Housing Administration (FHA) and the Veterans Administration (VA), which administered the GI Bill. Both of these government organizations, which set the standard that private lenders followed, refused to back bank mortgages that did not adhere to HOLC’s security maps. On the one hand, FHA- and VA-backed loans were an enormous boon to those who qualified for them. Millions of Americans received mortgages that they otherwise would not have qualified for. But FHA-backed mortgages were not available to all. Racial minorities could not get loans for property improvements in their own neighborhoods—seen as credit risks—and were denied mortgages to purchase property in other areas for fear that their presence would extend the red line into a new community. Levittown, the poster child of the new suburban America, only allowed whites to purchase homes. Thus HOLC policies and private developers increased homeownership and stability for white Americans while simultaneously creating and enforcing racial segregation.

The exclusionary structures of the postwar economy prompted protest from African Americans and other minorities who were excluded. Over time the federal government attempted to rectify the racial segregation created, or at least facilitated, by its own policies. In 1948, the U.S. Supreme Court ruled in Shelley v. Kraemer that racially restrictive neighborhood housing covenants were legally unenforceable, stripping explicit racial restrictions on home sales of the courts’ backing. Discrimination and segregation continued, however, and fair housing would emerge as a major plank of the brewing civil rights movement.

During the 1950s and early 1960s many Americans retreated to the suburbs to enjoy the new consumer economy and search for some normalcy and security after the instability of depression and war. But many could not. It was both the limits and opportunities of housing, then, that shaped the contours of postwar American society.

 

III. Race and Education

School desegregation was a tense experience for all involved, but none more so than the African American students brought into white schools. The “Little Rock Nine” were the first to do this in Arkansas; their escorts, the 101st Airborne Division of the U.S. Army, provided protection to these students who so bravely took that first step. Photograph, 1957. Wikimedia, http://commons.wikimedia.org/wiki/File:101st_Airborne_at_Little_Rock_Central_High.jpg.

Older battles over racial exclusion also confronted postwar American society. One long-simmering struggle targeted segregated schooling. Since the Supreme Court’s decision in Plessy v. Ferguson (1896), black Americans, particularly in the American South, had fully felt the deleterious effects of segregated education. Their battle against Plessy for inclusion in American education stretched across half a century before the Supreme Court again took up the merits of “separate but equal.”

On May 17, 1954, after two years of argument, re-argument, and deliberation, Chief Justice Earl Warren announced the Supreme Court’s decision on segregated schooling in Brown v. Board of Education (1954). The court found, by a unanimous 9-0 vote, that racial segregation violated the Equal Protection Clause of the Fourteenth Amendment. The court’s decision declared, “Separate educational facilities are inherently unequal.” “Separate but equal” was made unconstitutional. ((Oliver Brown, et al. v. Board of Education of Topeka, et al., 347 U.S. 483 (1954).))

Decades of African American-led litigation, local agitation against racial inequality, and liberal Supreme Court justices made Brown v. Board possible. In the early 1930s, the National Association for the Advancement of Colored People (NAACP) began a concerted effort to erode the legal underpinnings of segregation in the American South. Legal, or de jure, segregation subjected racial minorities to discriminatory laws and policies. Law and custom in the South hardened anti-black restrictions. But through a series of carefully chosen and contested court cases concerning education, disfranchisement, and jury selection, NAACP lawyers such as Charles Hamilton Houston, Robert L. Carter, and future Supreme Court Justice Thurgood Marshall undermined Jim Crow’s constitutional underpinnings. The NAACP initially sought to demonstrate that states systematically failed to provide African American students “equal” resources and facilities, and thus failed to live up to Plessy. By the late 1940s, activists began to more forcefully challenge the assumption that “separate” was constitutional at all.

The NAACP was a central organization in the fight to end segregation, discrimination, and injustice based on race. NAACP leaders, including Thurgood Marshall (who would become the first African American Supreme Court Justice), hold a poster decrying racial bias in Mississippi in 1956. Photograph, 1956. Library of Congress, http://www.loc.gov/pictures/item/99401448/.

Though remembered as just one lawsuit, Brown consolidated five separate cases that had originated in the southeastern United States: Briggs v. Elliott (South Carolina), Davis v. County School Board of Prince Edward County (Virginia), Gebhart v. Belton (Delaware), Bolling v. Sharpe (Washington, D.C.), and Brown v. Board of Education (Kansas). Working with local activists already involved in desegregation fights, the NAACP purposely chose cases with a diverse set of local backgrounds to show that segregation was not just an issue in the Deep South, and that a sweeping judgment on the fundamental constitutionality of Plessy was needed.

Briggs v. Elliott, the first case accepted by the NAACP, illustrated the plight of segregated black schools. Briggs originated in rural Clarendon County, South Carolina, where taxpayers in 1950 spent $179 to educate each white student and $43 for each black student. The district’s twelve white schools were cumulatively worth $673,850; the value of its sixty-one black schools (mostly dilapidated, overcrowded shacks) was $194,575. ((James T. Patterson and William W. Freehling, Brown v. Board of Education: A Civil Rights Milestone and Its Troubled Legacy (New York: Oxford University Press, 2001), 25; Pete Daniel, Standing at the Crossroads: Southern Life in the Twentieth Century (Baltimore: Johns Hopkins University Press, 1996), 164.)) While Briggs underscored the South’s failure to follow Plessy, the Brown v. Board suit focused less on material disparities between black and white schools (which were significantly smaller than in places like Clarendon County) and more on the social and spiritual degradation that accompanied legal segregation. This case cut to the basic question of whether or not “separate” was itself inherently unequal. The NAACP said the two notions were incompatible. As one witness before the U.S. District Court of Kansas said, “the entire colored race is craving light, and the only way to reach the light is to start [black and white] children together in their infancy and they come up together.” ((Ibid., xxv.))

To make its case, the NAACP marshaled historical and social scientific evidence. The Court found the historical evidence inconclusive and drew its ruling more heavily from the NAACP’s argument that segregation psychologically damaged black children. To make this argument, association lawyers relied on social scientific evidence, such as the famous doll experiments of Kenneth and Mamie Clark. The Clarks demonstrated that young white girls would choose to play with white dolls and that young black girls would, too. The Clarks argued that black children’s aesthetic and moral preference for white dolls demonstrated the pernicious effects and self-loathing produced by segregation.

Identifying and denouncing injustice, though, is different from rectifying it. Though Brown repudiated Plessy, the Court’s orders did not extend to segregation in places other than public schools and, even then, while recognizing the historical importance of the decision, the justices set aside the divisive yet essential question of remediation and enforcement to preserve a unanimous decision. Their infamously ambiguous order in 1955 (what came to be known as Brown II) that school districts desegregate “with all deliberate speed” was so vague and ineffectual that it left the actual business of desegregation in the hands of those who opposed it.

In most of the South, as well as the rest of the country, school integration did not occur on a wide scale until well after Brown. Not until the Civil Rights Act of 1964 did the federal government begin to enforce the Brown decision, threatening to withhold funding from recalcitrant school districts, and even then southern districts found loopholes. Court decisions such as Green v. New Kent County (1968) and Alexander v. Holmes (1969) finally closed some of those loopholes, such as “freedom of choice” plans, and compelled some measure of actual integration.

When Brown finally was enforced in the South, the quantitative impact was staggering. In the early 1950s, virtually no southern black students attended white schools. By 1968, fourteen years after Brown, some eighty percent of black southerners remained in schools that were ninety- to one-hundred-percent nonwhite. By 1972, though, just twenty-five percent were in such schools, and fifty-five percent attended schools in which nonwhite students were a minority. By many measures, the public schools of the South became, ironically, the most integrated in the nation. ((Charles T. Clotfelter, After Brown: The Rise and Retreat of School Desegregation (Princeton: Princeton University Press, 2011), 6.))

As a landmark moment in American history, Brown’s significance perhaps lies less in what immediate tangible changes it wrought in African American life—which were slow, partial, and inseparable from a much longer chain of events—than in the idealism it expressed and the momentum it created. The nation’s highest court had attacked one of the fundamental supports of Jim Crow segregation and offered constitutional cover for the creation of one of the greatest social movements in American history.

 

IV. Civil Rights in an Affluent Society

Segregation extended beyond private business property; this segregated drinking fountain was located on the grounds of the Halifax County courthouse in North Carolina. Photograph, April 1938. Wikimedia, http://commons.wikimedia.org/wiki/File:Segregation_1938b.jpg.

Education was but one aspect of the nation’s Jim Crow machinery. African Americans had long been fighting racist policies, cultures, and beliefs in all aspects of American life. And while the struggle for black inclusion had few victories before World War II, the war and the “Double V” campaign, as well as the postwar economic boom, led to rising expectations for many African Americans. When persistent racism and racial segregation undercut the promise of economic and social mobility, African Americans began mobilizing on an unprecedented scale against discriminatory social and legal structures.

While many of the civil rights movement’s most memorable and important moments, such as the sit-ins, the freedom rides, and especially the March on Washington, occurred in the 1960s, the 1950s were a significant decade in the sometimes-tragic, sometimes-triumphant march of civil rights in the United States. In 1953, years before Rosa Parks’ iconic confrontation on a Montgomery city bus, an African American woman named Sarah Keys publicly challenged segregated public transportation. Keys, then serving in the Women’s Army Corps, traveled from her army base in New Jersey back to North Carolina to visit her family. When the bus stopped in North Carolina, the driver asked her to give up her seat for a white customer. Her refusal to do so landed her in jail in 1953 and led to a landmark 1955 decision, Sarah Keys v. Carolina Coach Company, in which the Interstate Commerce Commission ruled that “separate but equal” violated the Interstate Commerce Act. Poorly enforced, the ruling nevertheless gave legal cover to the freedom riders years later and motivated further assaults against Jim Crow.

But if some events encouraged civil rights workers with the promise of progress, others were so savage they convinced activists that they could do nothing but resist. In the summer of 1955, two white men in Mississippi kidnapped and brutally murdered a fourteen-year-old boy, Emmett Till. Till, visiting from Chicago and perhaps unfamiliar with the etiquette of Jim Crow, allegedly whistled at a white woman named Carolyn Bryant. Her husband, Roy Bryant, and another man, J. W. Milam, abducted Till from his relatives’ home, beat him, mutilated him, shot him, and threw his body in the Tallahatchie River. But the body was found. Emmett’s mother held an open-casket funeral so that Till’s disfigured body could make national news. The men were brought to trial. The evidence was damning, but an all-white jury found the two not guilty. Only months after the decision, the two boasted of their crime in Look magazine. For young black men and women soon to propel the civil rights movement, the Till case was an indelible lesson.

On December 1, 1955, four months after Till’s death and six days after the Keys v. Carolina Coach Company decision, Rosa Parks refused to surrender her seat on a Montgomery city bus and was arrested. Montgomery’s public transportation system had longstanding rules requiring African American passengers to sit in the back of the bus and to give up their seats to white passengers if the buses filled. Parks was not the first to protest the policy by staying seated, but she was the first around whom Montgomery activists rallied.

Montgomery’s black population, under the leadership of a recently arrived, twenty-six-year-old Baptist minister named Martin Luther King Jr., formed the Montgomery Improvement Association (MIA) and coordinated an organized boycott of the city’s buses. The Montgomery Bus Boycott lasted from December 1955 until December 20, 1956, when a Supreme Court order integrating the city’s buses took effect. The boycott not only crushed segregation in Montgomery’s public transportation, it energized the entire civil rights movement and established the leadership of Martin Luther King Jr.

Motivated by the success of the Montgomery boycott, King and other African American leaders looked to continue the fight. In 1957, King helped create the Southern Christian Leadership Conference (SCLC) to coordinate civil rights groups across the South and buoy their efforts organizing and sustaining boycotts, protests, and other assaults against southern Jim Crow laws.

As pressure built, Congress passed the Civil Rights Act of 1957, the first such measure passed since Reconstruction. The act was compromised away nearly to nothing, although it did achieve some gains, such as creating the Civil Rights Division within the Department of Justice and a Civil Rights Commission charged with investigating claims of racial discrimination. And yet, despite its weakness, the act signaled that pressure was finally mounting on Americans to confront the legacy of discrimination.

Despite successes at both the local and national level, the civil rights movement faced bitter opposition. Those opposed to the movement often used violent tactics to scare and intimidate African Americans and subvert legal rulings and court orders. For example, a year into the Montgomery bus boycott, angry white southerners bombed four African American churches as well as the homes of King and fellow civil rights leader E. D. Nixon. Though King, Nixon, and the MIA persevered in the face of such violence, it was only a taste of things to come. Such unremitting hostility and violence left the outcome of the burgeoning civil rights movement in doubt. Activists looked back on the 1950s as a decade of, at best, mixed results and incomplete accomplishments. While the bus boycott, Supreme Court rulings, and other civil rights activities signaled progress, church bombings, death threats, and stubborn legislators demonstrated the distance that still needed to be traveled.

 

V. Gender and Culture in the Affluent Society

New inventions to make housework easier and more fun for women proliferated in the post-war era, creating an advertising and marketing frenzy to attract female consumers to certain products. http://envisioningtheamericandream.files.wordpress.com/2013/03/housewives-chores.jpg.

America’s consumer economy reshaped how Americans experienced culture and shaped their identities. The Affluent Society gave Americans new experiences, new outlets, and new ways to understand and interact with one another.

“The American household is on the threshold of a revolution,” the New York Times declared in August 1948. “The reason is television.” ((Lewis L. Gould, Watching Television Come of Age: The New York Times Reviews by Jack Gould (Austin: University of Texas Press, 2002), 186.)) A distinct post-war phenomenon, television was actually several years in the making before it transformed postwar American culture. Though presented to the American public at the New York World’s Fair in 1939, the commercialization of television in the United States lagged during the war years. In 1947, though, regular full-scale broadcasting became available to the public. Television was instantly popular, so much so that by early 1948 Newsweek reported that it was “catching on like a case of high-toned scarlet fever.” ((Gary Edgerton, Columbia History of American Television (New York: Columbia University Press, 2009), 90.)) Indeed, between 1948 and 1955 close to two-thirds of the nation’s households purchased a television set. By the end of the 1950s, ninety percent of American families had one and the average viewer was tuning in for almost five hours a day. ((Ibid., 178.))

The technological ability to transmit images via radio waves gave birth to television. Television borrowed radio’s organizational structure, too. The big radio broadcasting companies, NBC, CBS, and ABC, used their technical expertise and capital reserves to conquer the airwaves. They acquired licenses to local stations and eliminated their few independent competitors. The Federal Communications Commission’s (FCC) refusal to issue any new licenses between 1948 and 1955 was a de facto endorsement of the big three’s stranglehold on the market.

In addition to replicating radio’s organizational structure, television also looked to radio for content. Many of the early programs were adaptations of popular radio variety and comedy shows, including The Ed Sullivan Show and Milton Berle’s Texaco Star Theater. These were accompanied by live plays, dramas, sports, and situation comedies. Due to the cost and difficulty of recording, most programs were broadcast live, forcing stations across the country to air shows at the same time. And since audiences had a limited number of channels to choose from, viewing experiences were broadly shared. Upwards of two-thirds of television-owning households, for instance, watched popular shows such as I Love Lucy.

The limited number of channels and programs meant that networks selected programs that appealed to the widest possible audience to draw viewers and, more importantly, television’s greatest financiers: advertisers. By the mid-1950s, an hour of primetime programming cost about $150,000 (about $1.5 million in today’s dollars) to produce. This proved too expensive for most commercial sponsors, who began turning to a joint financing model of 30-second spot ads. The commercial need to appeal to as many people as possible promoted the production of shows aimed at the entire family. Programs such as Father Knows Best and Leave It to Beaver featured light topics, humor, and a guaranteed happy ending the whole family could enjoy. ((Christopher H. Sterling and John Michael Kittross, Stay Tuned: A History of American Broadcasting (New York: Routledge, 2001), 364.))

Advertising was everywhere in the 1950s. No longer confined to commercials or newspapers, advertisements were subtly (or not so subtly) worked into TV shows like the quiz show “21,” sponsored by Geritol, a dietary supplement. Orlando Fernandez, “[Quiz show "21" host Jack Barry turns toward contestant Charles Van Doren as fellow contestant Vivienne Nearine looks on],” 1957. Library of Congress, http://www.loc.gov/pictures/item/00652124/.

Television’s broad appeal, however, was about more than money and entertainment. Shows of the 1950s, such as Father Knows Best and I Love Lucy, idealized the nuclear family, “traditional” gender roles, and white, middle-class domesticity. Leave It to Beaver, which became the prototypical example of the 1950s’ television family, depicted its breadwinner-father and homemaker-mother guiding their children through life lessons. Such shows, and Cold War America more broadly, reflected a popular consensus that such lifestyles were not only beneficial, but the most effective way to safeguard American prosperity against communist threats and social “deviancy.”

 

The marriage of the suburban consumer culture and Cold War security concerns facilitated, and was in turn supported by, the ongoing postwar baby boom. From 1946 to 1964, American fertility experienced an unprecedented spike. A century of declining birth rates abruptly reversed. Although popular memory attributes the baby boom to the return of virile soldiers from battle, the real story is more nuanced. After years of economic depression, families were now wealthy enough to support larger families and had homes large enough to accommodate them, while women married younger and American culture celebrated the ideal of a large, insular family.

Underlying this “reproductive consensus” was the new cult of professionalism that pervaded postwar American culture, including the professionalization of homemaking. Mothers and fathers alike flocked to the experts for their opinions on marriage, sexuality, and, most especially, child-rearing. Psychiatrists held an almost mythic status as people took their opinions and prescriptions, as well as their vocabulary, into their everyday life. Books like Dr. Spock’s Baby and Child Care (1946) were diligently studied by women who treated homemaking as just that: a career, complete with all the demands and professional trappings of job development and training. And since most women had multiple children roughly the same age as their neighbors’ children, a cultural obsession with kids flourished throughout the decade. Women bore the brunt of this pressure: they were chided if they did not give enough of their time to the children—especially if it was at the expense of a career—yet cautioned that spending too much time with them would lead to “Momism,” producing “sissy” boys who would be incapable of contributing to society and extremely susceptible to the communist threat.

A new youth culture exploded in American popular culture. On the one hand, the anxieties of the atomic age hit America’s youth particularly hard. Keenly aware of the discontent bubbling beneath the surface of the Affluent Society, for instance, many youth embraced rebellion. The 1955 film Rebel Without a Cause demonstrated the restlessness and emotional incertitude of the postwar generation, raised in increasing affluence yet increasingly unsatisfied with their comfortable lives. At the same time, perhaps yearning for something beyond the “massification” of American culture yet having few other options to turn to beyond popular culture, American youth embraced rock ‘n’ roll. They listened to Little Richard, Buddy Holly, and especially Elvis Presley (whose sexually suggestive hip movements were judged subversive).

While an accepted part of culture in the twenty-first century, rock and roll was seen by many in the 1950s as devilish, a corruptive influence on the youth of America. Chuck Berry defined the rhythm and style that made rock and roll so distinctive and irresistible. Publicity photo, c. 1971. Wikimedia, http://commons.wikimedia.org/wiki/File:Chuck_Berry_1971.JPG.

The popularity of rock and roll, which emerged in the postwar years, had not yet blossomed into the countercultural musical revolution of the coming decade, but it provided a magnet for teenage restlessness and rebellion. “Television and Elvis,” the musician Bruce Springsteen would recollect, “gave us full access to a new language, a new form of communication, a new way of being, a new way of looking, a new way of thinking; about sex, about race, about identity, about life; a new way of being an American, a human being; and a new way of hearing music.” American youth had seen so little of Elvis’ energy and sensuality elsewhere in their culture. “Once Elvis came across the airwaves,” Springsteen said, “once he was heard and seen in action, you could not put the genie back in the bottle. After that moment, there was yesterday, and there was today, and there was a red hot, rockabilly forging of a new tomorrow, before your very eyes.” ((Bruce Springsteen, “SXSW Keynote Address,” Rolling Stone (March 28, 2012), http://www.rollingstone.com/music/news/exclusive-the-complete-text-of-bruce-springsteens-sxsw-keynote-address-20120328.))

While black musicians like Chuck Berry created rock and roll, it was brought into mainstream (white) American culture through performers like Elvis. His good looks, sensual dancing, and sonorous voice stole the hearts of millions of American teenage girls, who were at that moment becoming a central segment of the consumer population. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/35/Elvis_Presley_Jailhouse_Rock.jpg.

Other Americans took larger steps to reject the expected conformity of the Affluent Society. The writers and poets and musicians of the Beat Generation, disillusioned with capitalism, consumerism, and traditional gender roles, sought a deeper meaning in life. Beats traveled across the country, studied Eastern religions, and experimented with drugs and sex and art.

Behind the scenes, Americans were challenging sexual mores. The gay rights movement, for instance, stretched back into the Affluent Society. While the country proclaimed homosexuality a mental disorder, gay men established the Mattachine Society in Los Angeles and gay women formed the Daughters of Bilitis in San Francisco as support groups. They held meetings, distributed literature, provided legal and counseling services, and formed chapters across the country. Much of their work, however, remained secretive because homosexuals risked arrest and abuse if discovered. ((John D’Emilio, Sexual Politics, Sexual Communities, Second Edition (Chicago: University of Chicago Press, 2012), 102-103.))

Society’s “consensus,” on everything from the consumer economy to gender roles, did not go unchallenged. Much discontent was channeled through the machine itself: advertisers sold rebellion no less than they sold baking soda. And yet others were rejecting the old ways, choosing new lifestyles, challenging old hierarchies, and embarking upon new paths.

 

VI. Politics and Ideology in the Affluent Society

Postwar economic prosperity and the creation of new suburban spaces inevitably shaped Americans’ politics. In stark contrast to the Great Depression, the new prosperity renewed belief in the superiority of capitalism, cultural conservatism, and religion.

In the 1930s, the ravages of the international economic catastrophe knocked the legs out from under the intellectual justifications for keeping government out of the economy. And yet, despite the inhospitable intellectual and cultural climate, there were pockets of true believers who kept the gospel of the free market alive. The single most important was the National Association of Manufacturers (NAM). In the midst of the depression, NAM, under the leadership of a group known as the “Brass Hats,” reinvented itself and went on the offensive, initiating advertising campaigns supporting “free enterprise” and “The American Way of Life.” ((See Richard Tedlow, “The National Association of Manufacturers and Public Relations During the New Deal,” The Business History Review 50 (Spring 1976), 25-45; and Wendy Wall, Inventing the “American Way”: The Politics of Consensus from the New Deal to the Civil Rights Movement (New York: Oxford University Press, 2008).)) More importantly, NAM became a node for business leaders, such as J. Howard Pew of Sun Oil and Jasper Crane of DuPont Chemical Co., to network with like-minded individuals and take the message of free enterprise to the American people. The network of business leaders that NAM brought together in the midst of the Great Depression formed the financial, organizational, and ideological underpinnings of the free market advocacy groups that emerged and found ready adherents in America’s new suburban spaces in the post-war decades.

One of the most important advocacy groups that sprang up after the war was Leonard Read’s Foundation for Economic Education. Read founded FEE in 1946 on the premise that “The American Way of Life” was essentially individualistic and that the best way to protect and promote that individualism was through libertarian economics. FEE, whose advisory board and supporters came mostly from the NAM network of Pew and Crane, became a key ideological factory, supplying businesses, service clubs, churches, schools, and universities with a steady stream of libertarian literature, much of it authored by Austrian economist Ludwig von Mises. ((Gregory Eow, “Fighting a New Deal: Intellectual Origins of the Reagan Revolution, 1932-1952” (Ph.D. dissertation, Rice University, 2007); Brian Doherty, Radicals for Capitalism: A Freewheeling History of the Modern American Libertarian Movement (New York: Public Affairs, 2007); and Kim Phillips-Fein, Invisible Hands: The Businessmen’s Crusade Against the New Deal (New York: W. W. Norton, 2009), 43-55.))

Shortly after FEE’s formation, Austrian economist and libertarian intellectual Friedrich Hayek founded the Mont Pelerin Society (MPS) in 1947. Unlike FEE, whose focus was more ideological in nature, the MPS’s focus on the intellectual work of promoting and improving capitalism brought together intellectuals from both sides of the Atlantic in common cause. Like FEE, many of the lay supporters of the MPS, such as Pew and Jasper Crane, also came from the NAM network. The MPS successfully challenged liberal, Keynesian economics on its home turf, academia, particularly when the brilliant University of Chicago economist Milton Friedman became its president. Friedman’s willingness to advocate for and apply his libertarian economics in the political realm made him, and the MPS, one of the most influential free market advocates in the world. Together with the Chicago School of Economics, the MPS carved out a critical space in academia that legitimized the libertarian ideology so successfully evangelized by FEE, its descendant organizations, and libertarian popularizers such as Ayn Rand. ((Angus Burgin, The Great Persuasion: Reinventing Free Markets Since the Great Depression (Cambridge: Harvard University Press, 2012); Jennifer Burns, Goddess of the Market: Ayn Rand and the American Right (New York: Oxford University Press, 2009).))

Libertarian politics and evangelical religion were shaping the origins of a conservative, suburban constituency. Suburban communities’ distance from government and other top-down community-building mechanisms left a social void that evangelical churches eagerly filled. More often than not the theology and ideology of these churches reinforced socially conservative views while simultaneously reinforcing congregants’ belief in economic individualism. These new communities and the suburban ethos of individualism that accompanied them became the building blocks for a new political movement. And yet, while the growing suburbs, and the conservative ideology that found a ready home there, eventually proved immensely important in American political life, their impact was not immediately felt. They did not yet have a champion.

In the post-World War II years the Republican Party faced a fork in the road. Its complete lack of electoral success since the Depression led to a battle within the party about how to revive its electoral prospects. The more conservative faction, represented by Ohio Senator Robert Taft (son of former President William Howard Taft) and backed by many party activists and financiers such as J. Howard Pew, sought to take the party further to the right, particularly in economic matters, by rolling back New Deal programs and policies. On the other hand, the more moderate wing of the party led by men such as New York Governor Thomas Dewey and Nelson Rockefeller sought to embrace and reform New Deal programs and policies. There were further disagreements among party members about how involved the United States should be in the world. Issues such as foreign aid, collective security, and how best to fight Communism divided the party.

Just like the internet, don’t always trust what you read in newspapers. This obviously incorrect banner from the front page of the Chicago Tribune on November 3, 1948 made its own headlines as the newspaper’s most embarrassing gaffe. Photograph, 1948. http://media-2.web.britannica.com/eb-media/14/65214-050-D86AAA4E.jpg.

Initially, the moderates, or “liberals,” won control of the party with the nomination of Thomas Dewey in 1948. Dewey’s shocking loss to Truman, however, emboldened conservatives, who rallied around Taft as the 1952 presidential primaries approached. With the conservative banner riding high in the party, General Dwight Eisenhower, most recently NATO supreme commander, felt obliged to join the race in order to beat back the conservatives and “prevent one of our great two Parties from adopting a course which could lead to national suicide.” In addition to his fear that Taft and the conservatives would undermine collective security arrangements such as NATO, he also berated the “neanderthals” in his party for their anti-New Deal stance. Eisenhower felt that the best way to stop Communism was to undercut its appeal by alleviating the conditions under which it was most attractive. That meant supporting New Deal programs. There was also a political calculus to Eisenhower’s position. He observed, “Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history.” ((Allan J. Lichtman, White Protestant Nation: The Rise of the American Conservative Movement (New York: Atlantic Monthly Press, 2008), 180, 201, 185.))

The primary contest between Taft and Eisenhower was close and controversial, with Taft supporters claiming that Eisenhower stole the nomination from Taft at the convention. Eisenhower, attempting to placate the conservatives in his party, picked California Congressman and virulent anti-Communist Richard Nixon as his running mate. With the Republican nomination sewn up, the immensely popular Eisenhower swept to victory in the 1952 general election, easily besting Truman’s hand-picked successor, Adlai Stevenson. Eisenhower’s popularity boosted Republicans across the country, leading them to majorities in both houses of Congress.

The Republican sweep in the 1952 election proved less momentous than its supporters hoped. Eisenhower’s popularity helped elect a Congress that was more conservative than he had hoped. Within two years of his election, Eisenhower saw his legislative proposals routinely defeated by an unlikely alliance of conservative Republicans, who thought Eisenhower was going too far, and liberal Democrats, who thought he was not going far enough. For example, in 1954 Eisenhower proposed a national health care plan that would have provided federal support for increasing health care coverage across the nation without getting the government directly involved in regulating the health care industry. The proposal was defeated in the House by a 238-134 vote, with a swing bloc of 75 conservative Republicans joining liberal Democrats voting against the plan. ((Steven Wagner, Eisenhower Republicanism: Pursuing the Middle Way (DeKalb: Northern Illinois University Press, 2006), 15.)) Eisenhower’s proposals in education and agriculture often suffered similar defeats. By the end of his presidency, Ike’s domestic legislative achievements were largely limited to expanding Social Security, making Health, Education, and Welfare (HEW) a cabinet-level department, and passing the National Defense Education Act, which bolstered federal support to education, particularly in math and science.

Like any president’s, Eisenhower’s record was shaped as much by his impact outside the legislative arena as within it. Ike’s “Middle-of-the-Road” philosophy guided his foreign policy as much as his domestic policy. Indeed, just as he used federal dollars to empower state and local governments as well as individuals at home, his foreign policy sought to keep the United States from intervening abroad by bolstering its allies. Thus Ike funneled money to the French in Vietnam fighting the Communists led by Ho Chi Minh, walked a fine line between helping Chiang Kai-shek’s Taiwan and overtly provoking Mao Tse-tung’s China, and materially backed native actors who destabilized “unfriendly” governments in Iran and Guatemala. The centerpiece of Ike’s foreign policy was “massive retaliation,” or the threat of nuclear force in the face of Communist expansion, thus getting more “bang” for his government “buck.” While Ike’s “middle way” won broad popular support, his own party was slowly moving away from his positions. By 1964 the party had moved far enough to the right to nominate Arizona Senator Barry Goldwater, the most conservative candidate in a generation. The political moderation of the Affluent Society proved little more than a way station on the road to liberal reform and a future conservative ascendancy.

 

VII. Conclusion

The postwar American “consensus” held great promise. Despite the looming threat of nuclear war, millions experienced an unprecedented prosperity and an increasingly proud American identity. Prosperity seemed to promise ever higher standards of living. But things fell apart, and the center could not hold: wracked by contradiction, dissent, discrimination, and inequality, the Affluent Society stood on the precipice of revolution.

 

Contributors

This chapter was edited by James McKay, with content contributions by Edwin C. Breeden, Maggie Flamingo, Destin Jenkins, Kyle Livie, Jennifer Mandel, James McKay, Laura Redford, Ronny Regev, and Tanya Roth.

 

Recommended Reading

  1. Boyle, Kevin. The UAW and the Heyday of American Liberalism, 1945-1968. Ithaca: Cornell University Press, 1995.
  2. Branch, Taylor. Parting the Waters: America in the King Years, 1954–1963. New York: Simon & Schuster, 1988.
  3. Cohen, Lizabeth. A Consumer’s Republic: The Politics of Mass Consumption in Postwar America. New York: Vintage, 2003.
  4. Horowitz, Daniel. Betty Friedan and the Making of the Feminine Mystique: The American Left, the Cold War, and Modern Feminism. Amherst: University of Massachusetts Press, 1998.
  5. Jackson, Kenneth T. Crabgrass Frontier: The Suburbanization of the United States. New York: Oxford University Press, 1985.
  6. Jumonville, Neil. Critical Crossings: The New York Intellectuals in Postwar America. Berkeley: University of California Press, 1991.
  7. May, Elaine Tyler. Homeward Bound: American Families in the Cold War Era. New York: Basic Books, 1988.
  8. McGirr, Lisa. Suburban Warriors: The Origins of the New American Right. Princeton: Princeton University Press, 2001.
  9. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  10. Self, Robert. American Babylon: Race and the Struggle for Postwar Oakland. Princeton: Princeton University Press, 2005.
  11. Sugrue, Thomas. The Origins of the Urban Crisis: Race and Inequality in Postwar Detroit. Princeton: Princeton University Press, 2005.
  12. Von Eschen, Penny. Satchmo Blows Up the World: Jazz Ambassadors Play the Cold War. Cambridge: Harvard University Press, 2004.
  13. Wagnleitner, Reinhold. Coca-Colonization and the Cold War: The Cultural Mission of the United States in Austria after the Second World War. Chapel Hill: The University of North Carolina Press, 1994.
  14. Wall, Wendy. Inventing the “American Way”: The Politics of Consensus from the New Deal to the Civil Rights Movement. New York: Oxford University Press, 2008.
  15. Whitfield, Stephen. The Culture of the Cold War. Baltimore: The Johns Hopkins University Press, 1991.

 

25. The Cold War

Test of the tactical nuclear weapon “Small Boy” at the Nevada Test Site, July 14, 1962. National Nuclear Security Administration, #760-5-NTS.

*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

Relations between the United States and the Soviet Union–erstwhile allies in the Second World War–soured quickly in the early months of the postwar peace. On February 22, 1946, frustrated that the Truman Administration still officially sought U.S.-Soviet cooperation, the chargé d’affaires of the U.S. Embassy in Moscow, George Kennan, sent a famously lengthy telegram–literally referred to as the “Long Telegram”–to the State Department denouncing the Soviet Union. “World communism is like a malignant parasite which feeds only on diseased tissue,” he wrote, and “the steady advance of uneasy Russian nationalism … in [the] new guise of international Marxism … is more dangerous and insidious than ever before.” ((Martin McCauley, Origins of the Cold War 1941-49: Revised 3rd Edition (New York: Routledge, 2013), 140-141.)) There could be no cooperation between the United States and the Soviet Union, Kennan wrote. Instead, the Soviets had to be “contained.” Less than two weeks later, on March 5, former British Prime Minister Winston Churchill visited President Harry Truman in his home state of Missouri and declared that Europe had been cut in half, divided by an “iron curtain” that had “descended across the Continent.” ((Ibid., 141.)) Aggressively anti-Soviet sentiment seized the American government and soon the American people. ((For Kennan, see especially John Lewis Gaddis, George F. Kennan: An American Life (New York: Penguin Press, 2011); John Lukacs, editor, George F. Kennan and the Origins of Containment, 1944–1946: The Kennan-Lukacs Correspondence (Columbia: University of Missouri Press, 1997).))

The Cold War was a global political and ideological struggle between capitalist and communist countries, particularly between the two surviving superpowers of the postwar world: the United States and the Union of Soviet Socialist Republics (USSR). “Cold” because it was never a “hot,” direct shooting war between the United States and the Soviet Union, the generations-long, multifaceted rivalry nevertheless bent the world to its whims. Tensions ran highest, perhaps, during the “first Cold War,” which lasted from the mid-1940s through the mid-1960s, after which followed a period of relaxed tensions and increased communication and cooperation, known by the French term détente, until the “second Cold War” interceded from roughly 1979 until the collapse of the Berlin Wall in 1989 and the dissolution of the Soviet Union in 1991. The Cold War reshaped the world, and in so doing forever altered American life and the generations of Americans that lived within its shadow.

 

II. Political, Economic, and Military Dimensions

The Cold War grew out of a failure to achieve a durable settlement among leaders from the “Big Three” Allies—the US, Britain, and the Soviet Union—as they met at Yalta in Russian Crimea and at Potsdam in occupied Germany to shape the postwar order. The Germans had pillaged their way across Eastern Europe and the Soviets had pillaged their way back across it at the cost of millions of lives. Stalin considered the newly conquered territory part of a Soviet “sphere of influence.” With Germany’s defeat imminent, the Allies set terms for unconditional surrender, while deliberating over reparations, tribunals, and the nature of an occupation regime that would initially be divided into American, British, French, and Soviet zones. Even as plans were made to end the fighting in the Pacific, and it was determined that the Soviets would declare war on Japan within ninety days of Germany’s surrender, suspicion and mistrust were already mounting. The political landscape was altered drastically by Franklin Roosevelt’s sudden death in April 1945, just days before the inaugural meeting of the United Nations (UN). Although Roosevelt was skeptical of Stalin, he always held out hope that the Soviets could be brought into the “Free World.” Truman, like Churchill, had no such illusions. He committed the United States to a hard-line, anti-Soviet approach. ((Fraser J. Harbutt, Yalta 1945: Europe and America at the Crossroads (Cambridge: Cambridge University Press, 2010).))

At the Potsdam Conference, held on the outskirts of Berlin from mid-July to early August, the allies debated the fate of Soviet-occupied Poland. Toward the end of the meeting, the American delegation received word that Manhattan Project scientists had successfully tested an atomic bomb. On July 24, when Truman told Stalin about this “new weapon of unusual destructive force,” the Soviet leader simply nodded his acknowledgement and said that he hoped the Americans would make “good use” of it. ((Herbert Feis, Between War and Peace: The Potsdam Conference (Princeton: Princeton University Press, 1960).))

The Cold War had long roots. An alliance of convenience during World War II to bring down Hitler’s Germany was not enough to erase decades of mutual suspicions. The Bolshevik Revolution had overthrown the Russian Tsarists during World War I. Bolshevik leader Vladimir Lenin urged an immediate worldwide peace that would pave the way for world socialism just as Woodrow Wilson brought the United States into the war with promises of global democracy and free trade. The United States had intervened militarily against the Red Army during the Russian civil war, and when the Soviet Union was founded in 1922 the United States refused to recognize it. The two powers were brought together only by their common enemy, and, without that common enemy, there was little hope for cooperation. ((For overviews of the Cold War, see especially John Lewis Gaddis, Strategies of Containment: A Critical Appraisal of Postwar American National Security Policy (New York: Oxford University Press, 1982); John Lewis Gaddis, The United States and the Origins of the Cold War (New York: Columbia University Press, 2000); John Lewis Gaddis, The Cold War: A New History (New York: Penguin Press, 2005); Melvyn P. Leffler, A Preponderance of Power: National Security, the Truman Administration, and the Cold War (Palo Alto: Stanford University Press, 1992).))

On the eve of American involvement in World War II, on August 14, 1941, Roosevelt and Churchill had issued a joint declaration of goals for postwar peace, known as the Atlantic Charter. An adaptation of Wilson’s Fourteen Points, the Atlantic Charter laid the groundwork for the creation of the United Nations. The Soviet Union was among the fifty charter UN member-states and was given one of five seats—alongside the US, Britain, France, and China—on the select Security Council. The Atlantic Charter, though, also set in motion the planning for a reorganized global economy. The July 1944 United Nations Financial and Monetary Conference, more popularly known as the Bretton Woods Conference, created the International Monetary Fund (IMF) and the forerunner of the World Bank, the International Bank for Reconstruction and Development (IBRD). The “Bretton Woods system” was bolstered in 1947 with the addition of the General Agreement on Tariffs and Trade (GATT), forerunner of the World Trade Organization (WTO). The Soviets rejected it all.

Many officials on both sides knew that the Soviet-American relationship would dissolve into renewed hostility upon the closing of the war, and events proved them right. In a 1947 article for Foreign Affairs—written under the pseudonym “Mr. X”—George Kennan warned that Americans should “continue to regard the Soviet Union as a rival, not a partner,” since Stalin harbored “no real faith in the possibility of a permanent happy coexistence of the Socialist and capitalist worlds.” He urged US leaders to pursue “a policy of firm containment, designed to confront the Russians.” ((George Kennan, “The Sources of Soviet Conduct,” Foreign Affairs (July 1947), 566-582.))

Truman, on March 12, 1947, announced $400 million in aid to Greece and Turkey, where “terrorist activities…led by Communists” jeopardized “democratic” governance. With Britain “reducing or liquidating its commitments in several parts of the world, including Greece,” it fell on the US, Truman said, “to support free peoples…resisting attempted subjugation by…outside pressures.” ((Joyce P. Kaufman, A Concise History of U.S. Foreign Policy (Lanham: Rowman & Littlefield, 2010), 86.)) The so-called “Truman Doctrine” became a cornerstone of the American policy of containment. ((Denise M. Bostdorff, Proclaiming the Truman Doctrine: The Cold War Call to Arms (College Station: Texas A&M University Press, 1998).))

In the harsh winter of 1946-47, famine loomed in much of continental Europe. Blizzards and freezing cold halted coal production. Factories closed. Unemployment spiked. Amid these conditions, the Communist parties of France and Italy gained nearly a third of the seats in their respective Parliaments. American officials worried that Europe’s impoverished masses were increasingly vulnerable to Soviet propaganda. The situation remained dire through the spring, when Secretary of State General George Marshall gave an address at Harvard University, on June 5, 1947, suggesting that “the United States should do whatever it is able to do to assist in the return of normal economic health to the world, without which there can be no political stability and no assured peace.” ((Michael Beschloss, Our Documents: 100 Milestone Documents from the National Archives (New York: Oxford University Press, 2006), 199.)) Although Marshall had stipulated to potential critics that his proposal was “not directed against any country, but against hunger, poverty…and chaos,” Stalin clearly understood Marshall’s proposed recovery program as an assault against Communism in Europe: he saw it as a “Trojan Horse” designed to lure Germany and other countries into the capitalist web. ((Charles L. Mee, The Marshall Plan: The Launching of the Pax Americana (New York: Simon & Schuster, 1984).))

The European Recovery Program (ERP), popularly known as the Marshall Plan, pumped enormous sums into Western Europe. From 1948 to 1952 the US invested $13 billion toward reconstruction while simultaneously loosening trade barriers. To avoid the postwar chaos of World War I, the Marshall Plan was designed to rebuild Western Europe, open markets, and win European support for capitalist democracies. The Soviets countered with their rival Molotov Plan, a symbolic pledge of aid to Eastern Europe. Polish leader Józef Cyrankiewicz was rewarded with a five-year, $450 million trade agreement with the Soviet Union for boycotting the Marshall Plan. Stalin jealously guarded Eastern Europe. When Czechoslovakia received $200 million of American assistance, Stalin summoned Czech foreign minister Jan Masaryk to Moscow. Masaryk later recounted that he “went to Moscow as the foreign minister of an independent sovereign state,” but “returned as a lackey of the Soviet Government.” Stalin exercised ever tighter control over Soviet “satellite” countries in Central and Eastern Europe. ((Melvyn P. Leffler and Odd Arne Westad, editors, The Cambridge History of the Cold War: Volume 1, Origins (Cambridge: Cambridge University Press, 2010), 189.))

The situation in Germany meanwhile deteriorated. Berlin had been divided into communist and capitalist zones. In June 1948, when US, British, and French officials introduced a new currency, the Soviet Union initiated a ground blockade, cutting off rail and road access to West Berlin (landlocked within the Soviet occupation zone) in a bid to gain control over the entire city. The United States organized and coordinated a massive airlift that flew essential supplies into the beleaguered city for eleven months, until the Soviets lifted the blockade on May 12, 1949. Germany was officially broken in half. On May 23, the western half of the country was formally renamed the Federal Republic of Germany (FRG) and the eastern Soviet zone became the German Democratic Republic (GDR) later that fall. Berlin, which lay squarely within the GDR, was divided into two sections (and, from August 1961 until November 1989, famously separated by physical walls). ((Daniel F. Harrington, Berlin on the Brink: The Blockade, the Airlift, and the Early Cold War (Lexington: University Press of Kentucky, 2012).))

The Berlin Blockade and resultant Allied airlift was one of the first major crises of the Cold War. Photograph, U.S. Navy Douglas R4D and U.S. Air Force C-47 aircraft unload at Tempelhof Airport during the Berlin Airlift, c. 1948-1949. Wikimedia, http://commons.wikimedia.org/wiki/File:C-47s_at_Tempelhof_Airport_Berlin_1948.jpg.

In the summer of 1949, American officials launched the North Atlantic Treaty Organization (NATO), a mutual defense pact in which the US and Canada were joined by Britain, France, Belgium, Luxembourg, the Netherlands, Italy, Portugal, Norway, Denmark, and Iceland. The Soviet Union would formalize its own collective defense agreement in 1955, the Warsaw Pact, which included Albania, Romania, Bulgaria, Hungary, Czechoslovakia, Poland, and East Germany.

Liberal journalist Walter Lippmann was largely responsible for popularizing the term “the Cold War” in his book, The Cold War: A Study in U.S. Foreign Policy, published in 1947. Lippmann envisioned a prolonged stalemate between the US and the USSR, a war of words and ideas in which direct shots would not necessarily be fired between the two. Lippmann agreed that the Soviet Union would only be “prevented from expanding” if it were “confronted with…American power,” but he felt “that the strategical conception and plan” recommended by Mr. X (George Kennan) was “fundamentally unsound,” as it would require having “the money and the military power always available in sufficient amounts to apply ‘counter-force’ at constantly shifting points all over the world.” Lippmann cautioned against making far-flung, open-ended commitments, favoring instead a more limited engagement that focused on halting the influence of communism in the “heart” of Europe; he believed that if the Soviet system were successfully restrained on the continent, it could otherwise be left alone to collapse under the weight of its own imperfections. ((Walter Lippmann, The Cold War: A Study in U.S. Foreign Policy (New York: Harper, 1947), 10, 15.))

A new chapter in the Cold War began on October 1, 1949, when the Chinese Communist Party (CCP) led by Mao Tse-tung declared victory against the Kuomintang Nationalists led by the Western-backed Chiang Kai-shek. The Kuomintang retreated to the island of Taiwan and the CCP took over the mainland under the red flag of the People’s Republic of China (PRC). Coming so soon after the Soviet Union’s successful test of an atomic bomb, on August 29, the “loss of China,” the world’s most populous country, contributed to a sense of panic among American foreign policymakers, whose attention began to shift from Europe to Asia. After Dean Acheson became Secretary of State in 1949, Kennan was replaced in the State Department by former investment banker Paul Nitze, whose first task was to help compose, as Acheson later described in his memoir, a document designed to “bludgeon the mass mind of ‘top government’” into approving a “substantial increase” in military expenditures. ((James Chace, Acheson: The Secretary of State Who Created the American World (New York: Simon & Schuster, 2008), 441.))

The communist world system rested, in part, on the relationship between the two largest communist nations — the Soviet Union and the People’s Republic of China. This 1950 Chinese Stamp depicts Joseph Stalin shaking hands with Mao Zedong. Wikimedia, http://commons.wikimedia.org/wiki/File:Chinese_stamp_in_1950.jpg.

“United States Objectives and Programs for National Security,” a National Security Council report known as “NSC-68,” achieved its goal. Issued in April 1950, the nearly sixty-page classified memo warned of “increasingly terrifying weapons of mass destruction,” which served to remind “every individual” of “the ever-present possibility of annihilation.” It said that leaders of the USSR and its “international communist movement” sought only “to retain and solidify their absolute power.” As the central “bulwark of opposition to Soviet expansion,” America had become “the principal enemy” that “must be subverted or destroyed by one means or another.” NSC-68 urged a “rapid build-up of political, economic, and military strength” in order to “roll back the Kremlin’s drive for world domination.” Such a massive commitment of resources, amounting to more than a threefold increase in the annual defense budget, was necessary because the USSR, “unlike previous aspirants to hegemony,” was “animated by a new fanatic faith,” seeking “to impose its absolute authority over the rest of the world.” ((Quotes from Curt Cardwell, NSC 68 and the Political Economy of the Early Cold War (Cambridge: Cambridge University Press, 2011), 10-12.)) Both Kennan and Lippmann were among a minority in the foreign policy establishment who argued, to no avail, that such a “militarization of containment” was tragically wrongheaded. ((Gaddis, Strategies of Containment.))

On June 25, 1950, as US officials were considering the merits of NSC 68’s proposals, including “the intensification of … operations by covert means in the fields of economic … political and psychological warfare” designed to foment “unrest and revolt in … [Soviet] satellite countries,” fighting erupted in Korea between communists in the north and American-backed anti-communists in the south. ((Gregory Mitrovich, Undermining the Kremlin: America’s Strategy to Subvert the Soviet Bloc, 1947-1956 (Ithaca: Cornell University Press, 2000), 182.))

After Japan surrendered in September 1945, a US-Soviet joint occupation had paved the way for the division of Korea. In November 1947, the UN passed a resolution calling for the creation of a unified Korean government, but the Soviet Union refused to cooperate. Only the south held elections. The Republic of Korea (ROK), South Korea, was created three months after the election. A month later, communists in the north established the Democratic People’s Republic of Korea (DPRK). Both claimed to stand for a unified Korean peninsula. The UN recognized the ROK, but incessant armed conflict broke out between North and South. ((For the Korean War, see especially Bruce Cumings, The Origins of the Korean War, 2 vols. (Princeton: Princeton University Press, 1981, 1990); William W. Stueck, The Korean War: An International History (Princeton: Princeton University Press, 1995).))

In the spring of 1950, Stalin hesitantly endorsed North Korean leader Kim Il Sung’s plan to ‘liberate’ the South by force, a plan heavily influenced by Mao’s recent victory in China. While he did not desire a military confrontation with the US, Stalin thought correctly that he could encourage his Chinese comrades to support North Korea if the war turned against the DPRK. The North Koreans launched a successful surprise attack and Seoul, the capital of South Korea, fell to the communists on June 28. The UN passed resolutions demanding that North Korea cease hostilities and withdraw its armed forces to the 38th parallel and calling on member states to provide the ROK military assistance to repulse the Northern attack.

That July, UN forces mobilized under American General Douglas MacArthur. Troops landed at Inchon, a port city about thirty miles from Seoul, and retook the capital on September 28. They moved on North Korea. On October 1, ROK/UN forces crossed the 38th parallel, and on October 26 they reached the Yalu River, the traditional Korea-China border. They were met by 300,000 Chinese troops who broke the advance and rolled back the offensive. On November 30, ROK/UN forces began a fevered retreat. They returned across the 38th parallel and abandoned Seoul on January 4, 1951. The United Nations forces regrouped, but the war entered a stalemate. General MacArthur, growing impatient and wanting to eliminate the communist threat, requested authorization to use nuclear weapons against North Korea and China. Denied, MacArthur publicly denounced Truman. Truman, unwilling to risk a third world war and refusing to tolerate MacArthur’s public insubordination, dismissed the general in April. On June 23, 1951, the Soviet ambassador to the UN suggested a cease-fire, which the US immediately accepted. Peace talks continued for two years.

With the stated policy of “containing” communism at home and abroad, the U.S. pressured the United Nations to support the South Koreans and deployed American troops to the Korean Peninsula. Though overshadowed in the annals of American history, the Korean War caused over 30,000 American deaths and 100,000 wounded, leaving an indelible mark on those who served. Wikimedia.

General Dwight Eisenhower defeated Illinois Governor Adlai Stevenson in the 1952 presidential election and Stalin died in March 1953. The DPRK warmed to peace, and an armistice agreement was signed on July 27, 1953. Upwards of 1.5 million people had died during the conflict. ((Elizabeth Stanley, Paths to Peace: Domestic Coalition Shifts, War Termination and the Korean War (Stanford University Press, 2009), 208.))

Coming so soon after World War II and ending without clear victory, Korea became for many Americans a “forgotten war.” Decades later, though, the nation’s other major intervention in Asia would be anything but forgotten. The Vietnam War had deep roots in the Cold War world. Vietnam had been colonized by France and seized by Japan during World War II. The nationalist leader Ho Chi Minh had been backed by the US during his anti-Japanese insurgency and, following Japan’s surrender in 1945, “Viet Minh” nationalists, quoting Thomas Jefferson, declared an independent Democratic Republic of Vietnam (DRV). Yet France moved to reassert authority over its former colony in Indochina, and the United States sacrificed Vietnamese self-determination for France’s colonial imperatives. Ho Chi Minh turned to the Soviet Union for assistance in waging a protracted war against the French colonizers.

After French troops were defeated at the Battle of Dien Bien Phu in May 1954, US officials helped broker a temporary settlement that partitioned Vietnam in two, with a Soviet/Chinese-backed state in the north and an American-backed state in the south. To stifle communist expansion southward, the United States would send arms, offer military advisors, prop up corrupt politicians, block elections, and, eventually, send over 500,000 troops, nearly 60,000 of whom would be lost before the communists finally reunified the country.

 

III. The Arms Buildup, the Space Race, and Technological Advancement

The world was never the same after the United States leveled Hiroshima and Nagasaki in August 1945 with nuclear bombs. Not only had perhaps 180,000 civilians been killed, the nature of warfare was forever changed. The Soviets accelerated their nuclear research, expedited in no small part by spies such as Klaus Fuchs, who had stolen nuclear secrets from America’s secret Manhattan Project. Soviet scientists successfully tested an atomic bomb on August 29, 1949, years before American officials had estimated they would. This unexpectedly quick Russian success not only caught the United States off guard but also caused tensions across the Western world and propelled a nuclear “arms race” between the US and the USSR.

The United States detonated the first thermonuclear weapon, or hydrogen bomb (using fusion explosives of theoretically limitless power), on November 1, 1952. The blast measured over 10 megatons and generated an inferno five miles wide with a mushroom cloud 25 miles high and 100 miles across. The irradiated debris—fallout—from the blast circled the Earth, occasioning international alarm about the effects of nuclear testing on human health and the environment. It only hastened the arms race, with each side developing increasingly advanced warheads and delivery systems. The USSR successfully tested a hydrogen bomb in 1953, and soon thereafter Eisenhower announced a policy of “massive retaliation.” The US would henceforth respond to threats or acts of aggression with perhaps its entire nuclear might. Both sides, then, would theoretically be deterred from starting a war through the logic of “mutually assured destruction” (MAD). Oppenheimer likened the state of “nuclear deterrence” between the US and the USSR to “two scorpions in a bottle, each capable of killing the other,” but only by risking their own lives. ((J. Robert Oppenheimer, “Atomic Weapons and American Policy,” Foreign Affairs (July 1953), 529.))

In response to the Soviet Union’s test of a pseudo-hydrogen bomb in 1953, the United States began Castle Bravo — the first U.S. test of a dry-fuel hydrogen bomb. Detonated on March 1, 1954, it was the most powerful nuclear device ever tested by the U.S. But the effects were more gruesome than expected, causing nuclear fallout and radiation poisoning in nearby Pacific islands. Photograph, March 1, 1954. Wikimedia.

Fears of nuclear war produced a veritable atomic culture. Films such as Godzilla, On the Beach, Fail-Safe, and Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb plumbed the depths of American anxieties with plots featuring radioactive monsters, nuclear accidents, and doomsday scenarios. Anti-nuclear protests in the United States and abroad warned against the perils of nuclear testing and highlighted the likelihood that a thermonuclear war would unleash a global environmental catastrophe. Yet at the same time, peaceful nuclear technologies, such as fission- and fusion-based energy, seemed to herald a utopia of power that would be clean, safe, and “too cheap to meter.” In 1953, Eisenhower proclaimed at the UN that the US would share the knowledge and means for other countries to use atomic power. Henceforth, “the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to his life.” The “Atoms for Peace” speech brought about the establishment of the International Atomic Energy Agency (IAEA), along with worldwide investment in this new economic sector. ((Andrew J. Dunar, America in the Fifties (Syracuse: Syracuse University Press, 2006), 134.))

As Germany fell at the close of World War II, the United States and the Soviet Union each sought to acquire elements of the Nazis’ V-2 superweapon program. A devastating rocket that had terrorized England, the V-2 was capable of delivering its explosive payload up to a distance of nearly 600 miles, and both nations sought to capture the scientists, designs, and manufacturing equipment that made it work. A former top German rocket scientist, Wernher von Braun, became the leader of the American space program; the Soviet Union’s program was secretly managed by former prisoner Sergei Korolev. After the end of the war, American and Soviet rocket engineering teams worked to adapt German technology in order to create an intercontinental ballistic missile (ICBM). The Soviets achieved success first. They even used the same launch vehicle on October 4, 1957, to send Sputnik 1, the world’s first human-made satellite, into orbit. It was a decisive Soviet propaganda victory. ((Deborah Cadbury, Space Race: The Epic Battle Between America and the Soviet Union for Dominance of Space (New York: Harper Collins Publishers, 2006).))

In response, the US government rushed to perfect its own ICBM technology and launch its own satellites and astronauts into space. In 1958, the National Aeronautics and Space Administration (NASA) was created as a successor to the National Advisory Committee for Aeronautics (NACA). Initial American attempts to launch a satellite into orbit using the Vanguard rocket suffered spectacular failures, heightening fears of Soviet domination in space. While the American space program floundered, on September 13, 1959, the Soviet Union’s “Luna 2” capsule became the first human-made object to touch the moon. The “race for survival,” as it was called by the New York Times, reached a new level. ((Tom Wolfe, The Right Stuff (New York: Farrar, Straus and Giroux, 1979), 115.)) The Soviet Union successfully launched a pair of dogs (Belka and Strelka) into orbit and returned them to Earth while the American Mercury program languished behind schedule. Despite countless failures and one massive accident that killed nearly one hundred Soviet military and rocket engineers, Russian “cosmonaut” Yuri Gagarin was launched into orbit on April 12, 1961. Astronaut Alan Shepard accomplished a suborbital flight in the Freedom 7 capsule on May 5. John Kennedy would use America’s losses in the “space race” to bolster funding for a moon landing.

While outer space captivated the world’s imagination, the Cold War still captured its anxieties. The ever-escalating arms race continued to foster panic. In the early 1950s, the Federal Civil Defense Administration (FCDA) began preparing citizens for the worst. Schoolchildren were instructed, via a film featuring Bert the Turtle, to “duck and cover” beneath their desks in the event of a thermonuclear war. ((Kenneth D. Rose, One Nation Underground: The Fallout Shelter in American Culture (New York: New York University Press, 2004), 128.))

Although it took a back seat to space travel and nuclear weapons, the advent of modern computing was yet another major Cold War scientific innovation, the effects of which were only just beginning to be understood. In 1958, following the humiliation of the Sputnik launches, Eisenhower authorized the creation of an Advanced Research Projects Agency (ARPA) housed within the Department of Defense (later renamed DARPA). As a secretive military research and development operation, ARPA was tasked with funding and otherwise overseeing the production of sensitive new technologies. Soon, in cooperation with university-based computer engineers, ARPA would develop the world’s first system of “packet switching,” and computer networks would begin connecting to one another.

 

IV. The Cold War Red Scare, McCarthyism, and Liberal Anti-Communism

Joseph McCarthy, Republican Senator from Wisconsin, fueled fears during the early 1950s that communism was rampant and growing. This intensified Cold War tensions felt by every segment of society, from government officials to ordinary American citizens. Photograph of Senator Joseph R. McCarthy, March 14, 1950. National Archives and Records Administration.

Joseph McCarthy burst onto the national scene during a speech in Wheeling, West Virginia on February 9, 1950. Waving a sheet of paper in the air, he proclaimed: “I have here in my hand a list of 205…names that were made known to the Secretary of State as being members of the Communist party and who nevertheless are still working and shaping [US] policy.” Since the Wisconsin Republican had no actual list, when pressed, the number changed to fifty-seven, then, later, eighty-one. Finally he promised to disclose the name of just one communist, the nation’s “top Soviet agent.” The shifting numbers brought ridicule, but it didn’t matter, not really: McCarthy’s claims won him fame and fueled the ongoing “red scare.” ((David M. Oshinsky, A Conspiracy So Immense: The World of Joe McCarthy (New York: Oxford University Press, 2005), 109.))

Within a ten-month span beginning in 1949, the USSR developed a nuclear bomb, China fell to communism, and over 300,000 American soldiers were deployed to fight a land war in Korea. Newspapers, meanwhile, were filled with headlines alleging Soviet espionage.

During the war, Julius Rosenberg worked briefly, and had access to classified information, at the US Army Signal Corps Laboratory in New Jersey. He and his wife Ethel, who had both been members of the American Communist Party (CPUSA) in the 1930s, were accused of passing secret bomb-related documents to Soviet officials and were indicted in August 1950 on charges of giving “nuclear secrets” to the Russians. After a trial in March 1951, they were found guilty and executed on June 19, 1953. ((Ibid., 102-103, 172, 335.))

The environment of fear and panic instigated by McCarthyism led to the arrest of many innocent people. Still, some Americans accused of supplying top-secret information to the Soviets were, in fact, spies. The Rosenbergs were convicted of espionage and executed in 1953 for giving information about the atomic bomb to the Soviets. The case has stood the test of time: as recently as 2008, a co-conspirator of the Rosenbergs admitted to spying for the Soviet Union. Roger Higgins, “[Julius and Ethel Rosenberg, separated by heavy wire screen as they leave U.S. Court House after being found guilty by jury],” 1951. Library of Congress.

Alger Hiss, the highest-ranking government official linked to Soviet espionage, was another prize for conservatives. Hiss was a prominent official in the U.S. State Department and served as secretary-general of the UN Charter Conference in San Francisco from April to June 1945 before leaving the State Department in 1946. A young congressman and member of the House Un-American Activities Committee (HUAC), Richard Nixon, made waves by accusing Hiss of espionage. On August 3, 1948, Whittaker Chambers testified before HUAC that he and Hiss had worked together as part of the secret “communist underground” in Washington DC during the 1930s. Hiss, who always maintained his innocence, stood trial twice. After a hung jury in July 1949, he was convicted on two counts of perjury (the statute of limitations for espionage having expired). Although later evidence certainly suggested their guilt, the prominent convictions of a few suspected spies fueled an anti-communist frenzy. Some began seeing communists everywhere. ((Ibid., 98-100, 123-125.))

 

Alger Hiss and the Rosenbergs offered anti-communists such as Joseph McCarthy the evidence they needed to allege a vast Soviet conspiracy to infiltrate and subvert the US government and to justify the smearing of all left-liberals, even those who were resolutely anti-communist. Not long after his February 1950 speech in Wheeling, McCarthy’s sensational charges became a source of growing controversy. Forced to respond, President Truman arranged a partisan congressional investigation designed to discredit McCarthy. The Tydings Committee held hearings from early March through July 1950 and issued a final report admonishing McCarthy for perpetrating a “fraud and a hoax” on the American public. American progressives saw McCarthy’s crusade as nothing less than a political witch hunt. In June 1950, The Nation magazine editor Freda Kirchwey characterized “McCarthyism” as “the means by which a handful of men, disguised as hunters of subversion, cynically subvert the instruments of justice…in order to help their own political fortunes.” ((Sara Alpern, Freda Kirchwey: A Woman of the Nation (Cambridge: Harvard University Press, 1987), 203.)) Truman’s liberal supporters, and leftists like Kirchwey, hoped that McCarthy and the new “ism” that bore his name would blow over quickly. But “McCarthyism” was a symptom of a massive and widespread anti-communist hysteria that engulfed Cold War America.

With growing anti-communist excitement and a tough election on the horizon, Truman gave in to pressure in March 1947 and issued Executive Order 9835, establishing loyalty reviews for federal employees. In the case of Foreign Service officers, the Federal Bureau of Investigation (FBI) was empowered to conduct closer examinations of all potential “security risks.” Congressional committees, namely the House Un-American Activities Committee (HUAC) and the Senate Permanent Subcommittee on Investigations (SPSI), were authorized to gather facts and hold hearings. After Truman’s “loyalty order,” anti-subversion committees emerged in over a dozen state legislatures, and review procedures proliferated in public schools and universities across the country. At the University of California, for example, thirty-one professors were dismissed in 1950 for refusing to sign a loyalty oath. The Senate Internal Security (McCarran) Act, passed in September 1950, mandated that all “communist organizations” register with the government, gave the government greater powers to investigate sedition, and made it possible to prevent suspected individuals from gaining or keeping their citizenship. Between 1949 and 1954, HUAC, SPSI, and a new McCarran Committee conducted over one hundred investigations into subversive activities. ((Oshinsky, 171-174.))

There had, of course, been a communist presence in the United States. The Communist Party of the USA (CPUSA) formed in the aftermath of the 1917 Russian Revolution, when the Bolsheviks created a Communist International (the Comintern) and invited socialists from around the world to join. During its first two years of existence, the CPUSA functioned in secret, hidden from a surge of anti-radical and anti-immigrant hysteria, investigations, deportations, and raids at the end of World War I. The CPUSA began its public life in 1921, after the panic subsided, but communism remained on the margins of American life until the 1930s, when leftists and liberals began to see the Soviet Union as a symbol of hope amid the Great Depression. Then, many communists joined the “Popular Front,” an effort to make communism mainstream by adapting it to American history and American culture. During the Popular Front era, communists were integrated into mainstream political institutions through alliances with progressives in the Democratic Party. The CPUSA enjoyed most of its influence and popularity among workers in unions linked to the newly formed Congress of Industrial Organizations (CIO). Communists also became strong opponents of Jim Crow segregation and developed a presence in both the NAACP and the American Civil Liberties Union (ACLU). The CPUSA, moreover, established “front” groups, such as the League of American Writers, in which intellectuals participated without even knowing of their ties to the Comintern. But even at the height of the global economic crisis, communism never attracted many Americans. Even at the peak of its membership, the CPUSA had just 80,000 national “card-carrying” members. From the mid-1930s through the mid-1940s, “the Party” exercised most of its power indirectly, through coalitions with liberals and reformers. When news broke of Hitler and Stalin’s 1939 non-aggression pact, many fled the Party, feeling betrayed. A bloc of left-liberal anti-communists, meanwhile, purged remaining communists from their ranks, and the Popular Front collapsed. ((Ellen Schrecker, Many are the Crimes: McCarthyism in America (Princeton: Princeton University, 1999).))

Lacking the legal grounds to abolish the CPUSA, officials instead sought to expose and contain CPUSA influence. Following a series of predecessor committees, the House Un-American Activities Committee (HUAC) was established in 1938, then reorganized after the war and given the explicit task of investigating communism. By the time the Communist Control Act was passed in August 1954, effectively criminalizing Party membership, the CPUSA had long ceased to have meaningful influence. Anti-communists were driven to eliminate remaining CPUSA influence from progressive institutions, including the NAACP and the CIO. The Taft-Hartley Act (1947) gave union officials the means to purge communists from the labor movement. A kind of “Cold War” liberalism took hold. In January 1947, anti-communist liberals formed Americans for Democratic Action (ADA), whose founding members included labor leader Walter Reuther and NAACP chairman Walter White, as well as historian Arthur Schlesinger Jr., theologian Reinhold Niebuhr, and former First Lady Eleanor Roosevelt. Working to help Truman defeat former vice president Henry Wallace’s Popular Front-backed campaign in 1948, the ADA combined social and economic reforms with staunch anti-communism. ((For anti-communist liberals and the decline of American communism, see especially Schrecker, Many are the Crimes.))

The domestic Cold War was bipartisan, fueled by a consensus drawn from a left-liberal and conservative anti-communist alliance that included politicians and policymakers, journalists and scientists, business and civic/religious leaders, and educators and entertainers. Led by its imperious director, J. Edgar Hoover, the FBI took an active role in the domestic battle against communism. Hoover’s FBI helped incite panic by assisting the creation of blatantly propagandistic films and television shows, including The Red Menace (1949), My Son John (1951), and I Led Three Lives (1953-1956). Such alarmist depictions of espionage and treason in a “free world” imperiled by communism heightened the culture of fear of the 1950s. In the fall of 1947, HUAC entered the fray with highly publicized hearings into Hollywood. Film mogul Walt Disney and actor Ronald Reagan, among others, testified to aid investigators’ attempts to expose communist influence in the entertainment industry. A group of writers, directors, and producers who refused to answer questions were held in contempt of Congress. This “Hollywood Ten” set the precedent for a “blacklist” in which hundreds of film artists were barred from industry work for the next decade.

HUAC made repeated visits to Hollywood during the 1950s, and their interrogation of celebrities often began with the same intimidating refrain: “Are you now, or have you ever been, a member of the Communist Party?” Many witnesses cooperated, and “named names,” naming anyone they knew who had ever been associated with communist-related groups or organizations. In 1956, black entertainer and activist Paul Robeson chided his HUAC inquisitors, claiming that they had put him on trial not for his politics, but because he had spent his life “fighting for the rights” of his people. “You are the un-Americans,” he told them, “and you ought to be ashamed of yourselves.” ((Paul Robeson, Paul Robeson Speaks: Writings, Speeches, and Interviews, a Centennial Celebration, edited by Philip Foner (New York: Citadel Press, 1978), 421, 433.)) As Robeson and other victims of McCarthyism learned first-hand, this “second red scare,” in the glow of nuclear annihilation and global “totalitarianism,” fueled an intolerant and skeptical political world, what Cold War liberal Arthur Schlesinger, in his The Vital Center (1949), called an “age of anxiety.” ((Arthur Schlesinger, Jr., The Vital Center: The Politics of Freedom (Boston: Houghton Mifflin Company, 1949), 1.))

Many of those accused of Communist sentiments vehemently denied such allegations, including one of the most well-known Americans of the time, African American actor and singer Paul Robeson. When he refused to sign an affidavit affirming that he was not a Communist, his U.S. passport was revoked. During the Cold War, he was condemned by the American press, and neither his music nor his films could be purchased in the U.S.

Anti-communist ideology valorized overt patriotism, religious conviction, and faith in capitalism. Those who shunned such “American values” were open to attack. If communism was a plague spreading across Europe and Asia, anti-communist hyperbole infected cities, towns, and suburbs throughout the country. The playwright Arthur Miller, whose popular 1953 play The Crucible compared the red scare to the Salem witch trials, wrote, “In America any man who is not reactionary in his views is open to the charge of alliance with the Red hell. Political opposition, thereby, is given an inhumane overlay which then justifies the abrogation of all normally applied customs of civilized intercourse. A political policy is equated with moral right, and opposition to it with diabolical malevolence. Once such an equation is effectively made, society becomes a congerie of plots and counterplots, and the main role of government changes from that of the arbiter to that of the scourge of God.” ((Arthur Miller, The Crucible (New York: Penguin Classics, 2003), 30.))

Rallying against communism, American society urged conformity. “Deviant” behavior became dangerous. Having entered the workforce en masse as part of a collective effort in World War II, middle-class women were told to return to homemaking responsibilities. Having fought and died abroad for American democracy, blacks were told to return home and acquiesce to the American racial order. Homosexuality, already stigmatized, became dangerous. Personal secrets were seen as a liability that exposed one to blackmail. The same paranoid mindset that fueled the second red scare also ignited the Cold War “lavender scare.” ((Robert D. Dean, Imperial Brotherhood: Gender and the Making of Cold War Foreign Policy (Amherst: University of Massachusetts Press, 2003).))

American religion, meanwhile, was fixated on what McCarthy, in his 1950 Wheeling speech, called an “all-out battle between communistic atheism and Christianity.” Cold warriors in the US routinely referred to a fundamental incompatibility between “godless communism” and god-fearing Americanism. Religious conservatives championed the idea of the traditional, God-fearing nuclear family as a bulwark against the spread of atheistic totalitarianism. As Baptist minister Billy Graham sermonized in 1950, communism aimed to “destroy the American home and cause … moral deterioration,” leaving the country exposed to communist infiltration. ((William G. McLoughlin, Revivals, Awakenings, and Reform (Chicago: University of Chicago Press, 2013), 189.))

In an atmosphere in which ideas of national belonging and citizenship were so closely linked to religious commitment, Americans during the early Cold War years attended church, professed a belief in a supreme being, and stressed the importance of religion in their lives at higher rates than at any time in American history. Americans sought to differentiate themselves from godless communists through public displays of religiosity. Politicians infused government with religious symbols. The Pledge of Allegiance was altered to include the words “one nation, under God” in 1954. “In God We Trust” was adopted as the official national motto in 1956. In popular culture, one of the most popular films of the decade, The Ten Commandments (1956), retold the biblical Exodus story as a Cold War parable, echoing (incidentally) NSC 68’s characterization of the Soviet Union as a “slave state.” Monuments of the Ten Commandments went up at courthouses and city halls across the country.

While the link between American nationalism and religion grew much closer during the Cold War, many Americans came to believe that faith in almost any religion was better than atheism. Gone was the overt anti-Catholic and anti-Semitic language of Protestants in the past. Now, leaders spoke of a common “Judeo-Christian” heritage. In December 1952, a month before his inauguration, Dwight Eisenhower said that “our form of government makes no sense unless it is founded in a deeply-felt religious faith, and I don’t care what it is.” ((Quoted in Gastón Espinosa, Religion and the American Presidency: George Washington to George W. Bush with Commentary and Primary Sources (New York: Columbia University Press, 2009), 298.))

Joseph McCarthy, an Irish Catholic, made common cause with prominent religious anti-communists, including southern evangelist Billy James Hargis of Christian Crusade, a popular radio and television ministry that peaked in the 1950s and 1960s. Cold War religion in America also crossed the political divide. During the 1952 campaign, Eisenhower spoke of US foreign policy as “a war of light against darkness, freedom against slavery, Godliness against atheism.” ((Peter Gries, The Politics of American Foreign Policy: How Ideology Divides Liberals and Conservatives Over Foreign Affairs (Stanford University Press, 2014), 215.)) His Democratic opponent, former Illinois Governor Adlai Stevenson, said that America was engaged in a battle with the “Anti-Christ.” While Billy Graham became a spiritual adviser to Eisenhower as well as to other Republican and Democratic presidents, so too did the liberal Protestant Reinhold Niebuhr, perhaps the nation’s most important theologian, who appeared on the cover of Life in March 1948.

Though publicly rebuked by the Tydings Committee, McCarthy soldiered on. In June 1951, on the floor of Congress, McCarthy charged that then-Secretary of Defense (and former secretary of state) Gen. George Marshall had fallen prey to “a conspiracy on a scale so immense as to dwarf any previous such venture in the history of man.” He claimed that Marshall, a war hero, had helped to “diminish the United States in world affairs” and enabled the US to “finally fall victim to Soviet intrigue… and Russian military might.” The speech caused an uproar. During the 1952 campaign, Eisenhower, who was in all things moderate and politically cautious, refused to publicly denounce McCarthy. “I will not…get into the gutter with that guy,” he wrote privately. McCarthy campaigned for Eisenhower, who won a stunning victory. ((Oshinsky, 272.))

The Republicans won, too, regaining control of Congress. McCarthy became chairman of the Senate Permanent Subcommittee on Investigations (SPSI). He targeted many, and turned his newfound power against the government’s overseas broadcast division, the Voice of America (VOA). McCarthy’s investigation in February-March 1953 resulted in several resignations or transfers. McCarthy’s mudslinging had become increasingly unrestrained. Soon he went after the U.S. Army. After forcing the Army to again disprove theories of a Soviet spy ring at Ft. Monmouth in New Jersey, McCarthy publicly berated officers suspected of promoting leftists. McCarthy’s badgering of witnesses created cover for critics to publicly denounce his abrasive fear-mongering.

On March 9, 1954, CBS journalist Edward R. Murrow, a cold war liberal, told his television audience that McCarthy’s actions had “caused alarm and dismay amongst … allies abroad, and given considerable comfort to our enemies.” Yet, Murrow explained, “He didn’t create this situation of fear; he merely exploited it—and rather successfully. Cassius was right. ‘The fault, dear Brutus, is not in our stars, but in ourselves.’” ((Ibid., 399.))

Twenty million people saw the “Army-McCarthy Hearings” unfold over thirty-six days in 1954. The Army’s head counsel, Joseph Welch, captured much of the mood of the country when he defended a fellow lawyer from McCarthy’s public smears, saying, “Let us not assassinate this lad further, Senator. You’ve done enough. Have you no sense of decency, sir? At long last, have you left no sense of decency?” In September, a Senate subcommittee recommended that McCarthy be censured. On December 2, 1954, his colleagues voted 67-22 to “condemn” his actions. Humiliated, McCarthy faded into irrelevance and alcoholism and died in May 1957, at age 48. ((Ibid., 475.))

By the late 1950s, the worst of the second red scare was over. Stalin’s death, followed by the Korean War armistice, opened new space—and hope—for the easing of Cold War tensions. Détente and the upheavals of the late 1960s were on the horizon. But McCarthyism outlasted McCarthy and the 1950s. McCarthy made an almost unparalleled impact on Cold War American society. The tactics he perfected continued to be practiced long after his death. “Red-baiting,” the act of smearing a political opponent by linking them to communism or some other demonized ideology, persevered. And McCarthy was hardly alone.

Congressman Richard Nixon, for instance, used his place on HUAC and his public role in the campaign against Alger Hiss to catapult himself into the White House alongside Eisenhower and later into the presidency. Ronald Reagan bolstered the fame he had won in Hollywood with his testimony before Congress and his anti-communist work for major American corporations such as General Electric. He too would use anti-communism to enter public life and chart a course to the presidency. In 1958, radical anti-communists founded the John Birch Society, attacking liberals and civil rights activists such as Martin Luther King Jr. as communists. Although Cold War liberals joined in the anti-communist cause, the weight of anti-communism was wielded as part of an assault against the New Deal and its defenders. Even liberals who had fought against communism, such as historian Arthur Schlesinger, found themselves smeared by the red scare. Politics and culture both had been reshaped. The leftist American tradition was in tatters, destroyed by anti-communist hysteria. Movements for social justice, from civil rights to gay rights to feminism, were all suppressed under Cold War conformity.

 

V. Decolonization and the Global Reach of the ‘American Century’

In an influential 1941 Life magazine editorial titled “The American Century,” publishing magnate Henry Luce, envisioning the United States as a “dominant world power,” outlined his “vision of America as the principal guarantor of freedom of the seas” and “the dynamic leader of world trade.” In his embrace of an American-led international system, the conservative Luce was joined by liberals including historian Arthur Schlesinger, who, in his 1949 Cold War tome The Vital Center, proclaimed that a “world destiny” had been “thrust” upon the United States, with perhaps no other nation becoming “a more reluctant great power.” Emerging from the war as the world’s preeminent military and economic force, the US was perhaps destined to compete with the Soviet Union for influence in the Third World, where a power vacuum had been created by the demise of European imperialism. As France and Britain in particular struggled in vain to control colonies in Asia, the Middle East, and North Africa, the United States assumed responsibility for maintaining order and producing a kind of “Pax Americana.” Little of the postwar world, however, would be so peaceful. ((Henry R. Luce, “The American Century,” LIFE (February 17, 1941), 61-65.))

Based on the logic of militarized containment established by NSC-68 and American Cold War strategy, interventions in Korea and Vietnam were seen as appropriate American responses to the ascent of communism in China. Unless Soviet power in Asia was halted, Chinese influence would ripple across the continent, and one country after another would “fall” to communism. Easily transposed onto any region of the world, the “Domino Theory” became a standard basis for the justification of US interventions abroad, such as in Cuba after 1959, which was seen as a communist beachhead that imperiled Latin America, the Caribbean, and perhaps eventually the United States. Like Ho Chi Minh, Cuban leader Fidel Castro was a revolutionary nationalist whose career as a communist began in earnest only after he was rebuffed by the United States. Indeed, American interventions often targeted nations that never espoused official communist positions. Many interventions in Asia, Latin America, and elsewhere were driven by factors that were shaped by but also transcended anti-communist ideology.

The Cuban Revolution seemed to confirm the fears of many Americans that the spread of communism could not be stopped. The American government is believed to have intervened covertly against Fidel Castro’s new government, and many attribute the La Coubre explosion to the American Central Intelligence Agency. In this photograph, Castro and Cuban revolutionary Che Guevara march in a memorial for those killed in the explosion in March 1960 in Havana, Cuba. Wikimedia, http://commons.wikimedia.org/wiki/File:CheLaCoubreMarch.jpg.


Instead of dismantling its military after World War II, as the United States had after every major conflict, the Cold War facilitated a new permanent defense establishment. Federal investments in national defense affected the entire country. Different regions housed various sectors of what sociologist C. Wright Mills, in 1956, called the “permanent war economy.” The aerospace industry was concentrated in areas like Southern California and Long Island, New York; Massachusetts was home to several universities that received major defense contracts; the Midwest became home base for intercontinental ballistic missiles pointed at the Soviet Union; many of the largest defense companies and military installations were concentrated in the South, so much so that in 1956 author William Faulkner, who was born in Mississippi, remarked, “Our economy is the Federal Government.” ((Bruce J. Schulman, From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South, 1938-1980 (Durham: Duke University Press, 1994), 135.))

A radical critic of US policy, Mills was one of the first thinkers to question the effects of massive defense spending, which, he said, corrupted the ruling class, or “power elite,” who now had the potential to take the country into war for the sake of corporate profits. Yet perhaps the most famous critique of the entrenched war economy came from an unlikely source. During his farewell address to the nation in January 1961, President Eisenhower cautioned Americans against the “unwarranted influence” of a “permanent armaments industry of vast proportions” which could threaten “liberties” and “democratic processes.” While the “conjunction of an immense military establishment and a large arms industry” was a fairly recent development, this “military-industrial complex” had cultivated a “total influence,” which was “economic, political, even spiritual … felt in every city … Statehouse … [and] office of the Federal government.” There was, he said, great danger in failing to “comprehend its grave implications.” ((Dwight D. Eisenhower, Public Papers of the Presidents, Dwight D. Eisenhower, 1960, 1035- 1040.))

In Eisenhower’s formulation, the “military-industrial complex” referred specifically to domestic connections between arms manufacturers, members of Congress, and the Department of Defense. Yet the new alliance between corporations, politicians, and the military was dependent on having an actual conflict to wage, without which there could be no ultimate financial gain. To critics, military-industrial partnerships at home were now linked to US interests abroad. Suddenly American foreign policy had to secure foreign markets and favorable terms for American trade all across the globe. Seen in such a way, the Cold War was just a byproduct of America’s new role as the remaining Western superpower. Regardless, the postwar rise of US power correlated with what many historians describe as a “national security consensus” that has dominated American policy since World War II. And so the United States was now more intimately involved in world affairs than ever before.

Ideological conflicts and independence movements erupted across the postwar world. More than eighty countries achieved independence, primarily from European control. As it took center stage in the realm of global affairs, the United States played a complicated and often contradictory role in this process of “decolonization.” The sweeping scope of post-1945 US military expansion was unique in the country’s history. Critics believed that the advent of a “standing army,” so feared by many Founders, set a disturbing precedent. But in the postwar world, American leaders eagerly set about maintaining a new permanent military juggernaut and creating viable international institutions.

But what of independence movements around the world? Roosevelt had spoken for many in his remark to British Premier Winston Churchill, in 1941, that it was hard to imagine “fight[ing] a war against fascist slavery, and at the same time not work to free people all over the world from a backward colonial policy.” ((Fredrik Logevall, Embers of War: The Fall of an Empire and the Making of America’s Vietnam (New York: Random House, 2012), 48.)) American postwar foreign policy planners therefore struggled to balance support for decolonization against the reality that national independence movements often posed a threat to America’s global interests.

As American strategy became consumed with thwarting Russian power and the concomitant global spread of communism, it increasingly made little difference to foreign policy officials whether insurgencies or independence movements had direct involvement with the Soviet Union, so long as a revolutionary movement or government could in some way be linked to international communism. The Soviet Union, too, was attempting to sway the world. Stalin and his successors pushed an agenda that included not only the creation of Soviet client states in Eastern and Central Europe, but also a tendency to support leftwing liberation movements everywhere, particularly when they espoused anti-American sentiment. As a result, the US and the USSR engaged in numerous proxy wars in the “Third World.”

American planners felt that successful decolonization could demonstrate the superiority of democracy and capitalism against competing Soviet models. Their goal was in essence to develop an informal system of world power based as much as possible on consent (hegemony) rather than coercion (empire). European powers, however, still clung to their colonies. American officials feared that anti-colonial resistance would breed revolution and push nationalists into the Soviet sphere. And when faced with such movements, American policy dictated alliances with colonial regimes, alienating nationalist leaders in Asia and Africa.

The architects of American power needed to sway the citizens of decolonizing nations toward the United States. In 1948, Congress passed the Smith-Mundt Act to “promote a better understanding of the United States in other countries.” The legislation established cultural exchanges with various nations, including even the USSR, in order to showcase American values through its artists and entertainers. The Soviets did the same, through what they called an international peace offensive, which by most accounts was more successful than the American campaign. Although making strides through the initiation of various overt and covert programs, US officials still perceived that they were lagging behind the Soviet Union in the “war for hearts and minds.” But as unrest festered in much of the Third World, American officials faced difficult choices. ((Frank Ninkovich, The Diplomacy of Ideas: U.S. Foreign Policy and Cultural Relations, 1938-1950 (Cambridge University Press, 1981).))

As American blacks fought for justice at home, prominent American black radicals, including Malcolm X, Paul Robeson, and the aging W.E.B. DuBois, joined in solidarity with the global anti-colonial movement, arguing that the United States had inherited the racist European imperial tradition. Supporters of the Soviet Union made their own effort to win over countries in the non-aligned world, claiming that Marxist-Leninist doctrine offered a roadmap for their liberation from colonial bondage. Moreover, Kremlin propaganda pointed to injustices of the American South as an example of American hypocrisy: how could the United States claim to fight for global freedom when it refused to guarantee freedoms for its own citizenry? In such ways the black freedom struggle, the Third World, and the global Cold War became intertwined.

The Soviet Union took advantage of the very real racial tensions in the U.S. to create anti-American propaganda. This 1930 Soviet poster shows a black American being lynched from the Statue of Liberty, while the text below asserts the links between racism and Christianity. 1930 issue of Bezbozhnik. Wikimedia, http://commons.wikimedia.org/wiki/File:Bezbozhnik_u_stanka_US_1930.jpg.


 

VI. Conclusion

In June 1987, American President Ronald Reagan stood at the Berlin Wall and demanded that Soviet premier Mikhail Gorbachev “Tear down this wall!” Less than three years later, amid civil unrest in November 1989, East German authorities announced that their citizens were free to travel to and from West Berlin. The concrete curtain would be lifted and East Berlin would be opened to the world. Within months, the Berlin Wall was reduced to rubble by jubilant crowds anticipating the reunification of their city and their nation, which took place on October 3, 1990. By July 1991 the Warsaw Pact had crumbled, and on December 25 of that year, the Soviet Union was officially dissolved. Hungary, Poland, Czechoslovakia, and the Baltic States (Latvia, Estonia, and Lithuania) were freed from Russian domination.

Partisans still fight to claim responsibility for the breakup of the Soviet Union and the ending of the Cold War. Whether the Cold War’s end owed more to the triumphalist rhetoric and militaristic pressure of American conservatives or to the internal fracturing of ossified bureaucracies and the work of Russian reformers remains a question for historians. But questions about how the Cold War ended must pause before appreciations of its impact at home and abroad. Whether measured by the tens of millions killed in Cold War-related conflicts, by the reshaping of American politics and culture, or by the transformation of America’s role in the world, the Cold War pushed American history down a new path, one it has yet to abandon.

 

Contributors

This chapter was edited by Ari Cushner, with content contributions by Michael Brenes, Ari Cushner, Michael Franczak, Joseph Haker, Jonathan Hunt, Jun Suk Hyun, Zack Jacobson, Micki Kaufman, Lucie Kyrova, Celeste Day Moore, Joseph Parrott, Colin Reynolds, and Tanya Roth.

 

Recommended Reading

  1. Borstelmann, Thomas. The Cold War and the Color Line: American Race Relations in the Global Arena. Cambridge: Harvard University Press, 2001.
  2. Boyer, Paul. By the Bomb’s Early Light: American Thought and Culture at the Dawn of the Atomic Age. New York: Pantheon Books, 1985.
  3. Carleton, Don E. Red Scare! Right-wing Hysteria, Fifties Fanaticism, and Their Legacy in Texas. Austin: Texas Monthly Press, 1985.
  4. Gaddis, John L. The Cold War: A New History. New York: Penguin Press, 2005.
  5. Gaddis, John L. Strategies of Containment. New York: Oxford University Press, 2005.
  6. Gaddis, John L. The United States and the Origins of the Cold War. New York: Columbia University Press, 2000.
  7. Kolko, Gabriel. Confronting the Third World: United States Foreign Policy 1945-1980. New York: Pantheon, 1988.
  8. LaFeber, Walter. America, Russia, and the Cold War, 1945-1966. New York: John Wiley, 1967.
  9. May, Elaine Tyler. Homeward Bound: American Families in the Cold War Era. New York: Basic Books, 1988.
  10. Oshinsky, David M. A Conspiracy So Immense: The World of Joe McCarthy. New York: Oxford University Press, 2005.
  11. Patterson, James T. Grand Expectations: The United States, 1945–1974. New York: Oxford University Press, 1996.
  12. Powers, Richard Gid. Not Without Honor: The History of American Anticommunism. New York: The Free Press, 1995.
  13. Schrecker, Ellen. Many Are the Crimes: McCarthyism in America. New York: Little, Brown and Company, 1998.
  14. Schulman, Bruce J. From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South, 1938–1980. New York: Oxford University Press, 1991.
  15. Whitfield, Stephen. The Culture of the Cold War. Baltimore: Johns Hopkins University Press, 1991.

 

Notes

24. World War II

Walter Rosenblum, "D Day Rescue, Omaha Beach," via Library of Congress.


*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

The 1930s and 1940s were trying times. A global economic crisis gave way to a global war that became the deadliest and most destructive in human history. Perhaps 80 million people lost their lives during World War II. The war unleashed the most fearsome military technology ever used in combat. It saw industrialized genocide and threatened the eradication of entire peoples. And when it ended, the United States found itself alone as the world’s greatest superpower, armed with the world’s strongest economy and looking forward to a prosperous consumer future. But of course the war raised as many questions as it settled, unleashing new social forces at home and abroad that would confront new generations of Americans to come.

 

II. The Origins of the Pacific War

While the United States joined the war in 1941, two years after Europe exploded into conflict in 1939, the path to the Japanese bombing of Pearl Harbor, the surprise attack that threw the United States headlong into war, began much earlier. For the Empire of Japan, the war had begun a decade before Pearl Harbor.

On September 18, 1931, a small explosion tore up railroad tracks controlled by the Japanese-owned South Manchuria Railway near the city of Shenyang (Mukden) in the Chinese province of Manchuria. The railway company condemned the bombing as the work of anti-Japanese Chinese dissidents. Evidence, though, suggests that the initial explosion was neither an act of Chinese anti-Japanese sentiment nor an accident, but an elaborate ruse planned by the Japanese to provide a basis for invasion. In response, the Japanese Guandong (Kwantung) Army began shelling the Shenyang garrison the next day, and the garrison fell before nightfall. Hungry for Chinese territory and witnessing the weakness and disorganization of Chinese forces, but under the pretense of protecting Japanese citizens and investments, the Japanese Imperial Army ordered a full-scale invasion of Manchuria. The invasion was swift. Without a centralized Chinese army, the Japanese quickly defeated isolated Chinese warlords and by the end of February 1932, all of Manchuria was firmly under Japanese control. Japan established the nation of Manchukuo out of the former province of Manchuria. ((For the second Sino-Japanese War, see, for instance, Michael A. Barnhart, Japan Prepares for Total War: The Search for Economic Security, 1919–1941 (Ithaca: Cornell University Press, 1987); Dick Wilson, When Tigers Fight: The Story of the Sino-Japanese War, 1937–1945 (New York: Viking Press, 1982); Mark Peattie, Edward Drea, and Hans van de Ven, editors, The Battle for China: Essays on the Military History of the Sino-Japanese War of 1937–1945 (Palo Alto: Stanford University Press, 2011).))

This seemingly small skirmish—known by the Chinese as the September 18 Incident and by the Japanese as the Manchurian Incident—sparked a war that would last thirteen years and claim the lives of over 35 million people. Comprehending Japanese motivations for attacking China, and the grueling stalemate of the ensuing war, is crucial for understanding Japan’s seemingly unprovoked attack on Pearl Harbor, Hawaii on December 7, 1941, and, therefore, for understanding the involvement of the United States in World War II as well.

Despite their rapid advance into Manchuria, the Japanese put off the invasion of China for nearly three years. Japan occupied a precarious domestic and international position after the September 18 Incident. At home, Japan was riven by political factionalism due to its stagnating economy. Leaders were torn as to whether to address modernization and lack of natural resources through unilateral expansion—the conquest of resource-rich areas such as Manchuria to export raw materials to domestic Japanese industrial bases such as Hiroshima and Nagasaki—or international cooperation—particularly a philosophy of pan-Asianism in which an anti-Western coalition would push the colonial powers out of Asia. Ultimately, after a series of political crises and assassinations enflamed tensions, pro-war elements within the Japanese military triumphed over the more moderate civilian government. Japan committed itself to aggressive military expansion.

Chinese leaders Chiang Kai-shek and Zhang Xueliang appealed to the League of Nations for assistance against Japan. The United States supported the Chinese protest, proclaiming the Stimson Doctrine in January 1932, which refused to recognize any state established as a result of Japanese aggression. Meanwhile, the League of Nations sent Englishman Victor Bulwer-Lytton to investigate the September 18 Incident. After a six-month investigation, Bulwer-Lytton found the Japanese guilty of inciting the September 18 incident and demanded the return of Manchuria to China. The Japanese withdrew from the League of Nations in March 1933.

Japan isolated itself from the world. Its diplomatic isolation empowered radical military leaders who could point to Japanese military success in Manchuria and compare it to the diplomatic failures of the civilian government. The military took over Japanese policy. And in the military’s eyes, the conquest of China would not only provide for Japan’s industrial needs, it would secure Japanese supremacy in East Asia.

The Japanese launched a full-scale invasion of China. It assaulted the Marco Polo Bridge on July 7, 1937 and routed the forces of the Chinese National Revolutionary Army led by Chiang Kai-shek. The broken Chinese army gave up Beiping (Beijing) to the Japanese on August 8, Shanghai on November 26, and the capital, Nanjing (Nanking), on December 13. Between 250,000 and 300,000 people were killed, and tens of thousands of women were raped, when the Japanese besieged and then sacked Nanjing. The Western press labeled it the Rape of Nanjing. To halt the invading enemy, Chiang Kai-shek adopted a scorched-earth strategy of “trading space for time.” His Nationalist government retreated inland, burning villages and destroying dams, and established a new capital at the Yangtze River port of Chongqing (Chungking). Although the Nationalists’ scorched-earth policy hurt the Japanese military effort, it alienated scores of dislocated Chinese civilians and became a potent propaganda tool of the emerging Chinese Communist Party (CCP). ((See Joshua A. Fogel, The Nanjing Massacre in history and historiography (Berkeley: University of California Press, 2000).))

Americans read about the brutal fighting in China, but the United States lacked both the will and the military power to oppose the Japanese invasion. After the gut-wrenching carnage of World War I, many Americans retreated toward “isolationism” by opposing any involvement in the conflagrations burning in Europe and Asia. And even if Americans wished to intervene, their military was lacking. The Japanese army was a technologically advanced force consisting of 4,100,000 men and 900,000 Chinese collaborators—and that was in China alone. The Japanese military was armed with modern rifles, artillery, armor, and aircraft. By 1940, the Japanese navy was the third-largest and among the most technologically advanced in the world.

Still, Chinese Nationalists lobbied Washington for aid. Chiang Kai-shek’s wife, Soong May-ling—known to the American public as Madame Chiang—led the effort. Born into a wealthy Chinese merchant family in 1898, Madame Chiang spent much of her childhood in the United States and had graduated from Wellesley College in 1917 with a major in English literature. In contrast to her gruff husband, Madame Chiang was charming and able to use her knowledge of American culture and values to garner support for her husband and his government. But while the United States denounced Japanese aggression, it took no action.

As Chinese Nationalists fought for survival, the Communist Party was busy collecting people and supplies in northwestern Shaanxi Province. China had been at war with itself when the Japanese came. Nationalists battled a stubborn communist insurgency. In 1935 the Nationalists threw the communists out of the fertile Chinese coast, but an ambitious young commander named Mao Zedong recognized the power of the Chinese peasant population. In Shaanxi, Mao recruited from the local peasantry, building his force from a meager 7,000 survivors at the end of the Long March in 1935 to a robust 1.2 million members by the end of the war.

Although Japan had conquered much of the country, the Nationalists regrouped and the Communists rearmed. An uneasy truce paused the country’s civil war and refocused efforts on the invaders. The Chinese could not dislodge the Japanese, but they could stall their advance. The war settled into a grinding stalemate.

 

III. The Origins of the European War

Across the globe in Europe, the continent’s major powers were still struggling with the after-effects of the First World War when the global economic crisis spiraled much of the continent into chaos. Germany’s Weimar Republic collapsed with the economy, and out of the ashes emerged Adolf Hitler’s National Socialists—the Nazis. Championing German racial supremacy, fascist government, and military expansionism, Hitler rose to power and, after earlier aborted attempts to seize power, became Chancellor in 1933, whereupon the Nazis conquered German institutions. Democratic traditions were smashed. Leftist groups were purged. Hitler repudiated the punitive damages and strict military limitations of the Treaty of Versailles. He rebuilt the German military and navy. He reoccupied regions lost during the war and re-militarized the Rhineland, along the border with France. When the Spanish Civil War broke out in 1936, Hitler and Mussolini—the fascist Italian leader who had risen to power in the 1920s—intervened on behalf of the Spanish fascists, helping to topple Spain’s left-leaning Republican government. Britain and France stood by warily and began to rebuild their militaries, anxious in the face of a renewed Germany but still unwilling to draw Europe into another bloody war. ((On the origins of World War II in Europe, see, for instance, P. M. H. Bell, The Origins of the Second World War in Europe (New York: Routledge, 1986).))

In his autobiographical manifesto, Mein Kampf, Hitler advocated for the unification of Europe’s German peoples under one nation and that nation’s need for lebensraum, or living space, particularly in Eastern Europe, to supply Germans with the land and resources needed for future prosperity. The untermenschen (“lesser” humans) would have to go. Once in power, Hitler worked toward the twin goals of unification and expansion.

"Adolf Hitler salutes troops of the Condor Legion who fought alongside Spanish Nationalists in the Spanish Civil War, during a rally upon their return to Germany, 1939." Hugo Jaeger—Time & Life Pictures/Getty Images. http://life.time.com/world-war-ii/nazi-propaganda-and-the-myth-of-aryan-invincibility/#ixzz2Wd38MUY9


Huge rallies like this one in Nuremberg displayed the sheer number of armed and ready troops and instilled a fierce loyalty to (or fearful silence about) Hitler and the National Socialist Party in Germany. Photograph, November 9, 1935. Wikimedia, http://commons.wikimedia.org/wiki/File:Reichsparteitag_1935.jpg.


In 1938 Germany annexed Austria and set its sights on the Sudetenland, a large, ethnically German area of Czechoslovakia. Britain and France, alarmed but still anxious to avoid war, agreed—without Czechoslovakia’s input—that Germany could annex the region in return for a promise to stop all future German aggression. It was thought that Hitler could be appeased, but it soon became clear that his ambitions for German expansion would not be sated. In March 1939, Hitler took the rest of Czechoslovakia and began to make demands on Poland. Britain and France promised war. And war came.

Hitler signed a secret agreement—the Molotov–Ribbentrop Pact—with the Soviet Union that coordinated the splitting of Poland between the two powers and promised non-aggression thereafter. The European war began when the German Wehrmacht invaded Poland on September 1, 1939. Britain and France declared war two days later and mobilized their armies. Britain and France hoped that the Poles could hold out for three to four months, enough time for the Allies to intervene. Poland fell in three weeks. The Germans, anxious to avoid the rigid, grinding war of attrition that took so many millions in the stalemate of WWI, had built their new, modern army for speed and maneuverability. German doctrine emphasized the use of tanks, planes, and motorized infantry (infantry that used trucks for transportation instead of marching) to concentrate forces, smash front lines, and wreak havoc behind the enemy’s defenses. It was called blitzkrieg, or lightning war.

After the fall of Poland, France and its British allies braced for an inevitable German attack. Throughout the winter of 1939–40, however, fighting was mostly confined to smaller fronts in Norway. Belligerents called it the Sitzkrieg (sitting war). But in May 1940, Hitler launched his attack into Western Europe. Mirroring Germany’s Schlieffen Plan of 1914 in the previous war, the Wehrmacht attacked through the Netherlands and Belgium to avoid the prepared French defenses along the French-German border. Poland had fallen in three weeks; France lasted only a few weeks more. By June, Hitler was posing for photographs in front of the Eiffel Tower. Germany split France in half: Germany occupied and governed the north, while the south was ruled by a puppet government in Vichy.

With France under heel, Hitler turned to Britain. Operation Sea Lion—the planned German invasion of the British Isles—required air superiority over the English Channel. From June until October the German Luftwaffe fought the Royal Air Force (RAF) for control of the skies. Despite having fewer planes, British pilots won the so-called Battle of Britain, saving the islands from immediate invasion and prompting the new prime minister, Winston Churchill, to declare, “Never in the field of human conflict was so much owed by so many to so few.”

The German aerial bombing of London left thousands homeless, hurt, or dead. This child sits among the rubble with a rather quizzical look on his face, as adults ponder their fate in the background. Toni Frissell, “[Abandoned boy holding a stuffed toy animal amid ruins following German aerial bombing of London],” 1945. Library of Congress, http://www.loc.gov/pictures/item/2008680191/.


If Britain was safe from invasion, it was not immune from additional air attacks. Stymied in the Battle of Britain, Hitler began the Blitz—a bombing campaign against cities and civilians. Hoping to crush the British will to fight, the Luftwaffe bombed the cities of London, Liverpool, and Manchester every night from September to the following May. Children were sent far into the countryside to live with strangers to shield them from the bombings. Remaining residents took refuge in shelters and subway tunnels, emerging each morning to put out fires and bury the dead. The Blitz ended in June 1941, when Hitler, confident that Britain was temporarily out of the fight, launched Operation Barbarossa—the invasion of the Soviet Union.

 

 

Hoping to capture agricultural lands, seize oil fields, and break the military threat of Stalin’s Soviet Union, Hitler broke the two powers’ 1939 non-aggression pact and, on June 22, invaded the Soviet Union. It was the largest land invasion in history. France and Poland had fallen in weeks, and German officials hoped to break Russia before the winter. And initially, the blitzkrieg worked. The German military quickly conquered enormous swaths of land and netted hundreds of thousands of prisoners. But Russia was too big and the Soviets were willing to sacrifice millions to stop the fascist advance. After recovering from the initial shock of the German invasion, Stalin moved his factories east of the Urals, out of range of the Luftwaffe. He ordered his retreating army to adopt a “scorched earth” policy, to move east and destroy food, rails, and shelters to stymie the advancing German army. The German army slogged forward. It split into three pieces and stood at the gates of Moscow, Stalingrad, and Leningrad, but supply lines now stretched thousands of miles, Soviet infrastructure had been destroyed, partisans harried German lines, and the brutal Russian winter arrived. Germany had won massive gains, but winter found its army exhausted and overextended. In the north, the German army starved Leningrad to death during an interminable siege; in the south, at Stalingrad, the two armies bled themselves to death in the destroyed city; and, in the center, on the outskirts of Moscow, in sight of the capital city, the German army faltered and fell back. It was the Soviet Union that broke Hitler’s army. Twenty-five million Soviet soldiers and civilians died during the “Great Patriotic War,” and roughly 80% of all German casualties during the war came on the Eastern Front. The German army and its various conscripts suffered 850,000 casualties at the Battle of Stalingrad alone. In December 1941, Germany began its long retreat.
((Antony Beevor, Stalingrad: The Fateful Siege, 1942-1943 (New York: Penguin, 1999); Omer Bartov. The Eastern Front, 1941–45: German Troops and the Barbarization of Warfare (New York: Palgrave Macmillan, 1986); Catherine Merridale, Ivan’s War: Life and Death in the Red Army, 1939-1945 (New York: Picador, 2006).))

 

IV. The United States and the European War

While Hitler marched across Europe, the Japanese continued their war in the Pacific. In 1939 the United States dissolved its trade treaties with Japan, and in 1940 it cut off supplies of necessary war materials by embargoing oil, steel, rubber, and other vital goods. It was hoped that economic pressure would shut down the Japanese war machine. Instead, Japan’s resource-starved military launched invasions across the Pacific to sustain its war effort. The Japanese called their new empire the Greater East Asia Co-Prosperity Sphere and, with the cry of “Asia for the Asians,” made war against European powers and independent nations throughout the region. Diplomatic relations between Japan and the United States collapsed. The United States demanded Japan withdraw from China; Japan considered the oil embargo a de facto declaration of war. ((Herbert Feis, The Road to Pearl Harbor: The Coming of the War Between the United States and Japan (Princeton: Princeton University Press, 1950).))

Japanese military planners, believing that American intervention was inevitable, planned a coordinated Pacific offensive to neutralize the United States and the European powers and provide time for Japan to complete its conquests and fortify its positions. On the morning of December 7, 1941, the Japanese launched a surprise attack on the American naval base at Pearl Harbor, Hawaii. Japanese military planners hoped to destroy enough battleships and aircraft carriers to cripple American naval power for years. More than 2,400 Americans were killed in the attack.

American isolationism fell at Pearl Harbor. Japan had assaulted Hong Kong, the Philippines, and American holdings throughout the Pacific, but it was the attack on Hawaii that threw the United States into a global conflict. Franklin Roosevelt called December 7 “a date which will live in infamy” and called for a declaration of war, which Congress answered within hours. Within a week of Pearl Harbor the United States had declared war on the entire Axis, turning two previously separate conflicts into a true world war.

The American war began slowly. Britain had stood alone militarily in Europe, but American supplies had bolstered its resistance. Hitler unleashed his U-boat “wolf packs” into the Atlantic Ocean with orders to sink anything carrying aid to Britain, but superior Allied tactics and technology won the Battle of the Atlantic. British code breakers cracked Germany’s radio codes, and the surge of intelligence, dubbed Ultra, coupled with massive naval convoys escorted by destroyers armed with sonar and depth charges, gave the advantage to the Allies. By 1942, Hitler’s Kriegsmarine was losing ships faster than they could be built. ((For the United States in the European front, see, for instance, John Keegan, The Second World War (New York: Viking, 1990); Gerhard L. Weinberg, A World at Arms: A Global History of World War II (Cambridge: Cambridge University Press, 2005).))

In North Africa in 1942, the British victory at El Alamein began pushing the Germans back. In November, the first American combat troops entered the European war, landing in French Morocco and pushing the Germans east while the British pushed west. ((Rick Atkinson, An Army at Dawn: The War in North Africa, 1942–1943 (New York: Henry Holt, 2002).)) By 1943, the Allies had pushed Axis forces out of Africa. In January President Roosevelt and Prime Minister Churchill met at Casablanca to discuss the next step of the European war. Churchill convinced Roosevelt to chase the Axis up Italy, into the “soft underbelly” of Europe. Afterward, Roosevelt announced to the press that the Allies would accept nothing less than unconditional surrender.

Meanwhile, the Army Air Force (AAF) sent hundreds (and eventually thousands) of bombers to England in preparation for a massive strategic bombing campaign against Germany. The plan was to bomb Germany around the clock. American bombers hit German ball-bearing factories, rail yards, oil fields, and manufacturing centers during the day, while the British Royal Air Force (RAF) carpet-bombed German cities at night. Flying in formation, the bombers initially flew unescorted, since many believed that bombers equipped with defensive firepower flew too high and too fast to be attacked. Advanced German technology, however, allowed fighters to easily shoot down the lumbering bombers. On some disastrous missions, the Germans shot down almost 50% of American aircraft. The advent of long-range escort fighters let the bombers hit their targets more accurately while the escorts confronted opposing German aircraft.

In 1943, Allied forces began a bombing campaign against railroad and oil targets at Ploiești, Romania, part of the wider policy of bombing expeditions meant to incapacitate German transportation and fuel supplies. Ploiești was considered the number one oil target in Europe. Photograph, August 1, 1943. Wikimedia, http://commons.wikimedia.org/wiki/File:B-24D%27s_fly_over_Polesti_during_World_War_II.jpg.


Bombings throughout Europe caused complete devastation in some areas, leveling beautiful ancient cities like Cologne, Germany. Cologne experienced an astonishing 262 separate air raids by Allied forces, leaving the city in ruins, as in the photograph above. Amazingly, the Cologne Cathedral stands nearly undamaged even after being hit numerous times, while the area around it crumbles. Photograph, April 24, 1945. Wikimedia, http://commons.wikimedia.org/wiki/File:Koeln_1945.jpg.


In the wake of the Soviets’ victory at Stalingrad, the “Big Three” (Roosevelt, Churchill, and Stalin) met in Tehran in November 1943. Dismissing Africa and Italy as a sideshow, Stalin demanded that Britain and the United States invade France to relieve pressure on the Eastern Front. Churchill was hesitant, but Roosevelt was eager. The invasion was tentatively scheduled for 1944.

Back in Italy, the “soft underbelly” turned out to be much tougher than Churchill had imagined. Italy’s narrow, mountainous terrain gave the defending Axis the advantage. Movement up the peninsula was slow, and in some places conditions returned to the trench-like warfare of WWI. The Americans attempted to land troops behind the German lines at Anzio, on the western coast of Italy, but they became surrounded and suffered heavy casualties. Still, the Allies pushed up the peninsula, Mussolini’s government collapsed, and a new Italian government quickly made peace.

On the day the American army entered Rome, American, British, and Canadian forces launched Operation Overlord, the long-awaited invasion of France. D-Day, as it became popularly known, was the largest amphibious assault in history. American general Dwight Eisenhower was uncertain enough of the attack’s chances that the night before the invasion he wrote two speeches: one for success and one for failure. The Allied landings at Normandy were successful, and although progress across France was much slower than hoped for, Paris was liberated roughly two months later. Allied bombing expeditions meanwhile continued to level German cities and industrial capacity. Perhaps 400,000 German civilians were killed by Allied bombing. ((Max Hastings, Overlord: D-Day and the Battle for Normandy (New York: Simon and Schuster, 1985).))

 

The Nazis were crumbling on both fronts. Hitler tried but failed to turn the war in his favor in the west. The Battle of the Bulge failed to drive the Allies back to the English Channel, but the delay cost the Allies the winter. The invasion of Germany would have to wait, while the Soviet Union continued its relentless push westward, ravaging German populations in retribution for German war crimes. ((Richard Overy, Why the Allies Won (New York: W. W. Norton, 1997).))

German counterattacks in the east failed to dislodge the Soviet advance, destroying any last chance Germany might have had to regain the initiative. The year 1945 dawned with the end of the European war in sight. The Big Three met again at Yalta in the Soviet Union, where they reaffirmed the demand for Hitler’s unconditional surrender and began to plan for postwar Europe.

The Soviet Union reached Germany in January, and the Americans crossed the Rhine in March. In late April, American and Soviet troops met at the Elbe while the Soviets, pushed relentlessly by Stalin to reach Berlin first, took the capital city in May, days after Hitler and much of his high command had committed suicide in a Berlin bunker. Germany was conquered. The European war was over. Allied leaders met again, this time at Potsdam, Germany, where it was decided that Germany would be divided into pieces according to current Allied occupation, with Berlin likewise divided, pending future elections. Stalin also agreed to join the fight against Japan in approximately three months. ((Christopher Duffy, Red Storm on the Reich: The Soviet March on Germany, 1945 (New York: Da Capo Press, 1993).))

 

V. The United States and the Japanese War

As Americans celebrated “V.E.” (Victory in Europe) Day, they redirected their full attention to the still-raging Pacific War. As in Europe, the war in the Pacific started slowly. After Pearl Harbor, the American-controlled Philippine archipelago fell to Japan. After running out of ammunition and supplies, the garrison of American and Filipino soldiers surrendered. The prisoners were marched eighty miles to their prisoner-of-war camp without food, water, or rest. Some 10,000 died on the Bataan Death March. ((For the Pacific War, see, for instance, Ronald Spector, Eagle Against the Sun: The American War with Japan (New York: Vintage, 1985); John Keegan, The Second World War (New York: Viking, 1990); John Costello, The Pacific War: 1941-1945 (New York: Harper, 2009); and John W. Dower, War Without Mercy: Race and Power in the Pacific War (New York: Pantheon, 1986).))

But as Americans mobilized their armed forces, the tide turned. In the summer of 1942, American naval victories at the Battle of the Coral Sea and the aircraft carrier duel at the Battle of Midway crippled Japan’s Pacific naval operations. To dislodge Japan’s hold over the Pacific, the US military began island hopping: attacking island after island, bypassing the strongest but seizing those capable of holding airfields to continue pushing Japan out of the region. Combat was vicious. At Guadalcanal American soldiers saw Japanese soldiers launch suicidal charges rather than surrender. Many Japanese soldiers refused to be taken prisoner or to take prisoners themselves. Such tactics, coupled with American racial prejudice, turned the Pacific Theater into a more brutal and barbarous conflict than the European Theater ((Dower, War Without Mercy.)).

Japanese defenders fought tenaciously. Few battles were as one-sided as the Battle of the Philippine Sea, a Japanese counterattack that Americans dubbed the “Great Marianas Turkey Shoot,” but Japanese soldiers bled the Americans in their advance across the Pacific. At Iwo Jima, an eight-square-mile island of volcanic rock, 17,000 Japanese soldiers held the island against 70,000 marines for over a month. At the cost of nearly their entire force, they inflicted almost 30,000 casualties before the island was lost.

By February 1945, American bombers were in range of the Japanese mainland. Bombers hit Japan’s industrial facilities but suffered high casualties. To spare bomber crews from dangerous daylight raids, and to achieve maximum effect against Japan’s wooden cities, many American bombers dropped incendiary weapons that created massive firestorms and wreaked havoc on Japanese cities. Over sixty Japanese cities were firebombed. American firebombs killed 100,000 civilians in Tokyo in March 1945.

In June 1945, after eighty days of fighting and tens of thousands of casualties, the Americans captured the island of Okinawa. It was a viable base from which to launch a full invasion of the Japanese homeland and end the war. The mainland of Japan now lay open before them.

Estimates varied, but, given the tenacity of Japanese soldiers fighting on islands far from their home, some officials estimated that an invasion of the Japanese mainland could cost half a million American casualties and perhaps millions of Japanese civilian lives. Historians debate the many motivations that ultimately drove the Americans to use atomic weapons against Japan, and many American officials criticized the decision, but these would be the numbers later cited by government leaders and military officials to justify their use. ((Michael J. Hogan, Hiroshima in History and Memory (New York: Cambridge University Press, 1996); Gar Alperovitz, The Decision to Use the Atomic Bomb (New York: Vintage, 1996).))

Early in the war, fearing that the Germans might develop an atomic bomb, the U.S. government launched the Manhattan Project, a hugely expensive, ambitious program to harness atomic energy and create a single weapon capable of leveling entire cities. The Americans successfully exploded the world’s first nuclear device, Trinity, in New Mexico in July 1945. (Physicist J. Robert Oppenheimer, the director of the Los Alamos Laboratory, where the bomb was designed, later recalled that the event reminded him of Hindu scripture: “Now I am become death, the destroyer of worlds.”) Two more bombs—“Little Boy” and “Fat Man”—were built and detonated over two Japanese cities in August. Hiroshima was hit on August 6th. Over 100,000 civilians were killed. Nagasaki followed on August 9th. Perhaps 80,000 civilians were killed. ((Ibid.))

Emperor Hirohito announced the surrender of Japan on August 14th. On September 2, aboard the battleship USS Missouri, delegates from the Japanese government formally signed their surrender. World War II was finally over.

 

VI. Soldiers’ Experiences

Almost eighteen million men served in World War II. Volunteers rushed to join the military after Pearl Harbor, but the majority—over ten million—were drafted into service. Volunteers could express their preference for assignment, and many preempted the draft by volunteering. Regardless, those recruits judged I-A, “fit for service,” were moved into basic training, where they were conditioned physically and trained in the basic use of weapons and military equipment. Soldiers were indoctrinated into the chain of command and introduced to military life. After basic, soldiers moved on to more specialized training. Combat infantrymen, for example, received additional weapons and tactical training, and radio operators learned transmission codes and the operation of field radios. Afterward, an individual’s experience varied depending on which service he entered and to which theater he was assigned. ((Works on the experiences of World War II soldiers are seemingly endless and include popular histories such as Stephen E. Ambrose’s Citizen Soldiers (New York: Simon & Schuster, 1997) and memoirs such as Eugene Sledge’s With the Old Breed: At Peleliu and Okinawa (New York: Presidio Press, 1981).))

Soldiers and marines bore the brunt of on-the-ground combat. After transportation to the front by trains, ships, and trucks, they could expect to march carrying packs weighing anywhere from 20 to 50 pounds of rations, ammunition, bandages, tools, clothing, and miscellaneous personal items in addition to their weapons. Sailors, once deployed, spent months at sea operating their assigned vessels. Larger ships, particularly aircraft carriers, were veritable floating cities. In most, sailors lived and worked in cramped conditions, often sleeping in bunks stacked in rooms housing dozens of sailors. Senior officers received small rooms of their own. Sixty thousand American sailors lost their lives in the war.

During World War II the Air Force was still a branch of the U.S. Army, and its soldiers served on ground crews and air crews. World War II saw the institutionalization of massive bombing campaigns against cities and industrial production. Large bombers like the B-17 Flying Fortress required pilots, navigators, bombardiers, radio operators, and four dedicated machine gunners. Crews on bombing raids left from bases in England, Italy, or the Pacific islands and endured hours of flight before approaching enemy territory. At high altitude, and without pressurized cabins, crews used oxygen tanks to breathe, and on-board temperatures plummeted. Once in enemy airspace, crews confronted enemy fighters and anti-aircraft “flak” from the ground. While fighter pilots flew as escorts, the Air Corps suffered heavy casualties. Tens of thousands of airmen lost their lives.

On-the-ground conditions varied. Soldiers in Europe endured freezing winters, impenetrable French hedgerows, Italian mountain ranges, and dense forests. Germans fought with a Western mentality familiar to Americans. Soldiers in the Pacific endured heat and humidity, monsoons, jungles, and tropical diseases. And they confronted an unfamiliar foe. Americans, for instance, could understand surrender as prudent; many Japanese soldiers saw it as cowardice. What Americans saw as a fanatical waste of life, the Japanese saw as brave and honorable. Atrocities flourished in the Pacific at a level unmatched in Europe.

 

VII. The Wartime Economy

Economies win wars no less than militaries. The war converted American factories to wartime production, reawakened Americans’ economic might, armed Allied belligerents and the American armed forces, effectively pulled America out of the Great Depression, and ushered in an era of unparalleled economic prosperity. ((See, for instance, Michael Adams, The Best War Ever: America and World War II (Baltimore: Johns Hopkins University Press, 1994); Mark Harrison, editor, The Economics of World War II: Six Great Powers in International Comparison (Cambridge: Cambridge University Press, 1998); David M. Kennedy, Freedom from Fear: The American People in Depression and War, 1929–1945 (New York: Oxford University Press, 1999).))

Roosevelt’s New Deal had ameliorated the worst of the Depression, but the economy still limped forward through the late 1930s. Then Europe fell into war, and, despite the country’s isolationism, Americans were glad to sell the Allies arms and supplies. And then Pearl Harbor changed everything. The United States drafted the economy into war service. The “sleeping giant” mobilized its unrivaled economic capacity to wage worldwide war. Governmental entities such as the War Production Board and the Office of War Mobilization and Reconversion managed economic production for the war effort, and economic output exploded. An economy that had been unable to provide work for a quarter of the workforce less than a decade earlier now struggled to fill vacant positions.

Government spending during the four years of war doubled all federal spending in all of American history up to that point. The budget deficit soared, but, just as Depression Era economists had counseled, the government’s massive intervention annihilated unemployment and propelled growth. The economy that came out of the war looked nothing like the one that had begun it.

Military production came at the expense of the civilian consumer economy. Appliance and automobile manufacturers converted their plants to produce weapons and vehicles. Consumer choice was foreclosed. Every American received rationing cards and, legally, goods such as gasoline, coffee, meat, cheese, butter, processed food, firewood, and sugar could not be purchased without them. The housing industry was shut down, and the cities became overcrowded.

But the wartime economy boomed. The Roosevelt administration urged citizens to save their earnings or buy war bonds to prevent inflation. Bond drives were held nationally and headlined by Hollywood celebrities. Such drives were hugely successful. They not only funded much of the war effort, they helped to tame inflation as well. So too did tax rates. The federal government raised income taxes and boosted the top marginal tax rate to 94%.

As in World War I, citizens were urged to buy war bonds to support the effort overseas. Rallies like this one appealed to Americans’ sense of patriotism. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/5/5b/A_war_bond_rally_during_World_War_II_-_NARA_-_197250.jpg.


With the economy booming and twenty million American workers placed into military service, unemployment virtually disappeared. And yet limits remained. Many defense contractors still refused to hire black workers. In 1941, A. Philip Randolph threatened to lead a march on Washington in protest, compelling Roosevelt to issue Executive Order 8802, which established the Fair Employment Practices Committee to combat racial discrimination in the federal government and the defense industry. ((William P. Jones, The March on Washington: Jobs, Freedom, and the Forgotten History of Civil Rights (New York: Norton, 2013).))

During the war, African Americans continued to leave the agrarian South for the industrial North. As men joined the military and positions went unfilled, women joined the workforce en masse. American producers also looked outside the United States, southward, to Mexico, to fill their labor force. Between 1942 and 1964, the United States contracted thousands of Mexican nationals to work in American agriculture and railroads in the Bracero Program. Jointly administered by the State Department, the Department of Labor, and the Department of Justice, the binational agreement secured five million contracts across twenty-four states. ((Deborah Cohen, Braceros: Migrant Citizens and Transnational Subjects in the Postwar United States and Mexico (Chapel Hill: University of North Carolina Press, 2011).))

With factory work proliferating across the country and agricultural labor experiencing severe labor shortages, the presidents of Mexico and the U.S. signed an agreement in July 1942 to bring the first group of legally contracted workers to California. Discriminatory policies towards people of Mexican descent prevented bracero contracts in Texas until 1947. The Bracero Program survived the war, enshrined in law until the 1960s, when the United States liberalized its immigration laws. Though braceros suffered exploitative labor conditions, for the men who participated the program was a mixed blessing. Interviews with ex-braceros captured the complexity. “They would call us pigs … they didn’t have to treat us that way,” one said of his employers, while another said, “For me it was a blessing, the United States was a blessing … it is a nation I fell in love with because of the excess work and good pay.” ((Interview with Rogelio Valdez Robles by Valerie Martinez and Lydia Valdez, transcribed by Nancy Valerio, September 21, 2008; Interview with Alvaro Hernández by Myrna Parra-Mantilla, February 5, 2003, Interview No. 33, Institute of Oral History, University of Texas at El Paso.)) After the exodus of Mexican migrants during the Depression, the program helped to reestablish Mexican migration, institutionalized migrant farm work across much of the country, and further planted a Mexican presence in the southern and western United States.

 

VIII. Women and World War II

President Franklin D. Roosevelt and his administration had encouraged all able-bodied American women to help the war effort. He considered the role of women in the war critical for American victory, and the public expected women to assume various functions to free men for active military service. While the majority of women opted to remain at home or volunteer with charitable organizations, many went to work or donned a military uniform.

World War II brought unprecedented labor opportunities for American women. Industrial labor, an occupational sphere dominated by men, shifted in part to women for the duration of wartime mobilization. Women applied for jobs in converted munitions factories. The iconic illustrated image of “Rosie the Riveter,” a muscular woman dressed in coveralls with her hair in a kerchief, inscribed with the phrase “We Can Do It!,” would come to stand for female factory labor during the war. But women also worked in various auxiliary positions for the government. Though administrative work was traditionally gendered female, over a million administrative jobs at the local, state, and national levels were transferred from men to women for the duration of the war. ((Alecea Standlee, “Shifting Spheres: Gender, Labor, and the Construction of National Identity in U.S. Propaganda during the Second World War,” Minerva Journal of Women and War 4 (Spring 2010): 43-62.))

Women entered the workforce in greater numbers than ever before during WWII. With vacancies left by deployed men and new positions created by war production, posters like this iconic “We Can Do It!” image urged women to support the war effort by going to work in America’s factories. Poster for Westinghouse, 1942. Wikimedia, http://commons.wikimedia.org/wiki/File:We_Can_Do_It!.jpg.


For women who elected not to work, many volunteer opportunities presented themselves. The American Red Cross, the largest charitable organization in the nation, encouraged women to volunteer with local city chapters. Millions of women organized community social events for families, packed and shipped almost half a million tons of medical supplies overseas, and prepared twenty-seven million care packages of nonperishable items for American and other Allied prisoners of war. The American Red Cross further required all women volunteers to certify as nurse’s aides, providing an extra benefit and work opportunity for hospital staffs that suffered severe manpower losses. Other charity organizations, such as church and synagogue affiliates, benevolent associations, and social club auxiliaries, gave women further outlets for volunteer work.

Military service was another option for women who wanted to join the war effort. Over 350,000 women served in several all-female units of the military branches. The Army and Navy Nurse Corps Reserves, the Women’s Army Auxiliary Corps, the Navy’s Women Accepted for Volunteer Emergency Service, the Coast Guard’s SPARs (named for the Coast Guard motto, Semper Paratus, “Always Ready”), and Marine Corps units gave women the opportunity to serve as either commissioned officers or enlisted members at military bases at home and abroad. The Nurse Corps Reserves alone commissioned 105,000 Army and Navy nurses recruited by the American Red Cross. Military nurses worked at base hospitals, mobile medical units, and onboard hospital “mercy” ships. ((Major Jeanne Holm, USAF (Ret.), Women in the Military: An Unfinished Revolution (Novato, CA: Presidio Press, 1982), 21-109; Portia Kernodle, The Red Cross Nurse in Action, 1882-1948 (New York, NY: Harper and Brothers Publishers), 406-453.))

Jim Crow segregation in both the civilian and military sectors remained a problem for black women who wanted to join the war effort. Even after President Roosevelt signed Executive Order 8802 in 1941, supervisors who hired black women still often relegated them to the most menial tasks on factory floors. Segregation was further upheld in factory lunchrooms, and many black women were forced to work at night to keep them separated from white workers. In the military, only the Women’s Army Auxiliary Corps and the Nurse Corps Reserves accepted black women for active service; the Army capped black female officers and enlisted women at ten percent of total strength and segregated black units on active duty. The American Red Cross, meanwhile, recruited only four hundred black nurses for the Army and Navy Nurse Corps Reserves, and black Army and Navy nurses worked in segregated military hospitals on bases stateside and overseas.

And for all of the wartime celebration of Rosie the Riveter, after the war ended the men returned and most women voluntarily left the work force or lost their jobs. Meanwhile, former military women faced a litany of obstacles in obtaining veterans’ benefits during their transition to civilian life. The nation that had called millions of women to its assistance during the four-year crisis hardly stood ready to accommodate their postwar needs and demands.

 

IX. Race and World War II

World War II affected nearly every aspect of life in the United States, and America’s race relations were no exception. African Americans, Mexicans and Mexican Americans, Jews, and Japanese Americans were profoundly impacted.

In early 1941, months before the Japanese attack on Pearl Harbor, A. Philip Randolph, president of the Brotherhood of Sleeping Car Porters, the largest black trade union in the nation, made headlines by threatening President Roosevelt with a march on Washington, D.C. In this “crisis of democracy,” Randolph said, defense industries refused to hire African Americans and the armed forces remained segregated. In exchange for Randolph calling off the march, Roosevelt issued Executive Order 8802, banning racial and religious discrimination in defense industries and establishing the Fair Employment Practices Committee (FEPC) to monitor defense industry hiring practices. While the armed forces would remain segregated throughout the war, and the FEPC had limited influence, the order showed that the federal government could stand against discrimination. The black workforce in defense industries rose from 3 percent in 1942 to 9 percent in 1945. ((William P. Jones, The March on Washington: Jobs, Freedom, and the Forgotten History of Civil Rights (New York: Norton, 2013).))

More than one million African Americans fought in the war. Most blacks served in segregated, non-combat units led by white officers. Some gains were made, however. The number of black officers increased from 5 in 1940 to over 7,000 in 1945. The all-black pilot squadrons, known as the Tuskegee Airmen, completed more than 1,500 missions, escorted heavy bombers into Germany, and earned several hundred merits and medals. Many bomber crews specifically requested the “Red Tail Angels” as escorts. And near the end of the war, the army and navy began integrating some of their platoons and facilities, before, in 1948, the U.S. government finally ordered the full integration of its armed forces. ((Stephen Tuck, Fog of War: The Second World War and the Civil Rights Movement (New York: Oxford University Press, 2012); Daniel Kryder, Divided Arsenal: Race and the American State during World War II (New York: Cambridge University Press, 2000).))

The Tuskegee Airmen stand at attention as Major James A. Ellison returns the salute of Mac Ross, one of the first graduates of the Tuskegee cadets. The photograph shows the pride and poise of the Tuskegee Airmen, who continued a tradition of African Americans honorably serving a country that still considered them second-class citizens. Photograph, 1941. Wikimedia, http://commons.wikimedia.org/wiki/File:First_Tuskeegee_Class.jpg.


While black Americans served in the armed forces (though they were segregated), on the home front they became riveters and welders, rationed food and gasoline, and bought victory bonds. But many black Americans saw the war as an opportunity not only to serve their country but to improve it. The Pittsburgh Courier, a leading black newspaper, spearheaded the “Double V” campaign. It called on African Americans to fight two wars: the war against Nazism and Fascism abroad and the war against racial inequality at home. To achieve victory, to achieve “real democracy,” the Courier encouraged its readers to enlist in the armed forces, volunteer on the home front, and fight against racial segregation and discrimination. ((Andrew Buni, Robert L. Vann of The Pittsburgh Courier: Politics and Black Journalism (University of Pittsburgh Press, 1974).))

During the war, membership in the NAACP jumped tenfold, from 50,000 to 500,000. The Congress of Racial Equality (CORE) was formed in 1942 and spearheaded the use of nonviolent direct action to achieve desegregation. Between 1940 and 1950, some 1.5 million southern blacks, more than in any other decade since the beginning of the Great Migration, also indirectly demonstrated their opposition to racism and violence by migrating out of the Jim Crow South to the North. But transitions were not easy. Racial tensions erupted in 1943 in a series of riots in cities such as Mobile, Beaumont, and Harlem. The bloodiest race riot occurred in Detroit and resulted in the death of 25 blacks and 9 whites. Still, the war ignited in African Americans an urgency for equality that they would carry with them into the subsequent years. ((Dominic J. Capeci, Jr., and Martha Wilkerson, Layered Violence: The Detroit Rioters of 1943 (Jackson: University Press of Mississippi, 1991).))

Many Americans had to navigate American prejudice, and America’s entry into the war left foreign nationals from the belligerent nations in a precarious position. The Federal Bureau of Investigation targeted many on suspicion of disloyalty for detainment, hearings, and possible internment under the Alien Enemy Act. Those who received an order for internment were sent to government camps secured by barbed wire and armed guards. Such internments were supposed to be for cause. Then, on February 19, 1942, President Roosevelt signed Executive Order 9066, authorizing the removal of any persons from designated “exclusion zones”—which ultimately covered nearly a third of the country—at the discretion of military commanders. Thirty thousand Japanese Americans fought for the United States in World War II, but wartime anti-Japanese sentiment reinforced historical prejudices, and, under the order, persons of Japanese descent, both immigrants and American citizens, were detained and placed under the custody of the War Relocation Authority, the civil agency that supervised their relocation to internment camps. They lost their homes and jobs. The policy indiscriminately targeted Japanese-descended populations: individuals received no individual review prior to their internment. This policy of mass exclusion and detention affected over 110,000 individuals, some 70,000 of them American citizens. ((Greg Robinson, By Order of the President: FDR and the Internment of Japanese Americans (Cambridge: Harvard University Press, 2001).))

In its 1982 report, Personal Justice Denied, the congressionally appointed Commission on Wartime Relocation and Internment of Civilians concluded that “the broad historical causes” shaping the relocation program were “race prejudice, war hysteria, and a failure of political leadership.” ((Commission on Wartime Relocation and Internment of Civilians, Personal Justice Denied: Report of the Commission on Wartime Relocation and Internment of Civilians (Seattle: University of Washington Press, 1997 [1988]).)) Although the exclusion orders were found to have been constitutionally permissible under the vagaries of national security, they were later judged, even by the military and judicial leaders of the time, to have been a grave injustice against persons of Japanese descent. In 1988, President Reagan signed a law that formally apologized for internment and provided reparations to surviving internees.

But if actions taken during war would later prove repugnant, so too could inaction. As the Allies pushed into Germany and Poland, they uncovered the full extent of Hitler’s genocidal atrocities. The Allies liberated massive camp systems set up for the imprisonment, forced labor, and extermination of all those deemed racially, ideologically, or biologically “unfit” by Nazi Germany. But the Holocaust—the systematic murder of 11 million civilians, including 6 million Jews—had been underway for years. How did America respond?

This photograph became one of the most well-known images from WWII. Originally from Jürgen Stroop's May 1943 report to Heinrich Himmler, it circulated throughout Europe and America as an image of the Nazi Party’s brutality. The original German caption read: "Forcibly pulled out of dug-outs". Wikimedia, http://commons.wikimedia.org/wiki/File:Stroop_Report_-_Warsaw_Ghetto_Uprising_06b.jpg.


Initially, American officials expressed little concern for Nazi persecutions. At the first signs of trouble in the 1930s, the State Department and most U.S. embassies did relatively little to aid European Jews. Roosevelt himself publicly spoke out against the persecution, and even withdrew the U.S. ambassador to Germany after Kristallnacht. He pushed for the 1938 Evian Conference in France, at which international leaders discussed the Jewish refugee problem and worked to expand Jewish immigration quotas by tens of thousands of people per year. But the conference came to nothing, and the United States turned away countless Jewish refugees who requested asylum in the United States.

In 1939, the German ship St. Louis carried over 900 Jewish refugees. They could not find a country that would take them. The passengers could not receive visas under the United States’ quota system. A State Department wire to one passenger read that all must “await their turns on the waiting list and qualify for and obtain immigration visas before they may be admissible into the United States.” The ship cabled the president for special privilege, but the president said nothing. The ship was forced to return to Europe. Hundreds of the St. Louis’s passengers would perish in the Holocaust.

Anti-Semitism still permeated the United States. Even if Roosevelt wanted to do more—it’s difficult to trace his own thoughts and personal views—he judged the political price for increasing immigration quotas as too high. In 1938 and 1939 the U.S. Congress debated the Wagner-Rogers Bill, an act to allow 20,000 German-Jewish children into the United States. First Lady Eleanor Roosevelt endorsed the measure but the president remained publicly silent. The bill was opposed by roughly two-thirds of the American public and was defeated. Historians speculate that Roosevelt, anxious to protect the New Deal and his rearmament programs, was unwilling to expend political capital to protect foreign groups that the American public had little interest in protecting. ((Richard Breitman and Allan J. Lichtman, FDR and the Jews (Cambridge: The Belknap Press of Harvard University Press, 2013), 149.))

Knowledge of the full extent of the Holocaust was slow in coming. When the war began, American officials, including Roosevelt, doubted initial reports of industrial death camps. But even when they conceded their existence, officials pointed to their genuinely limited options. The most plausible response was for the U.S. military to bomb either the camps or the railroads leading to them, but those options were rejected by military and civilian officials who argued that doing so would do little to stop the deportations, would distract from the war effort, and could cause casualties among concentration camp prisoners. Whether bombing would have saved lives remains a hotly debated question. ((Peter Novick, The Holocaust in American Life (New York: Houghton Mifflin, 1999).))

Late in the war, Secretary of the Treasury Henry Morgenthau, himself born into a wealthy New York Jewish family, pushed through major changes in American policy. In 1944, he formed the War Refugee Board (WRB) and became a passionate advocate for Jewish refugees. The efforts of the WRB saved perhaps 200,000 Jews and 20,000 others. Morgenthau also convinced Roosevelt to issue a public statement condemning the Nazis’ persecution. But it was already 1944, and such policies were far too little, far too late. ((David Mayers, Dissenting Voices in America’s Rise to Power (Cambridge: Cambridge University Press, 2007), 274.))

 

X. Toward a Postwar World

Americans celebrated the end of the war. At home and abroad, the United States looked to create a postwar order that would guarantee global peace and domestic prosperity. Although the alliance-of-convenience with Stalin’s Soviet Union would collapse, Americans nevertheless looked for the means to ensure postwar stability and economic security for returning veterans.

The inability of the League of Nations to stop German, Italian, and Japanese aggressions caused many to question whether any global organization or agreements could ever ensure world peace. This included Franklin Roosevelt who, as Woodrow Wilson’s Assistant Secretary of the Navy, had witnessed the rejection of the League by both the American people and the Senate. In 1941, Roosevelt believed that postwar security could be maintained by an informal agreement between what he termed “the Four Policemen”—the U.S., Britain, the Soviet Union, and China—instead of a rejuvenated League of Nations. But others, including Secretary of State Cordell Hull and British Prime Minister Winston Churchill, disagreed and convinced Roosevelt to push for a new global organization. As the war ran its course, Roosevelt came around to the idea. And so did the American public. Pollster George Gallup noted a “profound change” in American attitudes. The United States had rejected membership in the League of Nations after World War I, and in 1937 only a third of Americans polled supported such an idea. But as war broke out in Europe, half of Americans did. America’s entry into the war bolstered support, and, by 1945, with the war closing, 81% of Americans favored the idea. ((Fraser Harbutt, Yalta 1945: Europe and America at the Crossroads of Peace (Cambridge, UK: Cambridge University Press, 2010), 258; Mark Mazower, Governing the World: The History of a Modern Idea (New York: Penguin Press, 2012), 208.))

Public support aside, Roosevelt had long shown enthusiasm for the ideas later enshrined in the United Nations charter. In January 1941, he announced his Four Freedoms—freedom of speech, of worship, from want, and from fear—that all of the world’s citizens should enjoy. That same year he signed the Atlantic Charter with Churchill, which reinforced those ideas, added the right of self-determination, and promised some sort of postwar economic and political cooperation. Roosevelt first used the phrase “united nations” to describe the Allied powers, not the subsequent postwar organization. But the name stuck. At Tehran in 1943, Roosevelt and Churchill convinced Stalin to send a Soviet delegation to a conference at Dumbarton Oaks, outside Washington D.C., in August 1944, where they agreed on the basic structure of the new organization. It would have a Security Council—the original “four policemen,” plus France—which would consult on how best to keep the peace and when to deploy the military power of the assembled nations. According to one historian, the organization demonstrated an understanding that “only the Great Powers, working together, could provide real security.” But the plan was a kind of hybrid between Roosevelt’s policemen idea and a global organization of equal representation. There would also be a General Assembly, made up of all nations, an International Court of Justice, and a council for economic and social matters. Dumbarton Oaks was a mixed success—the Soviets especially expressed concern over how the Security Council would work—but the powers agreed to meet again in San Francisco between April and June 1945 for further negotiations. There, on June 26, 1945, fifty nations signed the UN charter. ((Paul Kennedy, The Parliament of Man: The Past, Present, and Future of the United Nations (New York: Random House, 2006).))

Anticipating victory in World War II, leaders not only looked to the postwar global order, they looked to the fate of returning American servicemen. American politicians and interest groups sought to avoid another economic depression—the economy had tanked after World War I—by gradually easing returning veterans back into the civilian economy. The brainchild of the head of the American Legion, Warren Atherton, the G.I. Bill won support from progressives and conservatives alike. Passed in 1944, the G.I. Bill was a multifaceted, multi-billion-dollar entitlement program that rewarded honorably discharged veterans with numerous benefits. ((Kathleen Frydl, The G.I. Bill (Cambridge: Cambridge University Press, 2009); Suzanne Mettler, Soldiers to Citizens: The G.I. Bill and the Making of the Greatest Generation (Oxford University Press, 2005).))

Faced with the prospect of over 15 million members of the armed services (including approximately 350,000 women) suddenly returning to civilian life, the G.I. Bill offered a bevy of inducements to slow their influx into the civilian workforce and to reward their service with public benefits. The legislation offered a year’s worth of unemployment benefits for veterans unable to secure work. About half of American veterans (8 million) received $4 billion in unemployment benefits over the life of the bill. The G.I. Bill also made post-secondary education a reality for many. The Veterans Administration (VA) paid the lion’s share of educational expenses, including tuition, fees, supplies, and even stipends for living expenses. The G.I. Bill sparked a boom in higher education. Enrollments at accredited colleges, universities, and technical and professional schools spiked, rising from 1.5 million in 1940 to 3.6 million in 1960. The VA disbursed over $14 billion in educational aid in just over a decade. Furthermore, the bill encouraged home ownership. Roughly 40 percent of Americans owned homes in 1945, but that figure climbed to 60 percent a decade after the close of the war. With down-payment requirements waived, veterans could obtain home loans for as little as $1 down. Close to 4 million veterans purchased homes through the G.I. Bill, sparking a construction bonanza that fueled postwar growth. In addition, the VA also helped nearly 200,000 veterans secure farms and offered thousands more guaranteed financing for small businesses. ((Ibid.))

Not all Americans, however, benefitted equally from the G.I. Bill. Indirectly, because the military limited the number of female personnel, men qualified for the bill’s benefits in far higher numbers. Colleges also limited the number of female applicants to guarantee space for male veterans. African Americans, too, faced discrimination. Segregation forced black veterans into overcrowded “historically black colleges” that had to turn away close to 20,000 applicants. Meanwhile, residential segregation limited black home ownership in various neighborhoods, denying black veterans the equity and investment that came with home ownership. There were other limits, and other disadvantaged groups. Veterans accused of homosexuality, for instance, were similarly unable to claim G.I. benefits. ((Lizabeth Cohen, A Consumer’s Republic: The Politics of Mass Consumption in Postwar America (New York: Vintage, 2003).))

The effects of the G.I. Bill were significant and long-lasting. It helped to sustain the great postwar economic boom and, if many could not attain it, it nevertheless established the hallmarks of American middle class life.

 

XI. Conclusion

The United States entered the war in a crippling economic depression and exited at the beginning of an unparalleled economic boom. The war had been won, the United States was stronger than ever, and Americans looked forward to a prosperous future. And yet new problems loomed. Stalin’s Soviet Union and the proliferation of nuclear weapons would disrupt postwar dreams of global harmony. Meanwhile, Americans who had fought a war for global democracy would find that very democracy eradicated around the world in reestablished colonial regimes and at home in segregation and injustice. The war had unleashed powerful forces, forces that would reshape the United States at home and abroad.

 

Contributors

This chapter was edited by Joseph Locke, with content contributions by Mary Beth Chopas, Andrew David, Ashton Ellett, Paula Fortier, Joseph Locke, Jennifer Mandel, Valerie Martinez, Ryan Menath, Chris Thomas.

 

Recommended Reading

  1. Adams, Michael. The Best War Ever: America and World War II. Baltimore: Johns Hopkins University Press, 1994.
  2. Anderson, Karen. Wartime Women: Sex Roles, Family Relations, and the Status of Women during WWII. Westport: Greenwood Press, 1981.
  3. Blum, John Morton. V Was for Victory: Politics and American Culture during World War II. New York: Marine Books, 1976.
  4. Daniels, Roger. Prisoners without Trial: Japanese Americans in World War II. New York: Hill and Wang, 1993.
  5. Dower, John. War without Mercy: Race and Power in the Pacific War. New York: Pantheon, 1993.
  6. Honey, Maureen. Creating Rosie the Riveter: Class, Gender, and Propaganda during World War II. Amherst: University of Massachusetts Press, 1984.
  7. Keegan, John. The Second World War. New York: Viking Press, 1990.
  8. Kennedy, David. Freedom From Fear: America in Depression and War, 1929-1945. New York: Oxford University Press, 1999.
  9. Leonard, Kevin Allen. The Battle for Los Angeles: Racial Ideology and World War II. Albuquerque: University of New Mexico Press, 2006.
  10. Lichtenstein, Nelson. Labor’s War at Home: The CIO in World War II. New York: Cambridge University Press, 1982.
  11. Malloy, Sean L. Atomic Tragedy: Henry L. Stimson and the Decision to Use the Bomb. Ithaca: Cornell University Press, 2008.
  12. Rhodes, Richard. The Making of the Atomic Bomb. New York: Simon and Schuster, 1988.
  13. Schulman, Bruce J. From Cotton Belt to Sunbelt: Federal Policy, Economic Development, and the Transformation of the South, 1938–1980. New York: Oxford University Press, 1991.
  14. Spector, Ronald H. Eagle against the Sun: The American War with Japan. New York: Random House, 1985.
  15. Takaki, Ronald T. Double Victory: A Multicultural History of America in World War II. New York: Little, Brown, 2000.

 

Notes

23. The Great Depression

"Destitute pea pickers in California. Mother of seven children. Age thirty-two. Nipomo, California," Library of Congress.



I. Introduction

The wonder of the stock market had permeated popular culture throughout the 1920s. Although it was released during the first year of the Great Depression, the 1930 film High Society Blues captured the speculative hope and prosperity of the previous decade. “I’m in the Market for You,” a popular musical number from the film, even used the stock market as a metaphor for love: You’re going up, up, up in my estimation, / I want a thousand shares of your caresses, too. / We’ll count the hugs and kisses, / When dividends are due, / Cause I’m in the market for you.

But, just as the song was being recorded in 1929, the stock market reached the apex of its swift climb, crashed, and brought an abrupt end to the seeming prosperity of the “Roaring ‘20s.” The Great Depression had arrived.

 

II. The Origins of the Great Depression

“Crowd of people gather outside the New York Stock Exchange following the Crash of 1929,” 1929. Library of Congress, http://www.loc.gov/pictures/item/99471695/.


On Thursday, October 24, 1929, stock market prices suddenly plummeted. Ten billion dollars in investments (roughly equivalent to about $100 billion today) disappeared in a matter of hours. Panicked selling set in, stocks sank to record lows, and stunned investors crowded the New York Stock Exchange demanding answers. Leading bankers met privately at the offices of J.P. Morgan and raised millions in personal and institutional contributions to halt the slide. They marched across the street and ceremoniously bought stocks at inflated prices. The market temporarily stabilized, but fears spread over the weekend and the following week frightened investors dumped their portfolios to avoid further losses. On October 29, “Black Tuesday,” the stock market began its long precipitous fall. Stock values evaporated. Shares of U.S. Steel dropped from $262 to $22. General Motors’ stock fell from $73 a share to $8. Four-fifths of John D. Rockefeller’s fortune—the greatest in American history—vanished.

Although the Crash stunned the nation, it exposed the deeper, underlying problems with the American economy in the 1920s. The stock market’s popularity grew throughout the decade, but only 2.5% of Americans had brokerage accounts; the overwhelming majority of Americans had no direct personal stake in Wall Street. The stock market’s collapse, no matter how dramatic, did not by itself depress the American economy. Instead, the Crash exposed a great number of factors which, when combined with the financial panic, sunk the American economy into the greatest of all economic crises. Rising inequality, declining demand, rural collapse, overextended investors, and the bursting of speculative bubbles all conspired to plunge the nation into the Great Depression.

Despite resistance by Progressives, the vast gap between rich and poor accelerated throughout the early-twentieth century. In the aggregate, Americans were better off in 1929 than in 1920. Per capita income had risen 10% for all Americans, but 75% for the nation’s wealthiest citizens. ((George Donelson Moss, The Rise of Modern America: A History of the American People, 1890-1945 (New York: Prentice Hall, 1995), 185-86.)) The return of conservative politics in the 1920s reinforced federal fiscal policies that exacerbated the divide: low corporate and personal taxes, easy credit, and depressed interest rates overwhelmingly favored wealthy investors who, flush with cash, spent their money on luxury goods and speculative investments in the rapidly rising stock market.

The pro-business policies of the 1920s were designed for an American economy built upon the production and consumption of durable goods. Yet, by the late 1920s, much of the market was saturated. The boom of automobile manufacturing, the great driver of the American economy in the 1920s, slowed as nearly all Americans with the means to purchase a car had already done so. More and more, the well-to-do had no need for the new automobiles, radios, and other consumer goods that fueled GDP growth in the 1920s. When products failed to sell, inventories piled up, manufacturers scaled back production, and companies fired workers, stripping potential consumers of cash, blunting demand for consumer goods, and reinforcing the downward economic cycle. The situation was only compounded by increased automation and rising efficiency in American factories. Despite impressive overall growth throughout the 1920s, unemployment hovered around 7% throughout the decade, suppressing purchasing power for a great swath of potential consumers. ((Ibid., 186.))

While a manufacturing innovation, Henry Ford’s assembly line produced so many cars as to flood the automobile market in the 1920s. Interview with Henry Ford, Literary Digest, January 7, 1928. Wikimedia, http://commons.wikimedia.org/wiki/File:Ford_Motor_Company_assembly_line.jpg.


For American farmers, meanwhile, “hard times” began long before the markets crashed. In 1920 and 1921, after several years of larger-than-average profits, farm prices in the South and West began a long decline, plummeting as production climbed and domestic and international demand for cotton, foodstuffs, and other agricultural products stalled. Widespread soil exhaustion on western farms only compounded the problem. Farmers found themselves unable to make payments on loans taken out during the good years, and banks in agricultural areas tightened credit in response. By 1929, farm families were overextended, in no shape to make up for declining consumption, and in a precarious economic position even before the Depression wrecked the global economy. ((Robert S. McElvaine, The Great Depression: America, 1921-1940 (New York: Random House, 1984), 36.))

Despite serious foundational problems in the industrial and agricultural economy, most Americans in 1929 and 1930 still believed the economy would bounce back. In 1930, amid one of the Depression’s many false hopes, President Herbert Hoover reassured an audience that “the depression is over.” ((John Steele Gordon, An Empire of Wealth: The Epic History of American Economic Power (Harper Perennial), 320.)) But the president was not simply guilty of false optimism. Hoover made many mistakes. During his 1928 election campaign, Hoover promoted higher tariffs as a means for encouraging domestic consumption and protecting American farmers from foreign competition. Spurred by the ongoing agricultural depression, Hoover signed into law the highest tariff in American history, the Smoot-Hawley Tariff of 1930, just as global markets began to crumble. Other countries responded in kind, tariff walls rose across the globe, and international trade ground to a halt. Between 1929 and 1932, international trade dropped from $36 billion to only $12 billion. American exports fell by 78%. Combined with overproduction and declining domestic consumption, the tariff exacerbated the world’s economic collapse. ((Moss, 186-87.))

But beyond structural flaws, speculative bubbles, and destructive protectionism, the final contributing element of the Great Depression was a quintessentially human one: panic. The frantic reaction to the market’s fall aggravated the economy’s many other failings. More economic policies backfired. The Federal Reserve over-corrected in its response to speculation by raising interest rates and tightening credit. Across the country, banks denied loans and called in debts. Their patrons, afraid that reactionary policies meant further financial trouble, rushed to withdraw money before institutions could close their doors, ensuring the very failures they feared. Such bank runs were not uncommon in the 1920s, but, in 1930, with the economy worsening and panic from the crash accelerating, 1,352 banks failed. In 1932, nearly 2,300 banks collapsed, taking personal deposits, savings, and credit with them. ((David M. Kennedy, Freedom from Fear: The American People in Depression and War, 1929-1945 (New York: Oxford University Press, 1999), 65, 68.))

The Great Depression was the confluence of many problems, most of which had begun during a time of unprecedented economic growth. Fiscal policies of the Republican “business presidents” undoubtedly widened the gap between rich and poor and fostered a “stand-off” over international trade, but such policies were widely popular and, for much of the decade, widely seen as a source of the decade’s explosive growth. With fortunes to be won and standards of living to maintain, few Americans had the foresight or wherewithal to repudiate an age of easy credit, rampant consumerism, and wild speculation. Instead, as the Depression worked its way across the United States, Americans hoped to weather the economic storm as best they could, waiting for some form of relief, any answer to the ever-mounting economic collapse that strangled so many Americans’ lives.

 

III. Herbert Hoover and the Politics of the Depression

“Unemployed men queued outside a depression soup kitchen opened in Chicago by Al Capone,” February 1931. Wikimedia, http://commons.wikimedia.org/wiki/File:Unemployed_men_queued_outside_a_depression_soup_kitchen_opened_in_Chicago_by_Al_Capone,_02-1931_-_NARA_-_541927.jpg.


As the Depression spread, public blame settled on President Herbert Hoover and the conservative politics of the Republican Party. But Hoover was as much victim as perpetrator, a man who had the misfortune of becoming a visible symbol for large invisible forces. In 1928 Hoover had no reason to believe that his presidency would be any different from that of his predecessor, Calvin Coolidge, whose time in office was marked by relative government inaction, seemingly rampant prosperity, and high approval ratings.

Coolidge had decided not to seek a second term in 1928. A man of few words, “Silent Cal” publicized this decision by handing a scrap of paper to a reporter that simply read: “I do not choose to run for president in 1928.” The race therefore became a contest between the Democratic governor of New York, Al Smith, whose Catholic faith and immigrant background aroused nativist suspicions and whose connections to Tammany Hall and anti-Prohibition politics offended reformers, and the Republican candidate, Herbert Hoover, whose All-American, Midwestern, Protestant background and managerial prowess during the First World War endeared him to American voters. ((Allan J. Lichtman, Prejudice and the Old Politics: The Presidential Election of 1928 (Chapel Hill, NC: University of North Carolina Press, 1979).))

Hoover epitomized the “self-made man.” Orphaned at age 9, he was raised by a strict Quaker uncle on the West Coast. He graduated from Stanford University in 1895 and worked as an engineer for several multinational mining companies. He became a household name during World War I when he oversaw voluntary rationing as the head of the U.S. Food Administration and, after the armistice, served as the Director General of the American Relief Association in Europe. Hoover’s reputation for humanitarian service and problem-solving translated into popular support, even as the public soured on Wilson’s Progressive activism. Hoover was one of the few politicians whose career benefitted from wartime public service. After the war both the Democratic and Republican parties tried to draft him to run for president in 1920. ((Richard Norton Smith, An Uncommon Man: The Triumph of Herbert Hoover (New York: Simon and Schuster, 1987).))

 

Hoover declined to run in 1920 and 1924. He served instead as Secretary of Commerce under both Harding and Coolidge, taking an active role in all aspects of government. In 1928, he seemed the natural successor to Coolidge. Politically, aside from the issue of Prohibition (he was a “dry,” Smith a “wet”), Hoover’s platform differed very little from Smith’s, leaving little to discuss during the campaign except personality and religion. Both benefitted Hoover. Smith’s background engendered opposition from otherwise solid Democratic states, especially in the South, where his Catholic, ethnic, urban, and anti-Prohibition background was anathema. His popularity among urban ethnic voters counted for little. Several southern states, in part owing to itinerant evangelical politicking, voted Republican for the first time since Reconstruction. Hoover won in a landslide, taking nearly 60% of the popular vote. ((Ibid.))

Although Hoover is sometimes categorized as a “business president” in line with his Republican predecessors, he also embraced an inherent business progressivism, a system of voluntary action called “Associationalism” that assumed Americans could maintain a web of voluntary cooperative organizations dedicated to providing economic assistance and services to those in need. Businesses, the thinking went, would willingly limit harmful practices for the greater economic good. To Hoover, direct government aid would discourage a healthy work ethic while Associationalism would encourage the very self-control and self-initiative that fueled economic growth. But when the Depression exposed the incapacity of such strategies to produce an economic recovery, Hoover proved insufficiently flexible to recognize the limits of his ideology. And when the ideology failed, so too did his presidency. ((Kennedy, 70-103.))

Hoover entered office upon a wave of popular support, but by October 1929 the economic collapse had overwhelmed his presidency. Like all too many Americans, Hoover and his advisers assumed—or perhaps simply hoped—that the sharp financial and economic decline was a temporary downturn, another “bust” of the inevitable boom-bust cycles that stretched back through America’s commercial history. Many economists argued that periodic busts culled weak firms and paved the way for future growth. And so when suffering Americans looked to Hoover for help, Hoover could only answer with volunteerism. He asked business leaders to promise to maintain investments and employment and encouraged state and local charities to provide assistance to those in need. Hoover established the President’s Organization for Unemployment Relief, or POUR, to help organize the efforts of private agencies. While POUR urged charitable giving, charitable relief organizations were overwhelmed by the growing needs of the multiplying ranks of unemployed, underfed, and unhoused Americans. By mid-1932, for instance, a quarter of all of New York’s private charities closed: they had simply run out of money. In Atlanta, solvent relief charities could only provide $1.30 per week to needy families. The size and scope of the Depression overpowered the radically insufficient capacity of private volunteer organizations to mediate the crisis. ((Ibid.))

By 1932, with the economy long-since stagnant and a reelection campaign looming, Hoover, hoping to stimulate American industry, created the Reconstruction Finance Corporation to provide emergency loans to banks, building-and-loan societies, railroads, and other private industries. It was radical in its use of direct government aid and out of character for the normally laissez-faire Hoover, but it also bypassed needy Americans to bolster industrial and financial interests. New York Congressman Fiorello LaGuardia, who later served as mayor of New York City, captured public sentiment when he denounced the RFC as a “millionaire’s dole.” ((Ibid., 76.))

IV. The Bonus Army

“Shacks, put up by the Bonus Army on the Anacostia flats, Washington, D.C., burning after the battle with the military. The Capitol in the background. 1932.” Wikimedia, http://commons.wikimedia.org/wiki/File:Evictbonusarmy.jpg.


Hoover’s reaction to a major public protest sealed his legacy. In the summer of 1932, Congress debated a bill authorizing immediate payment of long-promised cash bonuses to veterans of World War I, originally scheduled to be paid out in 1945. Given the economic hardships facing the country, the bonus came to symbolize government relief for the most deserving recipients, and from across the country more than 15,000 unemployed veterans and their families converged on Washington, D.C. They erected a tent city across the Anacostia River on the Anacostia Flats, a “Hooverville” in the spirit of the camps of homeless and unemployed Americans then appearing in American cities.

Concerned with what immediate payment would do to the federal budget, Hoover opposed the bill, which was eventually voted down by the Senate. While most of the “Bonus Army” left Washington in defeat, many stayed to press their case. Hoover called the remaining veterans “insurrectionists” and ordered them to leave. When thousands failed to heed the evacuation order, General Douglas MacArthur, accompanied by local police, infantry, cavalry, tanks, and a machine gun squadron, stormed the tent city and routed the Bonus Army. National media covered the disaster as troops chased down men and women, tear-gassed children, and torched the shantytown. ((Kennedy, 92.))

Hoover’s insensitivity toward suffering Americans, his unwillingness to address widespread economic problems, and his repeated platitudes about returning prosperity condemned his presidency. Hoover of course was not responsible for the Depression, not personally. But neither he nor his advisers conceived of the enormity of the crisis, a crisis his conservative ideology could neither accommodate nor address. As a result, Americans found little relief from Washington. They were on their own.

 

V. The Lived Experience of the Great Depression

"Hooverville, Seattle." 1932-1937. Washington State Archives. http://www.digitalarchives.wa.gov/Record/View/B7A94A0DC95F7B3E0F1081FDB3A72C1E


In 1934 a woman from Humboldt County, California, wrote to First Lady Eleanor Roosevelt seeking a job for her husband, a surveyor, who had been out of work for nearly two years. The pair had survived on the meager income she received from working at the county courthouse. “My salary could keep us going,” she explained, “but—I am to have a baby.” The family needed temporary help, and, she explained, “after that I can go back to work and we can work out our own salvation. But to have this baby come to a home full of worry and despair, with no money for the things it needs, is not fair. It needs and deserves a happy start in life.” ((Mrs. M. H. A. to Eleanor Roosevelt, June 14, 1934, in Robert S. McElvaine, ed., Down and Out in the Great Depression: Letters from the Forgotten Man (Chapel Hill: University of North Carolina Press, 1983), 54–55.))

As the United States slid ever deeper into the Great Depression, such tragic scenes played out time and time again. Individuals, families, and communities faced the painful, frightening, and often bewildering collapse of the economic institutions upon which they depended. The more fortunate were spared the worst effects, and a few even profited from the crisis, but by the end of 1932 it had become so deep and so widespread that most Americans had suffered directly. The markets had crashed through no fault of theirs. Workers were plunged into poverty because of impersonal forces for which they shared no responsibility. With no safety net, they were thrown into economic chaos.

With rampant unemployment and declining wages, Americans slashed expenses. The fortunate could survive by simply deferring vacations and regular consumer purchases. Middle- and working-class Americans might rely upon disappearing credit at neighborhood stores, default on utility bills, or skip meals. Those who could borrowed from relatives, took in boarders, or “doubled up” in tenements. The most desperate, the chronically unemployed, encamped on public or marginal lands in “Hoovervilles,” spontaneous shantytowns that dotted America’s cities, depending upon breadlines and street-corner peddling. Poor women and young children entered the labor force, as they always had. The ideal of the “male breadwinner” had always been a fiction for poor Americans, and the Depression pushed millions of new workers into the search for wages. The emotional and psychological shocks of unemployment and underemployment only added to the material deprivations of the Depression. Social workers and charity officials, for instance, often found the unemployed suffering from feelings of futility, anger, bitterness, confusion, and loss of pride. Such feelings affected the rural poor no less than the urban. ((See especially chapter five of Lizabeth Cohen, Making a New Deal: Industrial Workers in Chicago, 1919–1939 (New York: Cambridge University Press, 1990).))

 

VI. Migration and Immigration during the Great Depression

On the Great Plains, environmental catastrophe deepened America’s longstanding agricultural crisis and magnified the tragedy of the Depression. Beginning in 1932, severe droughts hit from Texas to the Dakotas and lasted until at least 1936. The droughts compounded years of agricultural mismanagement. To grow their crops, Plains farmers had plowed up natural ground cover that had taken ages to form over the surface of the dry Plains states. Relatively wet decades had hidden the danger, but, during the early 1930s, without rain, the exposed fertile topsoil turned to dust, and without sod or windbreaks such as trees, rolling winds churned the dust into massive storms that blotted out the sky, choked settlers and livestock, and rained dirt not only across the region but as far east as Washington, D.C., New England, and ships on the Atlantic Ocean. The “Dust Bowl,” as the region became known, exposed, all too late, the need for conservation. The region’s farmers, already hit by years of foreclosures and declining commodity prices, were decimated. ((Robert S. McElvaine, ed., Encyclopedia of the Great Depression (New York: Macmillan, 2004), 320.)) For many in Texas, Oklahoma, Kansas, and Arkansas who were “baked out, blown out, and broke,” their only hope was to travel west to California, whose rains still brought bountiful harvests and–potentially–jobs for farmworkers. It was an exodus. Oklahoma lost 440,000 people, or a full 18.4 percent of its 1930 population, to out-migration. ((Donald Worster, Dust Bowl: The Southern Plains in the 1930s (New York: Oxford University Press, 1979), 48.))

This iconic photograph made real the suffering of millions during the Great Depression. Dorothea Lange, “Destitute pea pickers in California. Mother of seven children. Age thirty-two. Nipomo, California” or “Migrant Mother,” February/March 1936. Library of Congress, http://www.loc.gov/pictures/item/fsa1998021539/PP/.


Dorothea Lange’s Migrant Mother became one of the most enduring images of the “Dust Bowl” and the ensuing westward exodus. Lange, a photographer for the Farm Security Administration, captured the image at a migrant farmworker camp in Nipomo, California, in 1936. In the photograph a young mother stares out with a worried, weary expression. She was a migrant, having left her home in Oklahoma to follow the crops in the Golden State. She took part in what many in the mid-1930s were beginning to recognize as a vast migration of families out of the southwestern plains states. In the image she cradles an infant and supports two older children, who cling to her. Lange’s photo encapsulated the nation’s struggle. The subject of the photograph seemed used to hard work but down on her luck, and uncertain about what the future might hold.

The “Okies,” as such westward migrants were disparagingly called by their new neighbors, were the most visible group among the many who were on the move during the Depression, lured by news and rumors of jobs in far-flung regions of the country. By 1932 sociologists were estimating that millions of men were on the roads and rails travelling the country. Economists sought to quantify the movement of families from the Plains. Popular magazines and newspapers were filled with stories of homeless boys and the veterans-turned-migrants of the Bonus Army commandeering boxcars. Popular culture, such as William Wellman’s 1933 film, Wild Boys of the Road, and, most famously, John Steinbeck’s The Grapes of Wrath, published in 1939 and turned into a hit movie a year later, captured the Depression’s dislocated populations.

These years witnessed the first significant reversal in the flow of people between rural and urban areas. Thousands of city-dwellers fled the jobless cities and moved to the country looking for work. As relief efforts floundered, many state and local officials threw up barriers to migration, making it difficult for newcomers to receive relief or find work. Some state legislatures made it a crime to bring poor migrants into the state and allowed local officials to deport migrants to neighboring states. In the winter of 1935-1936, California, Florida, and Colorado established “border blockades” to block poor migrants from their states and reduce competition with local residents for jobs. A billboard outside Tulsa, Oklahoma, informed potential migrants that there were “NO JOBS in California” and warned them to “KEEP Out.” ((James N. Gregory, American Exodus: The Dust Bowl Migration and Okie Culture in California (New York: Oxford University Press, 1989), 22.))

Sympathy for migrants, however, accelerated late in the Depression with the publication of John Steinbeck’s The Grapes of Wrath. The Joad family’s struggles drew attention to the plight of Depression-era migrants and, just a month after the nationwide release of the film version, Congress created the Select Committee to Investigate the Interstate Migration of Destitute Citizens. Starting in 1940, the Committee held widely publicized hearings. But it was too late. Within a year of its founding, defense industries were already gearing up in the wake of the outbreak of World War II, and the “problem” of migration suddenly became a lack of migrants needed to fill war industries. Such relief was nowhere to be found in the 1930s.

Americans meanwhile feared foreign workers willing to work for even lower wages. The Saturday Evening Post warned that foreign immigrants, who were “compelled to accept employment on any terms and conditions offered,” would exacerbate the economic crisis. ((Cybelle Fox, Three Worlds of Relief (Princeton: Princeton University Press, 2012), 126.)) On September 8, 1930, the Hoover administration issued a press release on the administration of immigration laws “under existing conditions of unemployment.” Hoover instructed consular officers to scrutinize carefully the visa applications of those “likely to become public charges” and suggested that this might include denying visas to most, if not all, alien laborers and artisans. The crisis itself had served to stifle foreign immigration, but such restrictive and exclusionary actions in the first years of the Depression intensified its effects. The number of European visas issued fell roughly 60 percent while deportations dramatically increased. Between 1930 and 1932, 54,000 people were deported. An additional 44,000 deportable aliens left “voluntarily.” ((Fox, Three Worlds of Relief, 127.))

Exclusionary measures hit Mexican immigrants particularly hard. The State Department made a concerted effort to reduce immigration from Mexico as early as 1929 and Hoover’s executive actions arrived the following year. Officials in the Southwest led a coordinated effort to push out Mexican immigrants. In Los Angeles, the Citizens Committee on Coordination of Unemployment Relief began working closely with federal officials in early 1931 to conduct deportation raids while the Los Angeles County Department of Charities began a simultaneous drive to repatriate Mexicans and Mexican Americans on relief, negotiating a charity rate with the railroads to return Mexicans “voluntarily” to their mother country. According to the federal census, from 1930 to 1940 the Mexican-born population living in Arizona, California, New Mexico, and Texas fell from 616,998 to 377,433. Franklin Roosevelt did not indulge anti-immigrant sentiment as willingly as Hoover had. Under the New Deal, the Immigration and Naturalization Service halted some of the Hoover Administration’s most divisive practices, but, with jobs suddenly scarce, hostile attitudes intensified, and official policies less than welcoming, immigration plummeted and deportations rose. Over the course of the Depression, more people left the United States than entered it. ((Aristide Zolberg, A Nation by Design: Immigration Policy in the Fashioning of America (New York: Russell Sage, 2006), 269.))

 

VII. Franklin Delano Roosevelt and the “First” New Deal

Posters like this showing the extent of the Federal Art Project were used to prove the worth of the WPA’s various endeavors and, by extension, the value of the New Deal to the American people. “Employment and Activities poster for the WPA's Federal Art Project,” January 1, 1936. Wikimedia, http://commons.wikimedia.org/wiki/File:Archives_of_American_Art_-_Employment_and_Activities_poster_for_the_WPA%27s_Federal_Art_Project_-_11772.jpg.


The early years of the Depression were catastrophic. The crisis, far from relenting, deepened each year. Unemployment peaked at 25 percent in 1933. With no end in sight, and with private firms crippled and charities overwhelmed by the crisis, Americans looked to their government as the last barrier against starvation, hopelessness, and perpetual poverty.

Few presidential elections in modern American history have been more consequential than that of 1932. The United States was struggling through the third year of the Depression and exasperated voters overthrew Hoover in a landslide to elect the Democratic governor of New York, Franklin Delano Roosevelt. Roosevelt came from a privileged background in New York’s Hudson River Valley (his distant cousin, Theodore Roosevelt, became president while Franklin was at Harvard) and embarked upon a slow but steady ascent through state and national politics. In 1913, he was appointed Assistant Secretary of the Navy, a position he held during the defense emergency of World War I. In the course of his rise, in the summer of 1921, Roosevelt suffered a sudden bout of lower-body pain and paralysis. He was diagnosed with polio. The disease left him a paraplegic, but, encouraged and assisted by his wife, Eleanor, Roosevelt sought therapeutic treatment and maintained sufficient political connections to reenter politics. In 1928, Roosevelt won election as governor of New York. Governing as the Depression set in, he drew from the legacy of progressivism to address the economic crisis. During his gubernatorial tenure, Roosevelt introduced the first comprehensive unemployment relief program and helped to pioneer efforts to expand public utilities. He also relied on like-minded advisors. For example, Frances Perkins, then commissioner of the state’s Labor Department, successfully advocated pioneering legislation that enhanced workplace safety and reduced the use of child labor in factories. Perkins later accompanied Roosevelt to Washington and served as the nation’s first female Secretary of Labor. ((Biographies of Roosevelt include Kenneth S. Davis, FDR: The Beckoning of Destiny, 1882-1928 (New York: Random House, 1972); and Jean Edward Smith, FDR (New York: Random House, 2007).))

On July 1, 1932, Roosevelt, the newly-designated presidential nominee of the Democratic Party, delivered the first and one of the most famous on-site acceptance speeches in American presidential history. Building to a conclusion, he promised, “I pledge you, I pledge myself, to a new deal for the American people.” Newspaper editors seized upon the phrase “new deal,” and it entered the American political lexicon as shorthand for Roosevelt’s program to address the Great Depression. ((Outstanding general treatments of the New Deal include: Arthur M. Schlesinger, Jr., The Age of Roosevelt, 3 vols. (Boston: Houghton Mifflin, 1956–1960); William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal (New York: Harper and Row, 1963); Anthony J. Badger, The New Deal: The Depression Years, 1933–1940 (New York: Hill and Wang), 1989; David M. Kennedy, Freedom from Fear: The American People in Depression and War, 1929–1945 (New York: Oxford University Press, 1999).  On Roosevelt, see especially James MacGregor Burns, Roosevelt: The Lion and the Fox (New York: Harcourt, Brace, 1956); Frank B. Friedel, Franklin D. Roosevelt, 4 vols. (Boston: Little, Brown, 1952–1973); Patrick J. Maney, The Roosevelt Presence: A Biography of Franklin D. Roosevelt (New York: Twayne, 1992); Alan Brinkley, Franklin Delano Roosevelt (New York: Oxford University Press, 2010).)) There were, however, few hints in his political campaign that suggested the size and scope of the “New Deal.” Regardless, Roosevelt crushed Hoover. He won more counties than any previous candidate in American history. He spent the months between his election and inauguration traveling, planning, and assembling a team of advisors, the famous “Brain Trust” of academics and experts, to help him formulate a plan of attack. On March 4, 1933, in his first Inaugural Address, Roosevelt famously declared, “This great Nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.” ((Franklin D. Roosevelt: “Inaugural Address,” March 4, 1933. Online by Gerhard Peters and John T. Woolley, The American Presidency Project. http://www.presidency.ucsb.edu/ws/?pid=14473.))

Roosevelt’s reassuring words would have rung hollow if he had not taken swift action against the economic crisis. In his first days in office, Roosevelt and his advisers prepared, submitted, and secured Congressional enactment of numerous laws designed to arrest the worst of the Great Depression. His administration threw the federal government headlong into the fight against the Depression.

Roosevelt immediately looked to stabilize the collapsing banking system. He declared a national “bank holiday” closing American banks and set to work pushing the Emergency Banking Act swiftly through Congress. On March 12th, the night before select banks reopened under stricter federal guidelines, Roosevelt appeared on the radio in the first of his “Fireside Chats.” The addresses, which the president continued delivering through four terms, were informal, even personal. Roosevelt used his airtime to explain New Deal legislation, to encourage confidence in government action, and to mobilize the American people’s support. In the first “chat,” Roosevelt described the new banking safeguards and asked the public to place their trust and their savings in banks. Americans responded and across the country, deposits outpaced withdrawals. The act was a major success. In June, Congress passed the Glass-Steagall Banking Act, which instituted federal deposit insurance and barred the mixing of commercial and investment banking. ((Michael E. Parrish, Securities Regulation and the New Deal (New Haven: Yale University Press, 1970).))

Stabilizing the banks was only a first step. In the remainder of his “First Hundred Days,” Roosevelt and his congressional allies focused especially on relief for suffering Americans. ((See especially Anthony J. Badger, FDR: The First Hundred Days (New York: Hill and Wang, 2008).)) Congress debated, amended, and passed what Roosevelt proposed. As one historian noted, the president “directed the entire operation like a seasoned field general.” ((William E. Leuchtenburg, Franklin D. Roosevelt and the New Deal (New York: Harper and Row, 1963).)) And despite some questions over the constitutionality of many of his actions, Americans and their congressional representatives conceded that the crisis demanded swift and immediate action. The Civilian Conservation Corps (CCC) employed young men on conservation and reforestation projects; the Federal Emergency Relief Administration (FERA) provided direct cash assistance to state relief agencies struggling to care for the unemployed; ((Neil Maher, Nature’s New Deal: The Civilian Conservation Corps and the Roots of the American Environmental Movement (New York: Oxford University Press, 2008).))  the Tennessee Valley Authority (TVA) built a series of hydroelectric dams along the Tennessee River as part of a comprehensive program to economically develop a chronically depressed region; ((Thomas K. McCraw, TVA and the Power Fight, 1933-1939 (Philadelphia: Lippincott, 1971).)) several agencies helped home and farm owners refinance their mortgages. And Roosevelt wasn’t done.

The heart of Roosevelt’s early recovery program consisted of two massive efforts to stabilize and coordinate the American economy: the Agricultural Adjustment Administration (AAA) and the National Recovery Administration (NRA). The AAA, created in May 1933, aimed to raise the prices of agricultural commodities (and hence farmers’ income) by offering cash incentives to voluntarily limit farm production (decreasing supply, thereby raising prices). ((Ellis W. Hawley, The New Deal and the Problem of Monopoly: A Study in Economic Ambivalence (Princeton: Princeton University Press, 1969); Gavin Wright, Old South, New South: Revolutions in the Southern Economy since the Civil War (Baton Rouge: LSU Press, 1986), 217.)) The National Industrial Recovery Act, which created the National Recovery Administration (NRA) in June 1933, suspended antitrust laws to allow businesses to establish “codes” that would coordinate prices, regulate production levels, and establish conditions of employment to curtail “cutthroat competition.” In exchange for these exemptions, businesses agreed to provide reasonable wages and hours, end child labor, and allow workers the right to unionize. Participating businesses earned the right to display a placard with the NRA’s “Blue Eagle,” showing their cooperation in the effort to combat the Great Depression. ((Theodore Saloutos, The American Farmer and the New Deal (Ames: Iowa State University Press, 1982).))

The programs of the First Hundred Days stabilized the American economy and ushered in a robust though imperfect recovery. GDP climbed once more, but even as output increased, unemployment remained stubbornly high. Though the unemployment rate dipped from its high in 1933, when Roosevelt was inaugurated, vast numbers remained out of work. If the economy could not put people back to work, the New Deal would try. The Civil Works Administration (CWA) and, later, the Works Progress Administration (WPA) put unemployed men and women to work on projects designed and proposed by local governments. The Public Works Administration (PWA) provided grants-in-aid to local governments for large infrastructure projects, such as bridges, tunnels, schoolhouses, libraries, and America’s first federal public housing projects. Together, they provided not only tangible projects of immense public good, but employment for millions. The New Deal was reshaping much of the nation. ((Bonnie Fox Schwartz, The Civil Works Administration, 1933–1934: The Business of Emergency Employment in the New Deal (Princeton: Princeton University Press, 1984); Edwin Amenta, Bold Relief: Institutional Politics and the Origins of Modern American Social Policy (Princeton: Princeton University Press, 1998); Jason Scott Smith, Building New Deal Liberalism: The Political Economy of Public Works, 1933–1956 (New York: Cambridge University Press, 2006); Mason B. Williams, City of Ambition: FDR, La Guardia, and the Making of Modern New York (New York: Norton, 2013).))

 

VIII. The New Deal in the South


The accusation of rape brought against the so-called Scottsboro Boys, pictured with their attorney in 1932, generated controversy across the country. “The Scottsboro Boys, with attorney Samuel Leibowitz, under guard by the state militia, 1932.” Wikipedia.

The impact of initial New Deal legislation was readily apparent in the South, a region of perpetual poverty especially plagued by the Depression. In 1929 the average per capita income in the American Southeast was $365, the lowest in the nation. Southern farmers averaged $183 per year at a time when farmers on the West Coast made more than four times that. ((Howard Odum, Southern Regions of the United States (1936), quoted in David L. Carlton and Peter Coclanis, eds., Confronting Southern Poverty in the Great Depression: The Report on Economic Conditions of the South with Related Documents (Boston: Bedford/St. Martin’s, 1996), 118-19.)) Moreover, they were trapped into the production of cotton and corn, crops that depleted the soil and returned ever-diminishing profits. Despite the ceaseless efforts of civic boosters, what little industry the South had remained low-wage, low-skill, and primarily extractive. Southern workers earned significantly less than their counterparts elsewhere: southern textile workers made 75 percent of the non-southern wage, iron and steel workers 60 percent, and lumber workers a paltry 45 percent. At the time of the crash, southerners were already underpaid, underfed, and undereducated. ((Gavin Wright, Old South, New South: Revolutions in the Southern Economy since the Civil War (Baton Rouge: LSU Press, 1986), 217.))

Major New Deal programs were designed with the South in mind. FDR hoped that by drastically decreasing the amount of land devoted to cotton, the AAA would arrest the crop’s long decline in price. Farmers plowed up existing crops and left fields fallow, and the market price did rise. But in an agricultural world of landowners and landless farmworkers (such as tenants and sharecroppers), the benefits of the AAA bypassed the southerners who needed them most. The government relied on landowners and local organizations to distribute money fairly to those most affected by production limits, but many owners simply kicked tenants and croppers off their land, kept the subsidy checks for keeping those acres fallow, and reinvested the profits in mechanical farming equipment that further suppressed the demand for labor. Instead of making farming profitable again, the AAA pushed landless southern farmworkers off the land. ((Ibid., 227-228.))

But Roosevelt’s assault on southern poverty took many forms. Southern industrial practices attracted much attention. The NRA encouraged higher wages and better conditions. It began to suppress the rampant use of child labor in southern mills and, for the first time, provided federal protection for unionized workers all across the country. Those gains were eventually solidified in the 1938 Fair Labor Standards Act, which set a national minimum wage of $0.25 per hour (eventually rising to $0.40 per hour). The minimum wage disproportionately affected low-paid southern workers and narrowed the gap between southern and northern wages. ((Ibid., 216-220.))

The president’s support for unionization further impacted the South. Southern industrialists had proven themselves ardent foes of unionization, particularly in the infamous southern textile mills. In 1934, when workers at textile mills across the southern Piedmont struck over low wages and long hours, owners turned to local and state authorities to quash workers’ groups, even as they recruited thousands of strikebreakers from the many displaced farmers swelling industrial centers in search of work. But in 1935 the National Labor Relations Act, also known as the Wagner Act, guaranteed the rights of most workers to unionize and bargain collectively. And so unionized workers, backed by the support of the federal government and determined to enforce the reforms of the New Deal, pushed for higher wages, shorter hours, and better conditions. With growing success, union members came to see Roosevelt as a protector of workers’ rights. Or, as one union leader put it, an “agent of God.” ((William Leuchtenburg, The White House Looks South: Franklin D. Roosevelt, Harry S. Truman, Lyndon B. Johnson (Baton Rouge: LSU Press, 2005), 74.))

Perhaps the most successful New Deal program in the South was the Tennessee Valley Authority (TVA), an ambitious program that used hydroelectric power, agricultural and industrial reform, flood control, economic development, education, and healthcare to radically remake the impoverished watershed region of the Tennessee River. Though the area of focus was limited, Roosevelt’s TVA sought to “make a different type of citizen” out of the area’s penniless residents. ((“Press Conference #160,” 23 November 1934, 214, in Roosevelt, Complete Presidential Press Conferences of Franklin D. Roosevelt, Volumes 3-4, 1934 (New York: Da Capo Press, 1972).)) The TVA built a series of hydroelectric dams to control flooding and distribute electricity to otherwise non-electrified areas at government-subsidized rates. Agents of the TVA met with residents and offered training and general education classes to improve agricultural practices and exploit new job opportunities. The TVA encapsulated Roosevelt’s vision for uplifting the South and integrating it into the larger national economy. ((Thomas K. McCraw, TVA and the Power Fight, 1933-1939 (Philadelphia: Lippincott, 1971).))

Roosevelt initially courted conservative southern Democrats to ensure the legislative success of the New Deal, all but guaranteeing that the racial and economic inequalities of the region remained intact, but by the end of his second term he had won the support of enough non-southern voters that he felt confident confronting some of the region’s most glaring inequalities. Nowhere was this more apparent than in his endorsement of a report, formulated by a group of progressive southern New Dealers, entitled the “Report on Economic Conditions of the South.” The pamphlet denounced the hardships wrought by the southern economy and blasted reactionary southern anti-New Dealers; in his introductory letter to the report, Roosevelt called the region “the Nation’s No. 1 economic problem.” He suggested that the New Deal could save the South and thereby spur a nationwide recovery. The report was among the first broadsides in Roosevelt’s coming reelection campaign that addressed the inequalities that continued to mark southern and national life. ((Carlton and Coclanis, eds., Confronting Southern Poverty in the Great Depression, 42.))

 

IX. The New Deal in Appalachia

The New Deal also addressed another poverty-stricken region, Appalachia, the mountain-and-valley communities that roughly follow the Appalachian Mountain Range from southern New York to the foothills of northern Georgia, Alabama, and Mississippi. Appalachia’s abundant natural resources, including timber and coal, were in high demand during the country’s post-Civil War industrial expansion, but industry simply extracted these resources, sending the profits to far-off interests and depressing the coal-producing areas even earlier than the rest of the country. By the mid-1930s, with the Depression suppressing demand, many residents were stranded in small, isolated communities whose few employers stood on the verge of collapse. Relief workers from the Federal Emergency Relief Administration (FERA) reported serious shortages of medical care, adequate shelter, clothing, and food. Rampant illnesses, including typhus, tuberculosis, pneumonia, and venereal disease, as well as childhood malnutrition, further crippled Appalachia.

Several New Deal programs targeted the region. Under the auspices of the NIRA, Roosevelt established the Division of Subsistence Homesteads (DSH) within the Department of the Interior to give impoverished families an opportunity to relocate “back to the land”: the DSH established 34 homestead communities nationwide, including in the Appalachian regions of Alabama, Pennsylvania, Tennessee, and West Virginia. The CCC contributed to projects throughout Appalachia, including the Blue Ridge Parkway in North Carolina and Virginia, reforestation of the Chattahoochee National Forest in Georgia, and state parks such as Pine Mountain Resort State Park in Kentucky. The TVA’s efforts aided communities in Tennessee and North Carolina, and the Rural Electrification Administration (REA) brought electricity to 288,000 rural households.

 

X. Voices of Protest


Huey Long was a dynamic, indomitable force (with a wild speech-giving style, seen in the photograph) who campaigned tirelessly for the common man, demanding that Americans “Share Our Wealth.” Photograph of Huey P. Long, c. 1933-35. Wikimedia.

Despite the unprecedented actions taken in his first year in office, Roosevelt’s initial relief programs could often be quite conservative. He had usually been careful to work within the bounds of presidential authority and congressional cooperation. And, unlike several European nations that had turned toward state-run economies, and even fascism and socialism, the United States under Roosevelt’s New Deal demonstrated a clear reluctance to radically tinker with the nation’s foundational economic and social structures. Many high-profile critics attacked Roosevelt for not going far enough, and, beginning in 1934, Roosevelt and his advisors were forced to respond.

Senator Huey Long, a flamboyant Democrat from Louisiana, was perhaps the most important “voice of protest.” Long’s populist rhetoric appealed to those who saw deeply rooted but easily addressed injustice in the nation’s economic system. Long proposed a “Share Our Wealth” program in which the federal government would confiscate the assets of the extremely wealthy and redistribute them to the less well-off through guaranteed minimum incomes. “How many men ever went to a barbecue and would let one man take off the table what’s intended for nine-tenths of the people to eat?” he asked. Over 27,000 “Share Our Wealth” clubs sprang up across the nation as Long traveled the country explaining his program to crowds of impoverished and unemployed Americans. Long envisioned the movement as a stepping stone to the presidency, but his crusade ended in late 1935 when he was assassinated in the Louisiana state capitol. Even in death, however, Long pushed Roosevelt to attack the Depression and American inequality more stridently.

But Huey Long was not alone in his critique of Roosevelt. Francis Townsend, a former doctor and public health official from California, promoted a plan for old-age pensions which, he argued, would provide economic security for the elderly (who disproportionately suffered poverty) and encourage recovery by allowing older workers to retire from the workforce. Father Charles Coughlin, meanwhile, a priest and radio personality from the suburbs of Detroit, Michigan, gained a following by making vitriolic, anti-Semitic attacks on Roosevelt for cooperating with banks and financiers, and by proposing instead a new system of “social justice” through a more state-driven economy. Like Long, both Townsend and Coughlin built substantial public followings.

If many Americans urged Roosevelt to go further in addressing the economic crisis, the president faced even greater opposition from conservative politicians and business leaders. By late 1934, business-friendly Republicans increasingly complained of Roosevelt’s willingness to regulate industry and to use federal spending for public works and employment programs. In the South, Democrats who had originally supported the president grew increasingly hostile toward programs that challenged the region’s political, economic, and social status quo. Yet the greatest opposition came from the Supreme Court, a conservative institution filled with appointees from the long years of Republican presidents.

By early 1935 the Court was reviewing programs of the New Deal. On May 27, a day Roosevelt’s supporters called “Black Monday,” the justices struck down one of the president’s signature reforms: in a case revolving around poultry processing, the Court unanimously declared the NRA unconstitutional. In early 1936, the AAA fell. ((William E. Leuchtenburg, The Supreme Court Reborn: The Constitutional Revolution in the Age of Roosevelt (New York: Oxford University Press, 1995); Theda Skocpol and Kenneth Finegold, State and Party in America’s New Deal (Madison: University of Wisconsin Press, 1995); Colin Gordon, New Deals: Business, Labor, and Politics in America, 1920–1935 (New York: Cambridge University Press, 1994).))

 

XI. The “Second” New Deal (1935-1936)

Facing reelection and rising opposition from both the left and the right, Roosevelt decided to act. The New Deal adopted a more radical, aggressive approach to poverty, the “Second” New Deal. In 1935, hoping to reconstitute some of the protections afforded workers in the now-defunct NRA, Roosevelt worked with Congress to pass the National Labor Relations Act (known as the Wagner Act for its chief sponsor, New York Senator Robert Wagner), offering federal legal protection, for the first time, for workers to organize unions. Three years later, Congress passed the Fair Labor Standards Act, creating the modern minimum wage. The Second New Deal also oversaw the restoration of a highly progressive federal income tax, mandated new reporting requirements for publicly traded companies, refinanced long-term home mortgages for struggling homeowners, and attempted rural reconstruction projects to bring farm incomes in line with urban ones. (( Mark H. Leff, The Limits of Symbolic Reform: The New Deal and Taxation, 1933–1939 (New York: Cambridge University Press, 1984); Kenneth T. Jackson, Crabgrass Frontier: The Suburbanization of the United States (New York: Oxford University Press, 1985); Sarah T. Phillips, This Land, This Nation: Conservation, Rural America, and the New Deal (New York: Cambridge University Press, 2007).))

The labor protections extended by Roosevelt’s New Deal were revolutionary. In northern industrial cities, workers responded to worsening conditions by banding together and demanding support for workers’ rights. In 1935, the head of the United Mine Workers, John L. Lewis, took the lead in forming a new national workers’ organization, the Congress of Industrial Organizations (CIO), breaking with the more conservative, craft-oriented AFL. The CIO won a major victory in 1937 when affiliated members in the United Auto Workers struck for recognition and better pay and hours at a General Motors plant in Flint, Michigan. In a dramatic “sit-down” strike, the workers remained in the building until management agreed to negotiate. GM recognized the UAW, and the “sit-down” strike became a new weapon in the fight for workers’ rights. Across the country, unions and workers took advantage of the New Deal’s protections to organize and win major concessions from employers.

Unionization met with fierce opposition from owners and managers, particularly in the “Manufacturing Belt” of the Mid-West. Sheldon Dick, photographer, “Strikers guarding window entrance to Fisher body plant number three. Flint, Michigan,” January/February 1937. Library of Congress, http://www.loc.gov/pictures/item/fsa2000021503/PP/.


The signature piece of Roosevelt’s Second New Deal came the same year, in 1935. The Social Security Act provided for old-age pensions, unemployment insurance, and economic aid, based on means, to assist both the elderly and dependent children. The president was careful to mitigate some of the criticism of what was, at the time, in the American context, a revolutionary concept. He specifically insisted that Social Security be financed by payroll taxes rather than by the federal government’s general funds; “No dole,” Roosevelt said repeatedly, “mustn’t have a dole.” ((Kennedy, 267.)) He thereby helped separate Social Security from the stigma of being an undeserved “welfare” entitlement. While such a strategy saved the program from such suspicions, Social Security became the centerpiece of the modern American social welfare state. It was the culmination of a long progressive push for government-sponsored social welfare, an answer to the calls of Roosevelt’s opponents on the Left for reform, a response to the intractable poverty among America’s neediest groups, and a recognition that the government would now assume some responsibility for the economic well-being of its citizens. But for all of its groundbreaking provisions, the Act, and the larger New Deal as well, excluded large swaths of the American population. ((W. Andrew Achenbaum, Old Age in the New Land (Baltimore: Johns Hopkins University Press, 1978); Edwin E. Witte, The Development of the Social Security Act (Madison: University of Wisconsin Press, 1963).))

 

XII. Equal Rights and the New Deal

The Great Depression was particularly tough for nonwhite Americans. As an African American pensioner told interviewer Studs Terkel, “The Negro was born in depression. It didn’t mean too much to him. The Great American Depression … only became official when it hit the white man.” Black workers were generally the last hired when businesses expanded production and the first fired when businesses experienced downturns. In 1932, with the national unemployment average hovering around 25%, black unemployment reached as high as 50%, while even those blacks who kept their jobs saw their already low wages cut dramatically. ((Terkel, Hard Times, 82.))

Blacks faced discrimination everywhere, but suffered especially severe legal inequality in the Jim Crow South. In 1931, for instance, a group of nine young men riding the rails between Chattanooga and Memphis, Tennessee, were pulled from the train near Scottsboro, Alabama, and charged with assaulting two white women. Despite clear evidence that the assault had not occurred, and despite one of the women later recanting, the young men endured a series of sham trials in which all but one were sentenced to death. Only the communist-affiliated International Labor Defense (ILD) came to the aid of the “Scottsboro Boys,” who soon became a national symbol of continuing racial prejudice in America and a rallying point for civil rights-minded Americans. On appeal, the ILD successfully challenged the young men’s convictions, and the death sentences were either commuted or reversed, although the last of the accused did not receive parole until 1946. ((Dan T. Carter, Scottsboro: A Tragedy of the American South (Baton Rouge: Louisiana State University Press, 1969).))

Despite a concerted effort to appoint black advisors to some New Deal programs, Franklin Roosevelt did little to directly address the difficulties black communities faced. To do so openly would provoke southern Democrats and put his New Deal coalition at risk. Roosevelt not only rejected such proposals as abolishing the poll tax and declaring lynching a federal crime, he refused to specifically target African American needs in any of his larger relief and reform packages. As he explained to the national secretary of the NAACP, “I just can’t take that risk.” ((Kennedy, 201.))

In fact, many New Deal programs made hard times more difficult. When the codes of the NRA set new pay scales, they usually took into account regional differentiation and historical data. In the South, where African Americans had long suffered unequal pay, the new codes simply perpetuated that inequality. The codes also exempted those involved in farm work and domestic labor, the occupations of a majority of southern black men and women. The AAA was equally problematic, as owners displaced black tenants and sharecroppers, many of whom were forced to return to their farms as low-paid day laborers or to migrate to cities looking for wage work. ((Ira Katznelson, When Affirmative Action Was White: An Untold History of Racial Inequality in Twentieth-Century America (New York: Norton, 2005).))

Perhaps the most notorious failure of the New Deal to aid African Americans came with the passage of the Social Security Act. Southern politicians chafed at the prospect of African Americans benefiting from federally sponsored social welfare, afraid that economic security would allow black southerners to escape the cycle of poverty that kept them tied to the land as cheap, exploitable farm laborers. The Jackson (Mississippi) Daily News callously warned that “The average Mississippian can’t imagine himself chipping in to pay pensions for able-bodied Negroes to sit around in idleness … while cotton and corn crops are crying for workers.” Roosevelt agreed to remove domestic workers and farm laborers from the provisions of the bill, excluding many African Americans, already laboring under the strictures of legal racial discrimination, from the benefits of an expanding economic safety net. ((George Brown Tindall, The Emergence of the New South, 1913-1945 (Baton Rouge: Louisiana State University Press, 1967), 491.))

Women, too, failed to receive the full benefits of New Deal programs. On one hand, Roosevelt included women in key positions within his administration, including the first female Cabinet secretary, Frances Perkins, and a prominently placed African American advisor in the National Youth Administration, Mary McLeod Bethune. First Lady Eleanor Roosevelt was a key advisor to the president and became a major voice for economic and racial justice. But many New Deal programs were built upon the assumption that men would serve as “breadwinners” and women as mothers, homemakers, and consumers. New Deal programs aimed to help both, but usually by reinforcing such gendered assumptions, making it difficult for women to attain economic autonomy. New Deal social welfare programs tended to funnel women into means-tested, state-administered relief programs while reserving “entitlement” benefits for male workers, creating a kind of two-tiered social welfare state. And so, despite great advances, the New Deal failed to challenge core inequalities that continued to mark life in the United States. ((Alice Kessler-Harris, In Pursuit of Equity: Women, Men, and the Quest for Economic Citizenship in 20th Century America (New York: Oxford University Press, 2001); Linda Gordon, Pitied But Not Entitled: Single Mothers and the History of Welfare, 1890–1935 (New York: Free Press, 1994).))

 

XIII. The End of the New Deal (1937-1939)

By 1936 Roosevelt and his New Deal had won record popularity. In November Roosevelt annihilated his Republican challenger, Governor Alf Landon of Kansas, who lost in every state save Maine and Vermont. The Great Depression had certainly not ended, but it appeared to many to be beating a slow yet steady retreat, and Roosevelt, now safely re-elected, appeared ready to take advantage of both his popularity and the improving economic climate to press for even more dramatic changes. But conservative barriers continued to limit the power of his popular support. The Supreme Court, for instance, continued to gut many of his programs.

In 1937, concerned that the Court might overthrow Social Security in an upcoming case, Roosevelt called for legislation allowing him to expand the Court by appointing a new, younger justice for every sitting member over the age of 70. Roosevelt argued that the measure would speed up the Court’s ability to handle a growing backlog of cases; however, his “court-packing scheme,” as opponents termed it, was clearly designed to allow the president to appoint up to six friendly, pro-New Deal justices to drown out the influence of old-time conservatives on the Court. Roosevelt’s “scheme” riled opposition and did not become law, but the chastened Court upheld Social Security and other pieces of New Deal legislation thereafter. Moreover, Roosevelt was slowly able to appoint more amenable justices as conservatives died or retired. Still, the “court-packing scheme” damaged the Roosevelt administration, and opposition to the New Deal began to emerge and coalesce. ((Alan Brinkley, The End of Reform: New Deal Liberalism in Recession and War (New York: Knopf, 1995).))

Compounding his problems, Roosevelt and his advisors made a costly economic misstep. Believing the United States had turned a corner, Roosevelt cut spending in 1937. The American economy plunged nearly to the depths of 1932–1933. Roosevelt reversed course and, adopting the approach popularized by the English economist John Maynard Keynes, hoped that countercyclical, “compensatory” spending would pull the country out of the recession, even at the expense of a growing budget deficit. It was perhaps too late. The “Roosevelt Recession” of 1937 became fodder for critics. Combined with the “court-packing scheme,” the recession allowed for significant gains by a “conservative coalition” of southern Democrats and Midwestern Republicans. By 1939, Roosevelt struggled to build congressional support for new reforms, let alone maintain existing agencies. Moreover, the growing threat of war in Europe stole the public’s attention and increasingly dominated Roosevelt’s interests. The New Deal slowly receded into the background, outshone by war. ((Ibid.))

 

XIV. The Legacy of the New Deal

By the end of the 1930s, Roosevelt and his Democratic congresses had presided over a transformation of the American government and a realignment in American party politics. Before World War I, the American national state, though powerful, had been a “government out of sight.” After the New Deal, Americans came to see the federal government as a potential ally in their daily struggles, whether finding work, securing a decent wage, getting a fair price for agricultural products, or organizing a union. Voter turnout in presidential elections jumped in 1932 and again in 1936, with most of these newly-mobilized voters forming a durable piece of the Democratic Party that would remain loyal well into the 1960s. Even as affluence returned with the American intervention in World War II, memories of the Depression continued to shape the outlook of two generations of Americans. ((Lizabeth Cohen, Making a New Deal: Industrial Workers in Chicago, 1919–1939 (New York: Cambridge University Press, 1990); Kristi Andersen, The Creation of a Democratic Majority, 1928–1936 (Chicago: University of Chicago Press, 1979); Caroline Bird, The Invisible Scar (New York: D. McKay Co., 1966).))  Survivors of the Great Depression, one man would recall in the late 1960s, “are still riding with the ghost—the ghost of those days when things came hard.” ((Quoted in Studs Terkel, Hard Times: An Oral History of the Great Depression (New York: Pantheon, 1970), 34.))

Historians debate when the New Deal “ended.” Some identify the Fair Labor Standards Act of 1938 as the last major New Deal measure. Others see wartime measures such as price and rent control and the G.I. Bill (which afforded New Deal-style social benefits to veterans) as species of New Deal legislation. Still others conceive of a “New Deal order,” a constellation of “ideas, public policies, and political alliances,” which, though changing, guided American politics from Roosevelt’s Hundred Days forward to Lyndon Johnson’s Great Society—and perhaps even beyond. Indeed, the New Deal’s legacy still remains, and its battle lines still shape American politics.

 

Contributors

This chapter was edited by Matthew Downs, with content contributed by Dana Cochran, Matthew Downs, Benjamin Helwege, Elisa Minoff, Caitlin Verboon, and Mason Williams.

 

Recommended Reading

  1. Brinkley, Alan. The End of Reform: New Deal Liberalism in Recession and War. New York: Knopf, 1995.
  2. Brinkley, Alan. Voices of Protest: Huey Long, Father Coughlin, and the Great Depression. New York: Knopf, 1982.
  3. Cohen, Lizabeth. Making a New Deal: Industrial Workers in Chicago, 1919–1939. New York: Cambridge University Press, 1990.
  4. Cowie, Jefferson, and Nick Salvatore. “The Long Exception: Rethinking the Place of the New Deal in American History.” International Labor and Working-Class History 74 (Fall 2008): 1-32.
  5. Dickstein, Morris. Dancing in the Dark: A Cultural History of the Great Depression. New York: Norton, 2009.
  6. Fraser, Steve, and Gary Gerstle, eds. The Rise and Fall of the New Deal Order, 1930–1980. Princeton: Princeton University Press, 1989.
  7. Gordon, Linda. Dorothea Lange: A Life Beyond Limits. New York: Norton, 2009.
  8. Kelley, Robin D. G. Hammer and Hoe: Alabama Communists during the Great Depression. Chapel Hill: University of North Carolina Press, 1990.
  9. Kennedy, David. Freedom From Fear: The American People in Depression and War, 1929-1945. New York: Oxford University Press, 1999.
  10. Leach, William. Land of Desire: Merchants, Power, and the Rise of a New American Culture. New York: Pantheon, 1993.
  11. Leuchtenburg, William. Franklin D. Roosevelt and the New Deal, 1932-1940. New York: Harper & Row, 1963.
  12. Pells, Richard. Radical Visions and American Dreams: Culture and Social Thought in the Depression Years. New York: Harper & Row, 1973.
  13. Phillips-Fein, Kim. Invisible Hands: The Businessmen’s Crusade Against the New Deal. New York: Norton, 2010.
  14. Sitkoff, Harvard. A New Deal for Blacks: The Emergence of Civil Rights as a National Issue. New York: Oxford University Press, 1978.
  15. Wright, Gavin. Old South, New South: Revolutions in the Southern Economy since the Civil War. Baton Rouge: Louisiana State University Press, 1986.


21. World War I & Its Aftermath

Striking steel mill workers holding bulletins, Chicago, Illinois, September 22, 1919. ExplorePAhistory.com



I. Introduction

World War I (“The Great War”) toppled empires, created new nations, and sparked tensions that would explode across future years. On the battlefield, its gruesome modern weaponry wrecked an entire generation of young men. The United States entered the conflict in 1917 and was never the same. The war heralded to the world the United States’ potential as a global military power, and, domestically, it advanced but then beat back American progressivism by unleashing vicious waves of repression. The war simultaneously stoked national pride and fueled disenchantments that burst Progressive Era hopes for the modern world. And it laid the groundwork for a global depression, a second world war, and an entire history of national, religious, and cultural conflict around the globe.

 

II. Prelude to War

As the German empire rose in power and influence at the end of the nineteenth century, skilled diplomats maneuvered this disruption of traditional powers and influences into several decades of European peace. In Germany, however, an ambitious new monarch would overshadow years of tactful diplomacy. Wilhelm II rose to the German throne in 1888. He admired the British Empire of his grandmother, Queen Victoria, and envied Great Britain’s Royal Navy so much that he attempted to build a rival German navy and plant colonies around the globe. The British viewed the prospect of a German navy as a strategic threat, but, jealous of what he perceived as Germany’s lack of prestige in the world, Wilhelm II pressed the case for colonies and symbols of status suitable for a world power. Wilhelm’s maneuvers and Germany’s rise spawned a new system of alliances as rival nations warily watched Germany’s expansion. ((David Stevenson, The First World War and International Politics (Oxford: Oxford University Press, 1988); David Stevenson, Cataclysm: The First World War as Political Tragedy (New York: Basic Books, 2004).))

In 1892, German posturing worried the leaders of Russia and France and prompted a defensive alliance to counter the existing Triple Alliance of Germany, Austria-Hungary, and Italy. Great Britain remained unattached to either bloc until a series of diplomatic crises and an emerging German naval threat led to British agreements with Czar Nicholas II and French President Émile Loubet in the early twentieth century. (The resulting alliance among Great Britain, France, and Russia became known as the Triple Entente.) ((Ibid.))

The other great source of instability was the Ottoman Empire, centered in Turkey. While the leaders of the Austro-Hungarian Empire showed little interest in overseas colonies, Ottoman lands on the empire’s southern border appealed to its strategic goals. Austro-Hungarian expansion in Europe, however, worried Czar Nicholas II, who saw Russia as both the historic guarantor of the Slavic nations in the Balkans and a competitor for territories governed by the Ottoman Empire. ((Ibid.))

By 1914, the Austro-Hungarian Empire controlled Bosnia and Herzegovina and viewed Slavic Serbia, a nation protected by Russia, as its next challenge. On June 28, 1914, after the Serbian nationalist Gavrilo Princip assassinated the heir to the Austro-Hungarian throne, Archduke Franz Ferdinand, and his wife, Duchess Sophie, vengeful nationalist leaders believed the time had arrived to eliminate the rebellious ethnic Serbian threat. ((Ibid.))

On the other side of the Atlantic, the United States played an insignificant role in global diplomacy—it rarely forayed into internal European politics. The federal government did not participate in international diplomatic alliances but nevertheless championed and assisted with the expansion of the transatlantic economy. American businesses and consumers benefited from the trade generated as the result of the extended period of European peace.

Stated American attitudes toward international affairs followed the advice given by President George Washington in his 1796 Farewell Address, more than a century before America’s entry into World War I. He had recommended that his fellow countrymen avoid “foreign alliances, attachments, and intrigues” and “those overgrown military establishments which, under any form of government, are inauspicious to liberty, and which are to be regarded as particularly hostile to republican liberty.”

A national foreign policy of neutrality reflected America’s inward-looking focus on the construction and management of its powerful new industrial economy (built in large part with foreign capital). The federal government possessed limited diplomatic tools with which to engage in international struggles for world power. America’s small and increasingly antiquated military precluded forceful coercion and left American diplomats to persuade by reason, appeals to justice, or economic pressure. But in the 1880s, as Americans embarked upon empire, Congress authorized the construction of a modern Navy. The Army nevertheless remained small and underfunded compared to the armies of many industrializing nations.

After the turn of the century, the Army and Navy faced a great deal of organizational uncertainty. New technologies—airplanes, motor vehicles, submarines, modern artillery—strained the ability of Army and Navy personnel to procure and use them effectively. The nation’s Army could police Native Americans in the West and garrison recent overseas acquisitions, but it could not sustain a full-blown conflict of any size. The Davis Act of 1908 and the National Defense Act of 1916 created the modern versions of the National Guard and military reserves: a system of state-administered units that received conditional federal funding for training, remained available for local emergencies, and could be activated for use in international wars. The National Guard program encompassed individual units separated by state borders and supplied summer training for college students as a reserve officer corps. This arrangement largely resolved the competing pressures of short-term state problems such as natural disasters, federal fears of too few or substandard soldiers, and state leaders who expected their men to fill gaps in the national armed forces during international wars. Military leaders resisted efforts by allied nations to use American forces as fillers for depleted armies; the federal and state governments needed a long-term strategic reserve of trained soldiers and sailors. Meanwhile, safe and reliable prototypes of new weapons and logistical technologies capable of rapid deployment often ran into developmental and production delays. ((Paul Koistinen, Mobilizing for Modern War: The Political Economy of American Warfare, 1865–1919 (Lawrence: University Press of Kansas, 1997).))

Border troubles in Mexico served as an important field test for modern American military forces. Revolution and chaos threatened American business interests in Mexico. Mexican reformer Francisco Madero challenged Porfirio Diaz’s corrupt and unpopular conservative regime, was jailed, and fled to San Antonio, where he penned the Plan of San Luis Potosí, paving the way for the Mexican Revolution and the rise of armed revolutionaries across the country.

In April 1914, President Woodrow Wilson ordered Marines to accompany a naval escort to Veracruz on Mexico’s eastern coast. After a brief battle, the Marines supervised the city government and prevented shipments of German arms to Mexican leader Victoriano Huerta until they departed in November 1914. The raid emphasized the continued reliance on naval forces and the difficulty of modernizing the military during a period of European imperial influence in the Caribbean and elsewhere. The threat of war in Europe enabled passage of the Naval Act of 1916, and President Wilson declared that the national goal was to build the Navy as “incomparably, the greatest…in the world.” And yet Mexico still beckoned. The Wilson administration had withdrawn its support of Huerta, but watched warily as the Revolution devolved into assassinations and deceit. In 1916, Pancho Villa, a popular revolutionary in northern Mexico, spurned by American support for rival contenders, raided Columbus, New Mexico. His raiders killed seventeen Americans and burned down the town center before they sustained severe casualties from American soldiers and retreated. In response, President Wilson commissioned Army General John “Black Jack” Pershing to capture Villa and disperse his rebels. Motorized vehicles, reconnaissance aircraft, and the wireless telegraph aided in the pursuit. Motorized vehicles in particular allowed General Pershing to move supplies without relying on railroads controlled by the Mexican government. The aircraft assigned to the campaign crashed or were grounded by mechanical malfunctions, but they provided invaluable lessons about aviation’s worth and use in war. Wilson used the powers of the new National Defense Act to mobilize over 100,000 National Guard troops across the country as a show of force in northern Mexico. ((John S. D. Eisenhower, Intervention!: The United States and the Mexican Revolution, 1913–1917 (New York: W.W. Norton, 1995); Friedrich Katz, The Secret War in Mexico: Europe, the United States, and the Mexican Revolution (Chicago: University of Chicago Press, 1981).))

The conflict between the United States and Mexico might have escalated into full-scale war if the international crisis in Europe had not overwhelmed the public’s attention. After the outbreak of war in Europe in 1914, President Wilson declared American neutrality. He insisted from the start that the United States be neutral “in fact as well as in name,” a policy the majority of the American people enthusiastically endorsed. What exactly “neutrality” meant in a world of close economic connections, however, prompted immediate questions the United States was not yet prepared to answer. Ties to the British and French proved strong, and those nations obtained far more loans and supplies than the Germans. In October 1914, President Wilson approved commercial credit loans to the combatants, which made it increasingly difficult for the nation to claim impartiality as war spread through Europe. Trade and financial ties with the Allies ultimately drew the nation further into the conflict. In spite of mutually declared blockades between Germany, Great Britain, and France, munitions makers and other war suppliers in the United States enjoyed a brisk and booming business. The British naval blockades that often stopped or seized ships proved annoying and costly, but the unrestricted and surprise torpedo attacks of German submarines were far more deadly. In May 1915, the sinking of the RMS Lusitania at the cost of over a hundred American lives, along with other German attacks on American and British shipping, raised the ire of the public and stoked the desire for war. ((Arthur S. Link, Wilson: The Struggle for Neutrality, 1914–1915 (Princeton: Princeton University Press, 1960).))

If American diplomatic tradition avoided alliances and the Army seemed inadequate for sustained overseas fighting, the United States outdistanced the nations of Europe in one important measure of world power: by 1914, the nation held the top position in the global industrial economy. The United States produced slightly more than one-third of the world’s manufactured goods, roughly equal to the combined output of France, Great Britain, and Germany.

 

III. War Spreads through Europe

After the assassination of Archduke Ferdinand and Duchess Sophie, Austria secured the promise of aid from its German ally and issued an ultimatum of ten demands to Serbia. On July 28, 1914, Austria declared war on Serbia for its failure to meet all of the demands. Russia, determined to protect Serbia, began to mobilize its armed forces. On August 1, 1914, after warnings directed at Czar Nicholas II failed to stop Russian preparations for war, Germany declared war on Russia to protect Austria.

In spite of the central European origins of the initial crisis, the first blow was struck against neutral Belgium in northwestern Europe. Germany planned to deal with the French and Russian threats by taking advantage of Russia’s sluggish mobilization and focusing its army first on France. Echoing the military operations of 1870–1871, German military leaders activated the Schlieffen Plan, which shifted German armies into position by rapid rail transport so they could enter France quickly. The strategy deterred and confused Russian forces and ultimately carried the German armies over Belgium and into France.

Belgium fell victim early to the invading German forces. Germany wanted to avoid the obvious avenue of advance across the French-German border, where French army units stood ready. German commanders therefore ordered a wide sweep around the French border forces that led straight through Belgium. This violation of Belgian neutrality, however, ensured that Great Britain entered the war against Germany: on August 4, 1914, Great Britain declared war on Germany for its failure to respect Belgium as a neutral nation.


By 1915, the European war had developed into a series of bloody trench stalemates that continued through the following year. Offensives, largely carried out by British and French armies, achieved nothing but huge numbers of casualties. Peripheral campaigns against the Ottoman Empire at Gallipoli, throughout the Middle East, and in various parts of Africa either failed outright or had no real bearing on the European contest. The third year of the war brought promises of great German success in eastern Europe after the regime of Czar Nicholas II collapsed in Russia in March 1917. At about the same time, the German general staff demanded the reimposition of unrestricted submarine warfare to deprive the Allies of replenishment supplies from the United States. ((Michael S. Neiberg, Fighting the Great War: A Global History (Cambridge: Harvard University Press, 2005).))

The Germans realized that submarine warfare would likely bring intervention by the United States. They also believed, however, that the European war would be over before American soldiers could arrive in sufficient numbers to alter the balance of power. German foreign secretary Arthur Zimmermann sought to complicate any American intervention: he offered German support for a desperate Mexican bid to regain Texas, New Mexico, and Arizona. Mexican leaders declined the offer, but the revelation of the Zimmermann Telegram helped usher the United States into the war.

 

IV. America Enters the War

By the fall of 1916 and spring of 1917, President Wilson believed an imminent German victory would drastically and dangerously alter the balance of power in Europe. With a good deal of public support inflamed by submarine warfare and revelations like the Zimmermann Telegram (which exposed a German menace in Mexico), Congress declared war on Germany on April 6, 1917. Despite the National Defense Act of 1916 and the Naval Act of 1916, America faced a war three thousand miles away with a small and unprepared military. The United States was unready in nearly every respect for modern war. Considerable time elapsed before an effective Army and Navy could be assembled, trained, equipped, and deployed to the Western Front in Europe. The process of building the Army and Navy for the war proved different from previous American conflicts and ran counter to the European military experience. Unlike the great European military powers of Germany, France, and Austria-Hungary, the United States had no tradition of maintaining large standing armed forces or trained military reserves during peacetime. Nor did it have any counterpart to the European practice of rapidly equipping, training, and mobilizing reservists and conscripts.

America relied solely on traditional volunteerism to fill the ranks of the armed forces. Notions of patriotic duty and adventure appealed to many young men who not only volunteered for wartime service, but sought and paid for their own training at Army camps before the war. American labor organizations favored voluntary service over conscription. Labor leader Samuel Gompers argued for volunteerism in letters to the Congressional committees considering the question. “The organized labor movement,” he wrote, “has always been fundamentally opposed to compulsion.” Referring to American values as a role model for others, he continued, “It is the hope of organized labor to demonstrate that under voluntary conditions and institutions the Republic of the United States can mobilize its greatest strength, resources and efficiency.” ((American Federation of Labor, Report of the Proceedings of the Annual Convention (1917), 112.))

The Boy Scouts of America charge up Fifth Avenue in New York City in a “Wake Up, America” parade to support recruitment efforts. Nearly 60,000 people attended this single parade. Photograph from National Geographic Magazine, 1917. Wikimedia.

Though some observers believed that opposition to conscription might lead to civil disturbances, Congress quickly instituted a reasonably equitable and locally administered system to draft men for the military. On May 18, 1917, Congress approved the Selective Service Act, and President Wilson signed it into law a week later. The new legislation avoided the unpopular system of bonuses and substitutes used during the Civil War and was generally received without serious objection by the American people. ((Christopher Capozzola, Uncle Sam Wants You: World War I and the Making of the Modern American Citizen (New York: Oxford University Press, 2010).))

The conscription act initially required men aged twenty-one to thirty to register for compulsory military service. The basic requirement was a competitive level of physical fitness, and the examinations offered the emerging social sciences a range of data-collection tools and new screening methods. The Army Medical Department examined the general condition of the young American men selected for service. The Surgeon General compiled the findings from draft records in the 1919 report “Defects Found in Drafted Men,” a snapshot of the 2.5 million men examined for military service. Among that group, examiners recorded 1,533,937 physical defects (often more than one per individual). More than thirty-four percent of those examined were rejected for service or later discharged for neurological, psychiatric, or mental deficiencies. ((Albert Gallatin Love, Defects Found in Drafted Men (Washington, D.C.: U.S. Government Printing Office, 1920), 73.))

To provide a basis for these neurological, psychiatric, and mental evaluations, the Army assessed eligibility for service and aptitude for advanced training through cognitive skills tests designed to measure intelligence. About 1.9 million men were tested. Literate soldiers took the Army Alpha test; illiterate men and non-English-speaking immigrants took its nonverbal equivalent, the Army Beta test, which relied on visual testing procedures. Robert M. Yerkes, president of the American Psychological Association and chairman of the Committee on the Psychological Examination of Recruits, developed and analyzed the tests. His data suggested that the mental age of recruits, particularly immigrant recruits from southern and eastern Europe, averaged about thirteen years. A eugenicist, Yerkes interpreted the results as roughly equivalent to a mild level of retardation and as an indication of racial deterioration. Many years later, experts agreed that the results misrepresented the recruits’ levels of education and revealed defects in the design of the tests themselves.

The experience of service in the Army expanded many individual social horizons as natives and immigrants joined the ranks together. Immigrants had been welcomed into Union ranks during the Civil War, when large numbers of Irish and German men joined and fought alongside the native born; some German units even fought with German as their main language. Between 1917 and 1918, the Army accepted immigrants with some hesitancy because of the widespread public agitation against “hyphenated Americans,” which demanded that they conform without delay or reservation. However, if the Army appeared merely concerned about the assimilation and loyalty of recent immigrants, some social mixtures simply could not be tolerated within the ranks at all.

Propagandistic images increased patriotism in a public relatively detached from events taking place overseas. This photograph, showing two United States soldiers sprinting past the bodies of two German soldiers toward a bunker, showed Americans the heroism evinced by their men in uniform. Likely a staged image taken after fighting ended, it nonetheless played on the public’s patriotism, telling them to step up and support the troops. “At close grips with the Hun, we bomb the corkshaffer’s, etc.,” c. 1922?. Library of Congress.

Prevailing racial attitudes mandated the assignment of white and black soldiers to different units. Despite racial discrimination and Jim Crow, many black American leaders, such as W. E. B. Du Bois, supported the war effort and sought a place at the front for black soldiers. Black leaders viewed military service as an opportunity to demonstrate to white society the willingness and ability of black men to assume all the duties and responsibilities of citizens, including wartime sacrifice. If black soldiers were drafted and fought and died on equal footing with white soldiers, then white Americans would see that they deserved full citizenship. The War Department, however, barred black troops from combat specifically to avoid racial tensions. The military relegated black soldiers to segregated service units, where they worked in logistics and supply and as general laborers.

In France, the experiences of black soldiers during training and periods of leave broadened their understanding of the Allies and life in Europe. The Army often restricted the privileges of black soldiers to ensure the conditions they encountered in Europe did not lead them to question their place in American society. However, black soldiers were not the only ones feared to be at risk by the temptations of European vice. To ensure that American “doughboys” did not compromise their special identity as men of the new world who arrived to save the old, several religious and progressive organizations created an extensive program designed to keep the men pure of heart, mind, and body. With assistance from the Young Men’s Christian Association (YMCA) and other temperance organizations, the War Department put together a program of schools, sightseeing tours, and recreational facilities to provide wholesome and educational outlets. The soldiers welcomed most of the activities from these groups, but many still managed to find and enjoy the traditional recreational venues of soldiers at war. ((Dawley, Changing the World.))

While the War and Navy Departments initiated recruitment and mobilization plans for millions of men, women responded to the war preparations by joining several military and civilian organizations. Their enrollment and actions in these organizations proved a pioneering effort for American women in war. Military leaders authorized the permanent gendered transition of several occupations, giving women opportunities to don uniforms where none had existed before. Civilian wartime organizations, although chaired by male members of the business elite, boasted all-female volunteer workforces. Women performed the bulk of volunteer charitable work during the war. ((Susan Zeiger, In Uncle Sam’s Service: Women Workers with the American Expeditionary Force, 1917–1919 (Philadelphia: University of Pennsylvania Press, 2004), 2–4.))

The military faced great upheaval with the admission of women during the war. The War and Navy Departments authorized the enlistment of women to fill several established administrative occupations, and the gendered transition of these jobs freed more men to join combat units. Army women served as telephone operators (the “Hello Girls”) for the Signal Corps, Navy women enlisted as Yeomen (clerical workers), and the first groups of women joined the Marine Corps in July 1918. In the military medical professions, approximately 25,000 nurses served in the Army and Navy Nurse Corps stateside and overseas, and about a hundred female physicians were contracted by the Army. Neither the nurses nor the doctors served as commissioned officers; the Army and Navy chose to appoint them instead, which left the status of professional medical women hovering somewhere between the enlisted and officer ranks. As a result, many female nurses and doctors suffered physical and mental abuses at the hands of their male coworkers with no system of redress in place. ((Lettie Gavin, American Women in World War I: They Also Served (Boulder: University Press of Colorado, 1997); Kimberly Jensen, Mobilizing Minerva: American Women in the First World War (Chicago: University of Illinois Press, 2008), 170–172.))

The experiences of women in civilian organizations proved to be less stressful than in the military. Millions of women volunteered with the American Red Cross, the Young Men’s and Women’s Christian Associations (YMCA/YWCA), and the Salvation Army. Most women performed their volunteer duties in communal spaces owned by the leaders of the municipal chapters of these organizations. Women met at designated times to roll bandages, prepare and serve meals and snacks, package and ship supplies, and organize community fundraisers. The variety of volunteer opportunities that existed gave women the ability to appear in public spaces and promote charitable activities for the war effort. Women volunteers encouraged entire communities, including children, to get involved in war work. While most of these efforts focused on support for the home front, a small percentage of women volunteers served with the American Expeditionary Force in France. ((Gavin, American Women, 129-240.))

Jim Crow segregation in both the military and the civilian sector stood as a barrier for black women who wanted to give their time to the war effort. The military prohibited black women from serving as enlisted or appointed medical personnel. The only avenue for black women to wear a military uniform lay with the armies of the allied nations, and a few black female doctors and nurses joined the French Foreign Legion to escape the racism of the American Army. Black women volunteers faced the same discrimination in civilian wartime organizations: white leaders of American Red Cross, YMCA/YWCA, and Salvation Army municipal chapters refused to admit them as equal participants. Black women were forced to charter auxiliary units as subsidiary divisions of the chapters and were given little guidance on how to organize fellow volunteers. They turned instead to their communities for support and recruited millions of women for auxiliaries that supported the nearly 200,000 black soldiers and sailors serving in the military. While the majority of these volunteers labored to care for black families on the homefront, three YMCA secretaries received the opportunity of a lifetime to work with black troops in France. ((Nikki Brown, Private Politics and Public Voices: Black Women’s Activism from World War I to the New Deal (Bloomington: Indiana University Press, 2006), 66–107.))

 

V. On the Homefront

In the early years of the war, Americans were generally detached from the events in Europe. The public paired its horror at accounts of the fighting with gratitude for the economic opportunities the war provided and pride in a national tradition of non-involvement in the kind of entangling alliances that had caused it. Progressive Era reform politics dominated the political landscape, and Americans remained most concerned with domestic issues and the shifting role of government at home. However, the facts of the war could not be ignored. The destruction taking place on European battlefields and the ensuing casualty rates exposed the unprecedented brutality of modern warfare. Increasingly, Americans sensed that the fate of the Western world lay in the victory or defeat of the Allies.

President Wilson, a committed progressive, had articulated a global vision of democracy even as he embraced neutrality. And as war continued to engulf Europe, it seemed apparent that the United States’ economic power would shape the outcome of the conflict regardless of any American military intervention. By 1916, American trade with the Allies had tripled, while trade with the Central Powers shrank to less than one percent of previous levels.

German immigrants in the United States aroused popular suspicions. The American Protective League, a group of private citizens, worked directly with the U.S. government during the war to identify suspected German sympathizers and sought to eradicate all radical, anarchist, left-wing, and antiwar activities through surveillance and raids. Even J. Edgar Hoover, later the infamous head of the FBI, used the APL to gather intelligence. A membership card in the American Protective League, issued May 28, 1918. Wikimedia.

The progression of the war in Europe generated fierce national debates about military preparedness. The Allies and the Central Powers had taken little time to raise and mobilize vast armies and navies. By comparison, the United States still fielded a minuscule army and had limited federal power to summon an adequate defense force before the enactment of conscription. When America entered the war, mobilization of military resources and the cultivation of popular support for the war consumed the country. Because the federal government had lacked the coercive force to mobilize before the war, the American war effort was marked by enormous publicity and propaganda campaigns. President Wilson went to extreme measures to push public opinion toward the war. Most notably, he created the Committee on Public Information, known as the “Creel Committee,” headed by Progressive George Creel, to enflame the patriotic mood of the country and generate support for military adventures abroad. Creel enlisted the help of Hollywood studios and other budding media outlets to cultivate a view of the war that pitted democracy against imperialism and framed America as a crusading nation endeavoring to rescue Western civilization from medievalism and militarism.

As war passions flared, challenges to the onrushing patriotic sentiment that America was making the world “safe for democracy” were labeled disloyal. Wilson signed the Espionage Act in 1917 and the Sedition Act in 1918, stripping dissenters and protesters of their rights to publicly resist the war. Critics and protesters were imprisoned. Immigrants, labor unions, and political radicals became targets of government investigations and an ever more hostile public culture. Meanwhile, the government insisted that individual financial contributions made a discernible difference for the men on the Western Front. Americans lent their financial support to the war effort by purchasing war bonds or supporting Liberty Loan Drives.
Many Americans, however, sacrificed much more than money. ((David Kennedy, Over Here: The First World War and American Society (New York: Oxford University Press, 1980).))

 

VI. Before the Armistice

The brutality of war persevered as European powers struggled to adapt to modern warfare. Until the spring of 1917, the Allies possessed few effective defensive measures against submarine attacks. German submarines had sunk more than a thousand ships by the time America entered the war. The rapid addition of American naval escorts to the British surface fleet and the establishment of a convoy system countered much of the effect of German submarines. Shipping and military losses declined rapidly, just as the American Army arrived in Europe in large numbers. Although much of its equipment still needed to make the transatlantic passage, the physical presence of the Army proved a fatal blow to German war plans. ((Neiberg, Fighting the Great War.))

In July 1917, after one last disastrous offensive against the Germans, the Russian army disintegrated. The tsarist regime collapsed and in November 1917 Vladimir Lenin’s Bolshevik party came to power. Russia soon surrendered to German demands and exited the war, freeing Germany to finally fight the one-front war it had desired since 1914. The German general staff quickly shifted hundreds of thousands of soldiers from the eastern theater in preparation for a new series of offensives planned for the following year in France. ((Ibid.))

In March 1918, Germany launched the Kaiserschlacht (Spring Offensive), a series of five major attacks. By the middle of July 1918, each and every one had failed to break through on the Western Front. A string of Allied offensives commenced on the Western Front on August 8, 1918. The two million men of the American Expeditionary Force joined British and French armies in a series of successful counteroffensives that pushed the disintegrating German front lines back across France. German General Erich Ludendorff referred to the launch of the counteroffensive as the “black day of the German army.” The German offensive gamble exhausted Germany’s faltering military effort. Defeat was inevitable. Kaiser Wilhelm II abdicated at the request of the German general staff, and the new German democratic government agreed to an armistice (cease-fire) on November 11, 1918. German military forces withdrew from France and Belgium and returned to a Germany teetering on the brink of chaos. ((Ibid.))

By the end of the war, more than 4.7 million American men had served in all branches of the military: four million in the Army, six hundred thousand in the Navy, and about eighty thousand in the Marine Corps. The United States lost over 100,000 men (fifty-three thousand died in battle, and even more from disease). Their terrible sacrifice, however, paled before the Europeans’. After four years of brutal stalemate, France had suffered almost a million and a half military dead and Germany even more. Both nations lost about four percent of their populations to the war. And death was not done. ((Ibid.))

 

VII. The War and the Influenza Pandemic

Even as war raged on the Western Front, a new deadly threat loomed: influenza. In the spring of 1918, a strain of the “flu” virus appeared in the farm country of Haskell County, Kansas, and hit nearby Camp Funston, one of the largest Army training camps in the nation. The virus spread like wildfire. The camp had brought disparate populations together, shuffled them between bases, sent them back to their homes across the nation, and, in consecutive waves, deployed them around the world. Between March and May 1918, fourteen of the largest American military training camps reported outbreaks of influenza. Some of the infected soldiers carried the virus on troop transports to France. By September 1918, influenza spread to all training camps in the United States. And then it mutated into a deadlier version. ((Nancy K. Bristow, American Pandemic: The Lost Worlds of the 1918 Influenza Epidemic (New York: Oxford University Press, 2012); Alfred W. Crosby, America’s Forgotten Pandemic: The Influenza of 1918 (Cambridge: Cambridge University Press, 2003).))

The second wave of the virus, a mutated strain, was even deadlier than the first. It struck down those in the prime of their lives: a disproportionate number of influenza victims were between the ages of 18 and 35. In Europe, influenza hit both sides of the Western Front. The “Spanish Influenza,” or the “Spanish Lady,” misnamed because accounts of the disease first appeared in the uncensored newspapers of neutral Spain, resulted in the deaths of an estimated fifty million people worldwide. Reports from the Surgeon General of the Army revealed that while 227,000 soldiers were hospitalized from wounds received in battle, almost half a million suffered from influenza. The worst part of the epidemic struck during the height of the Meuse-Argonne Offensive in the fall of 1918 and compromised the combat capabilities of the American and German armies. During the war, more soldiers died from influenza than from combat. The pandemic continued to spread after the Armistice before finally fading in the early 1920s. No cure was ever found. ((Ibid.))

 

VIII. The Fourteen Points and the League of Nations

As the flu virus wracked the world, Europe and America rejoiced at the end of hostilities. On December 4, 1918, President Wilson became the first American president to leave the country during his term. He intended to shape the peace. The war brought an abrupt end to four great European imperial powers. The German, Russian, Austro-Hungarian, and Ottoman empires evaporated, and the map of Europe was redrawn to accommodate new independent nations. As part of the terms of the Armistice, Allied forces followed the retreating Germans and occupied territories in the Rhineland to prevent Germany from reigniting war. As Germany disarmed, Wilson and the other Allied leaders gathered in France at Versailles for the Paris Peace Conference to dictate the terms of a settlement to the war.

Earlier that year, on January 8, 1918, before a joint session of Congress, President Wilson had offered an enlightened statement of war aims and peace terms known as the Fourteen Points. The plan not only dealt with territorial issues but offered principles upon which a long-term peace could be built, including the establishment of a League of Nations to guard against future wars. But in January 1918 Germany still anticipated a favorable verdict on the battlefield and did not seriously consider accepting the terms of the Fourteen Points. The Allies were hardly more receptive. French Prime Minister Georges Clemenceau remarked, “The good Lord only had ten (points).” ((Alan Dawley, Changing the World: American Progressives in War and Revolution (Princeton: Princeton University Press, 2005).))

President Wilson toiled for his vision of the post-war world. The United States had entered the fray, Wilson proclaimed, “to make the world safe for democracy.” At the center of the plan was a novel international organization–the League of Nations–charged with keeping a worldwide peace by preventing the kind of destruction that tore across Europe and “affording mutual guarantees of political independence and territorial integrity to great and small states alike.” This promise of collective security, that an attack on one sovereign member would be viewed as an attack on all, was a key component of the Fourteen Points. ((Thomas J. Knock, To End All Wars: Woodrow Wilson and the Quest for a New World Order (New York: Oxford University Press, 1992).))

But the fight for peace was daunting. While President Wilson was celebrated in Europe and welcomed as the “God of Peace,” his fellow statesmen were less enthusiastic about his plans for post-war Europe. America’s closest allies had little interest in the League of Nations. Allied leaders sought to guarantee the future safety of their own nations. Unlike the United States, the Allies had endured the horrors of the war firsthand. They refused to sacrifice further. The negotiations made clear that British Prime Minister David Lloyd George was more interested in preserving Britain’s imperial domain, while French Prime Minister Clemenceau sought a peace that recognized the Allies’ victory and the Central Powers’ culpability: he wanted reparations—severe financial penalties—and limits on Germany’s future ability to wage war. The fight for the League of Nations therefore fell largely on the shoulders of President Wilson. By June 1919, the final version of the treaty had been signed and President Wilson was able to return home. The treaty was a compromise that included demands for German reparations, provisions for the League of Nations, and the promise of collective security. For President Wilson, it was an imperfect peace, but better than no peace at all.

The real fight for the League of Nations was on the American homefront. Republican Senator Henry Cabot Lodge of Massachusetts stood as the most prominent opponent of the League of Nations. As chair of the Senate Foreign Relations Committee and an influential Republican Party leader, he could block ratification of the treaty. Lodge attacked the treaty for potentially robbing the United States of its sovereignty. Never an isolationist, Lodge demanded instead that the country deal with its own problems in its own way, free from the collective security—and oversight—offered by the League of Nations. Unable to match Lodge’s influence in the Senate, President Wilson took his case to the American people in the hopes that ordinary voters might be convinced that the only guarantee of future world peace was the League of Nations. During his grueling cross-country trip, however, President Wilson suffered an incapacitating stroke. His opponents had the upper hand. ((John Milton Cooper, Breaking the Heart of the World: Woodrow Wilson and the Fight for the League of Nations (Cambridge: Cambridge University Press, 2001).))

President Wilson’s dream for the League of Nations died on the floor of the Senate. Lodge and his allies successfully blocked America’s entry into the League of Nations, an organization conceived and championed by the American president. The League of Nations operated with fifty-eight sovereign members, but the United States refused to join, refused to lend it American power, and refused to provide it with the power needed to fulfill its purpose. ((Ibid.))

 

IX. Aftermath of World War I

The war transformed the world. It drastically changed the face of the Middle East, for instance. For centuries the Ottoman Empire had shaped life in the region. Before the war, the Middle East had three main centers of power: Egypt, the Ottoman Empire, and Iran. President Wilson’s call for self-determination appealed to many under the Ottoman Empire’s rule. In the aftermath of the war, Wilson sent a commission to investigate the region to determine the conditions and aspirations of the populace. The King-Crane Commission found that most of the inhabitants favored an independent state free of European control. However, these wishes were largely ignored, and the lands of the former Ottoman Empire were divided into mandates through the Treaty of Sèvres at the San Remo Conference in 1920. The Ottoman Empire disintegrated into several nations, many created by European powers with little regard to ethnic realities. These Arab provinces were ruled by Britain and France, and the new nation of Turkey emerged from the former heartland of Anatolia. According to the League of Nations, mandates “were inhabited by peoples not yet able to stand by themselves under the strenuous conditions of the modern world.” Though allegedly for the benefit of the people of the Middle East, the mandate system was essentially a reimagined form of nineteenth-century imperialism. France received Syria; Britain took control of Iraq, Palestine, and Transjordan (Jordan). The United States was asked to become a mandate power but declined. The geographical realignment of the Middle East also included the formation of two new nations: the Kingdom of Hejaz and Yemen. (The Kingdom of Hejaz was ruled by Sharif Hussein and lasted only until the 1920s, when it became part of Saudi Arabia.) ((David Fromkin, A Peace to End All Peace: The Fall of the Ottoman Empire and the Creation of the Modern Middle East (New York: Henry Holt, 1989).))

The fates of Nicola Sacco and Bartolomeo Vanzetti, two Italian-born anarchists who were convicted of robbery and murder in 1920, reflected the Red Scare that gripped American society after the Russian Revolution of 1917. Their arrest, trial, and execution inspired many leftists and dissenting artists to express their sympathy with the accused, such as in Maxwell Anderson’s Gods of the Lightning or Upton Sinclair’s Boston. The Sacco-Vanzetti affair demonstrated a newly exacerbated American nervousness about immigrants and the potential spread of radical ideas, especially those related to international communism after the Russian Revolution. ((Moshik Temkin, The Sacco-Vanzetti Affair: America on Trial (New Haven: Yale University Press, 2009).))

When in March 1918 the Bolsheviks signed a separate peace treaty with Germany, the Allies planned to send troops to northern Russia and Siberia to prevent German influence and fight the Bolshevik revolution. Wilson agreed, and, in a little-known foreign intervention, American troops remained in Russia as late as 1920. Although the Bolshevik rhetoric of self-determination followed many of the ideals of Wilson’s Fourteen Points—Vladimir Lenin supported revolutions against imperial rule across the world—imperialism and anti-communism could not be so easily undone by vague ideas of self-rule.

While still fighting in WWI, President Wilson sent American troops to Siberia during the Russian Civil War for reasons both diplomatic and military. This photograph shows American soldiers in Vladivostok parading before the building occupied by the staff of the Czecho-Slovaks (those opposing the Bolsheviks). To the left, Japanese marines stand to attention as the American troops march. Photograph, August 1, 1918. Wikimedia, http://commons.wikimedia.org/wiki/File:American_troops_in_Vladivostok_1918_HD-SN-99-02013.JPEG.


At home, the United States grappled with harsh postwar realities. Racial tensions culminated in the Red Summer of 1919 when violence broke out in at least twenty-five cities, including Chicago and Washington, D.C. The riots originated from wartime racial tensions. Industrial war production and massive wartime service created vast labor shortages and thousands of southern blacks traveled to the North and Midwest to escape the traps of southern poverty. But the so-called Great Migration sparked significant racial conflict as local whites and returning veterans fought to reclaim their jobs and their neighborhoods from new black migrants. ((Isabel Wilkerson, The Warmth of Other Suns: The Epic Story of America’s Great Migration (New York: Vintage, 2011).))

But many black Americans, who had fled the Jim Crow South and traveled halfway around the world to fight for the United States, would not so easily accede to postwar racism. The overseas experience of black Americans and their return triggered a dramatic change in black communities. W.E.B. DuBois wrote boldly of returning soldiers: “We return. We return from fighting. We return fighting. Make way for Democracy!” ((The Crisis, (May, 1919), 14.)) But white Americans desired a return to the status quo, a world that did not include social, political, or economic equality for black people.

In 1919, America suffered through the “Red Summer.” Riots erupted across the country from April until October. The massive bloodshed included thousands of injuries, hundreds of deaths, and vast destruction of private and public property across the nation. The Chicago Riot, which lasted from July 27 to August 3, 1919, and was considered the summer’s worst, saw a week of mob violence, murder, and arson. Race riots had rocked the nation before, but the Red Summer was something new. Recently empowered black Americans actively defended their families and homes, often with militant force. This behavior galvanized many in black communities, but it also shocked white Americans, who alternately interpreted black resistance as a desire for total revolution or as a new positive step in the path toward black civil rights. In the riots’ aftermath, James Weldon Johnson wrote, “Can’t they understand that the more Negroes they outrage, the more determined the whole race becomes to secure the full rights and privileges of freemen?” Those six hot months in 1919 forever altered American society and roused and terrified those who experienced the sudden and devastating outbreaks of violence. ((William Tuttle, Race Riot: Chicago in the Red Summer of 1919 (Champaign: University of Illinois Press, 1970); Cameron McWhirter, Red Summer: The Summer of 1919 and the Awakening of Black America (New York: Henry Holt, 2011).))

 

X. Conclusion

World War I decimated millions and profoundly altered the course of world history. Postwar instabilities led directly toward a global depression and a second world war. The war sparked the Bolshevik revolution, whose heirs the United States would later confront in the Cold War. It created Middle Eastern nations and aggravated ethnic tensions that the United States could never tackle. By fighting with and against European powers on the Western Front, the United States ensured that its place in the world would never be the same. The whipping up of nationalist passions poisoned American attitudes toward radicalism, dissent, and immigration. Postwar disillusionment shattered Americans’ hopes for the progress of the modern world. The war came and went, leaving in its place the bloody wreckage of an old world through which the world traveled to a new and uncertain future.

 

Contributors

This chapter was edited by Paula Fortier, with content contributions by Tizoc Chavez, Zachary W. Dresser, Blake Earle, Morgan Deane, Paula Fortier, Larry A. Grant, Mariah Hepworth, Jun Suk Hyun, and Leah Richier.

 

Recommended Reading

  1. Capozzola, Christopher. Uncle Sam Wants You: World War I and the Making of the Modern American Citizen. New York: Oxford University Press, 2010.
  2. Dawley, Alan. Changing the World: American Progressives in War and Revolution. Princeton: Princeton University Press, 2003.
  3. Freeberg, Ernest. Democracy’s Prisoners: Eugene V. Debs, the Great War, and the Right to Dissent. Cambridge: Harvard University Press. 2008.
  4. Fussell, Paul. The Great War and Modern Memory. New York: Oxford University Press, 1975.
  5. Greenwald, Maurine W. Women, War, and Work: The Impact of World War I on Women Workers in the United States. Westport: Greenwood Press. 1980.
  6. Hahn, Steven. A Nation under Our Feet: Black Political Struggles in the Rural South from Slavery to the Great Migration. Cambridge: Harvard University Press, 2003.
  7. Hawley, Ellis. The Great War and the Search for Modern Order. New York: St. Martin’s, 1979.
  8. Keene, Jennifer. Doughboys, The Great War, and the Remaking of America. Baltimore: Johns Hopkins University Press, 2001.
  9. Kennedy, David. Over Here: The First World War and American Society. New York: Oxford University Press, 1980.
  10. Knock, Thomas J. To End All Wars: Woodrow Wilson and the Quest for a New World Order. New York: Oxford University Press. 1992.
  11. Manela, Erez. The Wilsonian Movement: Self-Determination and the International Origins of Anticolonial Nationalism. New York: Oxford University Press, 2007.
  12. Montgomery, David. The Fall of the House of Labor: The Workplace, the State, and American Labor Activism, 1865-1925. Cambridge: Cambridge University Press, 1988.
  13. Murphy, Paul. World War I and the Origins of Civil Liberties in the United States. New York: Norton, 1979.
  14. Tuttle, William. Race Riot: Chicago in the Red Summer of 1919. Champaign: University of Illinois Press, 1970.
  15. Wilkerson, Isabel. The Warmth of Other Suns: The Epic Story of America’s Great Migration. New York: Vintage, 2010.

 

Notes

22. The New Era

Al Jolson in The Jazz Singer, 1927.


*The American Yawp is an evolving, collaborative text. Please click here to improve this chapter.*

I. Introduction

On a sunny day in early March 1921, Warren G. Harding took the oath to become the twenty-ninth President of the United States. He had won a landslide election by promising a “return to normalcy.” “Our supreme task is the resumption of our onward, normal way,” he declared in his inaugural address. On the campaign trail the previous year, he had said, “America’s present need is not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration.” The nation still reeled from the shock of World War I, the explosion of racial violence and political repression in 1919, and, bolstered by the Bolshevik Revolution in Russia, a lingering “Red Scare.”

More than 115,000 American soldiers had lost their lives in barely a year of fighting in Europe. Between 1918 and 1920, nearly 700,000 Americans died in a flu epidemic that hit nearly twenty percent of the American population. Waves of strikes hit soon after the war. Radicals bellowed. Anarchists and others sent more than thirty bombs through the mail on May 1, 1919. After war controls fell, the economy tanked and national unemployment hit twenty percent. Farmers’ bankruptcy rates, already egregious, now skyrocketed. Harding could hardly deliver the peace that he promised, but his message nevertheless resonated among a populace wracked by instability.

The 1920s would be anything but “normal.” The decade so reshaped American life that it came to be called by many names: the New Era, the Jazz Age, the Age of the Flapper, the Prosperity Decade, and, most commonly, the Roaring Twenties. The mass production and consumption of automobiles, household appliances, film, and radio fueled a new economy and new standards of living; new mass entertainment introduced talking films and jazz while sexual and social restraints loosened. But at the same time, many Americans turned their backs on reform, denounced America’s shifting demographics, stifled immigration, retreated toward “old time religion,” and revived, with millions of new members, the Ku Klux Klan. Others, meanwhile, fought harder than ever for equal rights. Americans noted the appearance of “the New Woman” and “the New Negro.” Old immigrant communities that had predated the new immigration quotas clung to their cultures and their native faiths. The 1920s were a decade of conflict and tension. And whatever it was, it was not “normalcy.”

 

II. Republican White House, 1921-1933

To deliver on his promises of stability and prosperity, Harding signed legislation to restore a high protective tariff and dismantled the last wartime controls over industry. Meanwhile, the vestiges of America’s involvement in the First World War, with its propaganda and suspicions of anything less than “100 percent American,” pushed Congress to address fears of immigration and foreign populations. A sour postwar economy led elites to raise the specter of the Russian Revolution and sideline not just the various American socialist and anarchist organizations but nearly all union activism. During the 1920s, the labor movement suffered a sharp decline in membership. Workers not only lost bargaining power, but also the support of courts, politicians, and, in large measure, the American public. ((David Montgomery, The Fall of the House of Labor: The Workplace, the State, and American Labor Activism, 1865-1925 (Cambridge: Cambridge University Press, 1988).))

Harding’s presidency, though, would go down in history as among the most corrupt, even though many of Harding’s cabinet appointees were individuals of true stature who answered to various American constituencies. For instance, Henry C. Wallace, the very vocal editor of Wallace’s Farmer and a well-known proponent of “scientific farming,” was made Secretary of Agriculture. Herbert Hoover, the popular head and administrator of the wartime Food Administration and a self-made millionaire, was made Secretary of Commerce. To satisfy business interests, the conservative businessman Andrew Mellon became Secretary of the Treasury. Mostly, however, it was the appointment of friends and close supporters, dubbed “the Ohio gang,” that led to trouble. ((William E. Leuchtenburg, The Perils of Prosperity, 1914-1932 (Chicago: University of Chicago Press, 1993 [1958]).))

Harding’s administration suffered a tremendous setback when several officials conspired to lease government land in Wyoming to oil companies in exchange for cash. Known as the Teapot Dome scandal (named after the nearby rock formation that resembled a teapot), the affair led to the conviction and imprisonment of Interior Secretary Albert Fall, while Navy Secretary Edwin Denby was forced to resign. Harding took a vacation in the summer of 1923 so that he could think deeply on how to deal “with my God-damned friends”—it was his friends, and not his enemies, that kept him up walking the halls at night. But then, in August 1923, Harding died suddenly of a heart attack and Vice President Calvin Coolidge ascended to the highest office in the land. ((Robert K. Murray, The Harding Era: Warren G. Harding and His Administration (Minneapolis: University of Minnesota Press, 1969).))

The son of a shopkeeper, Coolidge climbed the Republican ranks from city councilman to governor of Massachusetts. As president, Coolidge sought to remove the stain of scandal, but he otherwise continued Harding’s economic approach, refusing to take action in defense of workers or consumers against American business. “The chief business of the American people,” the new President stated, “is business.” One observer called Coolidge’s policy “active inactivity,” but Coolidge was not afraid of supporting business interests and wealthy Americans by lowering taxes or maintaining high tariff rates. Congress, for instance, had already begun to reduce taxes on the wealthy from wartime levels of sixty-six percent to twenty percent, a reduction Coolidge championed. ((Leuchtenburg, Perils.))

While Coolidge supported business, other Americans continued their activism. The 1920s, for instance, represented a time of great activism among American women, who had won the vote with the passage of the 19th Amendment in 1920. Female voters, like their male counterparts, pursued many interests. Concerned about squalor, poverty, and domestic violence, women had already lent their efforts to prohibition, which went into effect under the Eighteenth Amendment in January 1920. After that point, alcohol could no longer be manufactured or sold. Other reformers urged government action to ameliorate high mortality rates among infants and children, to provide federal aid for education, and ensure peace and disarmament. Some activists advocated protective legislation for women and children, while Alice Paul and the National Women’s Party called for the elimination of all legal distinctions “on account of sex” through the proposed Equal Rights Amendment (ERA), which was introduced but defeated in Congress. ((Nancy Cott, The Grounding of Modern Feminism (New Haven: Yale University Press, 1987).))

During the 1920s, the National Women’s Party fought for the rights of women beyond that of suffrage, which they had secured through the 19th Amendment in 1920. They organized private events, like the tea party pictured here, and public campaigning, such as the introduction of the Equal Rights Amendment to Congress, as they continued the struggle for equality. “Reception tea at the National Womens [i.e., Woman's] Party to Alice Brady, famous film star and one of the organizers of the party,” April 5, 1923. Library of Congress, http://www.loc.gov/pictures/item/91705244/.

During the 1920s, the National Women’s Party fought for the rights of women beyond that of suffrage, which they had secured through the 19th Amendment in 1920. They organized private events, like the tea party pictured here, and public campaigning, such as the introduction of the Equal Rights Amendment to Congress, as they continued the struggle for equality. “Reception tea at the National Womens [i.e., Woman’s] Party to Alice Brady, famous film star and one of the organizers of the party,” April 5, 1923. Library of Congress.

National politics in the 1920s were dominated by the Republican Party, which held not only the presidency but both houses of Congress as well. In a note passed to American reporters, Coolidge announced his decision not to run in the presidential election of 1928. Republicans nominated Herbert Hoover, an orphan from Iowa who had graduated from Stanford, become wealthy as a mining engineer, and won a deserved reputation as a humanitarian for his relief efforts in famine-struck, war-torn Europe. Running against Hoover was Democrat Alfred E. Smith, the four-time governor of New York and the son of Irish immigrants. Smith was a part of the New York machine and favored workers’ protections while also opposing prohibition and immigration restrictions. Hoover focused on economic growth and prosperity. He had served as Secretary of Commerce under Harding and Coolidge and claimed credit for the sustained growth seen during the 1920s; in 1928 he claimed that America had never been closer to eliminating poverty. Much of the election, however, centered on Smith’s religion: he was a Catholic. And not only was he a Catholic, he opposed Protestant America’s greatest political triumph, prohibition. Many Protestant ministers preached against Smith and warned that he would be enthralled to the Pope. Hoover won in a landslide. While Smith won handily in the nation’s largest cities, portending future political trends, he lost most of the rest of the country. Even several solidly Democratic southern states pulled the lever for a Republican for the first time since Reconstruction. ((Allan J. Lichtman, Prejudice and the Old Politics: The Presidential Election of 1928 (Chapel Hill, NC: University of North Carolina Press, 1979).))

 

III. Culture of Consumption

“Change is in the very air Americans breathe, and consumer changes are the very bricks out of which we are building our new kind of civilization,” announced marketing expert and home economist Christine Frederick in her influential 1929 monograph, Selling Mrs. Consumer. The book, which was based on one of the earliest surveys of American buying habits, advised manufacturers and advertisers on how to capture the purchasing power of women, who, according to Frederick, accounted for 90% of household expenditures. Aside from granting advertisers insight into the psychology of the “average” consumer, Frederick’s text captured the tremendous social and economic transformations that had been wrought over the course of her lifetime. ((Christine Frederick, Selling Mrs. Consumer (New York: The Business Bourse, 1929), 29.))

Indeed, the America of Frederick’s birth looked very different from the one she confronted in 1929. The consumer change she studied had resulted from the industrial expansion of the late-nineteenth and early-twentieth centuries. With the discovery of new energy sources and manufacturing technologies, industrial output flooded the market with a range of consumer products, from ready-to-wear clothing to convenience foods to home appliances. By the end of the nineteenth century, output had risen so dramatically that many contemporaries feared supply had outpaced demand and that the nation would soon face the devastating financial consequences of overproduction. American businessmen attempted to avoid this catastrophe by developing new merchandising and marketing strategies that transformed distribution and stimulated a new culture of consumer desire. ((T.J. Jackson Lears, “From Salvation to Self-Realization: Advertising and the Therapeutic Roots of the Consumer Culture, 1880-1930,” in The Culture of Consumption: Critical Essays in American History, 1880-1980, edited by Richard Wightman Fox and T.J. Jackson Lears (New York: Pantheon Books, 1983), 1-38.))

The department store stood at the center of this early consumer revolution. By the 1880s, several large dry goods houses blossomed into modern retail department stores. These emporiums concentrated a broad array of goods under a single roof, allowing customers to purchase shirtwaists and gloves alongside toy trains and washbasins. To attract customers, department stores relied on more than variety. They also employed innovations in service—such as access to restaurants, writing rooms, and babysitting—and spectacle—such as elaborately decorated store windows, fashion shows, and interior merchandise displays. Marshall Field & Co. was among the most successful of these ventures. Located on State Street in Chicago, the company pioneered many of these strategies, including establishing a tearoom that provided refreshment to the well-heeled women shoppers who comprised the store’s clientele. Reflecting on the success of Field’s marketing techniques, Thomas W. Goodspeed, an early trustee of the University of Chicago, wrote, “Perhaps the most notable of Mr. Field’s innovations was that he made a store in which it was a joy to buy.” ((Thomas W. Goodspeed, “Marshall Field,” University of Chicago Magazine, Vol. III (Chicago: University of Chicago Press, 1922), 48.))

The joy of buying infected a growing number of Americans in the early twentieth century as the rise of mail-order catalogs, mass-circulation magazines, and national branding further stoked consumer desire. The automobile industry also fostered the new culture of consumption by promoting the use of credit. By 1927, more than sixty percent of American automobiles were sold on credit, and installment purchasing was made available for nearly every other large consumer purchase. Spurred by access to easy credit, consumer expenditures for household appliances, for example, grew by more than 120 percent between 1919 and 1929. Henry Ford’s assembly line, which advanced production strategies practiced within countless industries, brought automobiles within the reach of middle-income Americans and further drove the spirit of consumerism. By 1925, Ford’s factories were turning out a Model-T every ten seconds. The number of registered cars ballooned from just over nine million in 1920 to nearly twenty-seven million by the decade’s end. Americans owned more cars than Great Britain, Germany, France, and Italy combined. In the late 1920s, eighty percent of the world’s cars drove on American roads.

 

IV. Culture of Escape

As transformative as steam and iron had been in the previous century, gasoline and electricity—embodied most dramatically for many Americans in automobiles, film, and radio—propelled not only consumption but also the famed popular culture of the 1920s. “We wish to escape,” wrote Edgar Rice Burroughs, author of the Tarzan series, “the restrictions of manmade laws, and the inhibitions that society has placed upon us.” Burroughs authored a new Tarzan story nearly every year from 1914 until 1939. “We would each like to be Tarzan,” he said. “At least I would; I admit it.” Like many Americans in the 1920s, Burroughs sought to challenge and escape the constraints of a society that seemed more industrialized with each passing day. ((LeRoy Ashby, With Amusement for All: A History of American Popular Culture Since 1830 (Lexington: University Press of Kentucky, 2006), 177.))

Just like Burroughs, Americans escaped with great speed. Whether through the automobile, Hollywood’s latest films, jazz records produced on Tin Pan Alley, or the hours spent listening to radio broadcasts of Jack Dempsey’s prizefights, the public wrapped itself in popular culture. One observer estimated that Americans belted out the silly musical hit “Yes, We Have No Bananas” more than “The Star Spangled Banner” and all the hymns in all the hymnals combined. ((Ibid., 183.))

As the automobile became more popular and more reliable, more people traveled more frequently and attempted greater distances. Women increasingly drove themselves to their own activities as well as those of their children. Vacationing Americans sped to Florida to escape northern winters. Young men and women fled the supervision of courtship, exchanging the staid parlor couch for sexual exploration in the backseat of a sedan. In order to serve and capture the growing number of drivers, Americans erected gas stations, diners, motels, and billboards along the roadside. Automobiles themselves became objects of entertainment: nearly one hundred thousand people gathered to watch drivers compete for the $50,000 prize of the Indianapolis 500.

The automobile changed American life forever. Rampant consumerism, the desire to travel, and the affordability of cars allowed greater numbers of Americans to purchase automobiles. This was possible only through innovations in automobile design and manufacturing led by Henry Ford in Detroit, Michigan. Ford was a lifelong inventor, creating his very first automobile – the quadricycle – in his home garage. From The Truth About Henry Ford by Sarah T. Bushnell, 1922. Wikimedia, http://commons.wikimedia.org/wiki/File:Mr_and_Mrs_Henry_Ford_in_his_first_car.jpg.

Meanwhile, the United States dominated the global film industry. By 1930, as movie-making became more expensive, a handful of film companies took control of the industry. Immigrants, mostly of Jewish heritage from Central and Eastern Europe, originally “invented Hollywood” because most turn-of-the-century middle and upper class Americans viewed cinema as lower-class entertainment. After their parents emigrated from Poland in 1876, Harry, Albert, Sam, and Jack Warner (who were given the name when an Ellis Island official could not understand their surname) founded Warner Bros. in 1918. Universal, Paramount, Columbia, and MGM were all founded by or led by Jewish executives. Aware of their social status as outsiders, these immigrants (or sons of immigrants) purposefully produced films that portrayed American values of opportunity, democracy, and freedom.

Not content with distributing thirty-minute films in nickelodeons, film moguls produced longer, higher-quality films and showed them in palatial theaters that attracted those who had previously shunned the film industry. But as filmmakers captured the middle and upper classes, they maintained working-class moviegoers by blending traditional and modern values. Cecil B. DeMille’s 1923 epic The Ten Commandments depicted orgiastic revelry, for instance, while still managing to celebrate a biblical story. But what good was a silver screen in a dingy theater? Moguls and entrepreneurs soon constructed picture palaces. Samuel Rothafel’s Roxy Theater in New York held more than six thousand patrons, who could be escorted by a uniformed usher past gardens and statues to their cushioned seats. In order to show The Jazz Singer (1927), the first movie with synchronized words and pictures, the Warners spent half a million dollars to equip two theaters. “Sound is a passing fancy,” one MGM producer told his wife, but Warner Bros.’ assets, which increased from just $5,000,000 in 1925 to $230,000,000 in 1930, tell a different story. ((Ibid., 216.))

Americans fell in love with the movies. Whether it was the surroundings, the sound, or the production budgets, weekly movie attendance skyrocketed from sixteen million in 1912 to forty million in the early 1920s. Hungarian immigrant William Fox, founder of Fox Film Corporation, declared that “the motion picture is a distinctly American institution” because “the rich rub elbows with the poor” in movie theaters. With no seating restrictions, the one-price admission was accessible for nearly all Americans (African Americans, however, were either excluded or segregated). Women represented more than sixty percent of moviegoers, packing theaters to see Mary Pickford, nicknamed “America’s Sweetheart,” who was earning one million dollars a year by 1920 through a combination of film and endorsement contracts. Pickford and other female stars popularized the “flapper,” a woman who favored short skirts, makeup, and cigarettes.

Mary Pickford’s film personas led the glamorous and lavish lifestyle that female movie-goers of the 1920s desired so much. Mary Pickford, 1920. Library of Congress, http://www.loc.gov/pictures/item/2003666664.

As Americans went to the movies more and more, at home they had the radio. Italian scientist Guglielmo Marconi transmitted the first transatlantic wireless (radio) message in 1901, but radios in the home did not become available until around 1920, when they boomed across the country. Around half of American homes contained a radio by 1930. Radio stations brought entertainment directly into the living room through the sale of advertisements and sponsorships, from The Maxwell House Hour to the Lucky Strike Orchestra. Soap companies sponsored daytime dramas so frequently that an entire genre—“soap operas”—was born, providing housewives with audio adventures that stood in stark contrast to common chores. Though radio stations were often under the control of corporations like the National Broadcasting Company (NBC) or the Columbia Broadcasting System (CBS), radio programs were less constrained by traditional boundaries in order to capture as wide an audience as possible, spreading popular culture on a national level.

Radio exposed Americans to a broad array of music. Jazz, a uniquely American musical style popularized by the African-American community in New Orleans, spread primarily through radio stations and records. The New York Times had ridiculed jazz as “savage” because of its racial heritage, but the music represented cultural independence to others. As Harlem-based musician William Dixon put it, “It did seem, to a little boy, that . . . white people really owned everything. But that wasn’t entirely true. They didn’t own the music that I played.” The fast-paced and spontaneity-laced tunes invited the listener to dance along. “When a good orchestra plays a ‘rag,’” dance instructor Vernon Castle recalled, “One has simply got to move.” Jazz became a national sensation, played and heard by whites and blacks both. Jewish Lithuanian-born singer Al Jolson—whose biography inspired The Jazz Singer and who played the film’s titular character—became the most popular singer in America. ((Ibid., 210.))

The 1920s also witnessed the maturation of professional sports. Play-by-play radio broadcasts of major collegiate and professional sporting events marked a new era for sports, despite the institutionalization of racial segregation in most of them. Suddenly, Jack Dempsey’s left crosses and right uppercuts could almost be felt in homes across the United States. Dempsey, who held the heavyweight championship for most of the decade, drew million-dollar gates and inaugurated “Dempseymania” in newspapers across the country. Red Grange, who carried the football with a similar recklessness, helped to popularize professional football, which was then in the shadow of the college game. Grange left the University of Illinois before graduating to join the Chicago Bears in 1925. “There had never been such evidence of public interest since our professional league began,” recalled Bears owner George Halas of Grange’s arrival. ((Ibid., 181.))

Perhaps no sports figure left a bigger mark than did Babe Ruth. Born George Herman Ruth, the “Sultan of Swat” grew up in an orphanage in Baltimore’s slums. Ruth’s emergence onto the national scene was much needed, as the baseball world had been rocked by the so-called Black Sox scandal, in which eight players allegedly agreed to throw the 1919 World Series. Ruth hit fifty-four home runs in 1920, more than all but one entire team hit that season. Baseball writers called Ruth a superman, and more Americans could recognize Ruth than could recognize then-president Warren G. Harding.

After an era of destruction and doubt brought about by the First World War, Americans craved heroes who seemed to defy convention and break boundaries. Dempsey, Grange, and Ruth dominated their respective sports, but only Charles Lindbergh conquered the sky. On May 21, 1927, Lindbergh concluded the first-ever nonstop solo flight from New York to Paris. Armed with only a few sandwiches, some bottles of water, paper maps, and a flashlight, Lindbergh successfully navigated the Atlantic Ocean in thirty-three hours. Some historians have dubbed Lindbergh the “hero of the decade,” not only for his transatlantic journey, but because he helped to restore the faith of many Americans in individual effort and technological advancement. To a world devastated in war by machine guns, submarines, and chemical weapons, Lindbergh’s flight demonstrated that technology could still inspire and accomplish great things. Outlook Magazine called Lindbergh “the heir of all that we like to think is best in America.” ((John W. Ward, “The Meaning of Lindbergh’s Flight,” in Studies in American Culture: Dominant Ideas and Images, edited by Joseph J. Kwiat and Mary C. Turpie (Minneapolis: University of Minnesota Press, 1960), 33.))

The decade’s popular culture seemed to revolve around escape. Coney Island in New York offered new amusements for young and old. Americans drove their sedans to massive theaters to enjoy major motion pictures. Radio towers broadcast the bold new sound of jazz, the adventure of soap operas, and the feats of amazing athletes. Dempsey and Grange seemed bigger, stronger, and faster than any who dared to challenge them. Babe Ruth smashed home runs out of ballparks across the country. And Lindbergh escaped earth’s gravity and crossed an entire ocean. Neither Dempsey nor Ruth nor Lindbergh made Americans forget the horrors of the First World War and the chaos that followed, but they made it seem as if the future would be that much brighter.

Babe Ruth’s incredible talent attracted widespread attention to the sport of baseball, helping it become America’s favorite pastime. Ruth’s propensity to shatter records with the swing of his bat made him a national hero during a period when defying conventions was the popular thing to do. “[Babe Ruth, full-length portrait, standing, facing slightly left, in baseball uniform, holding baseball bat],” c. 1920. Library of Congress, http://www.loc.gov/pictures/item/92507380/.

V. “The New Woman”

This “new breed” of women – known as the flapper – went against the gender proscriptions of the era, bobbing their hair, wearing short dresses, listening to jazz, and flouting social and sexual norms. While liberating in many ways, these behaviors also reinforced stereotypes of female carelessness and obsessive consumerism that would continue throughout the twentieth century. Bain News Service, “Louise Brooks,” undated. Library of Congress, http://www.loc.gov/pictures/item/ggb2006007866/.

The rising emphasis on spending and accumulation nurtured a national ethos of materialism and individual pleasure. These impulses were embodied in the figure of the flapper, whose bobbed hair, short skirts, makeup, cigarettes, and carefree spirit captured the attention of American novelists such as F. Scott Fitzgerald and Sinclair Lewis. Rejecting the old Victorian values of desexualized modesty and self-restraint, young “flappers” seized opportunities for the public coed pleasures offered by new commercial leisure institutions, such as dance halls, cabarets, and nickelodeons, not to mention the illicit blind tigers and speakeasies spawned by Prohibition. In so doing, young American women helped to usher in a new morality that permitted women greater independence, freedom of movement, and access to the delights of urban living. In the words of psychologist G. Stanley Hall, “She was out to see the world and, incidentally, be seen of it.”

Such sentiments were repeated in an oft-cited advertisement in a 1930 edition of the Chicago Tribune: “Today’s woman gets what she wants. The vote. Slim sheaths of silk to replace voluminous petticoats. Glassware in sapphire blue or glowing amber. The right to a career. Soap to match her bathroom’s color scheme.” As with so much else in the 1920s, however, sex and gender were in many ways a study in contradictions. It was the decade of the “New Woman,” and one in which only 10% of married women worked outside the home. ((See Lynn Dumenil, The Modern Temper: American Culture and Society in the 1920s (New York: Hill and Wang, 1995), 113.)) It was a decade in which new technologies decreased the time required for household chores, and one in which standards of cleanliness and order in the home rose to often impossible heights. It was a decade in which women would, finally, have the opportunity to fully exercise their right to vote, and one in which the often thinly-bound women’s coalitions that had won that victory splintered into various causes. Finally, it was a decade in which images such as the “flapper” gave women new modes of representing femininity, and one in which such representations were often inaccessible to women of certain races, ages, and socio-economic classes.

Women undoubtedly gained much in the 1920s. There was a profound and keenly felt cultural shift which, for many women, meant increased opportunity to work outside the home. The number of professional women, for example, significantly rose in the decade. But limits still existed, even for professional women. Occupations such as law and medicine remained overwhelmingly “male”: the majority of women professionals were in “feminized” professions such as teaching and nursing. And even within these fields, it was difficult for women to rise to leadership positions.

Further, it is crucial not to over-generalize the experience of all women based on the experiences of a much-commented upon subset of the population. A woman’s race, class, ethnicity, and marital status all had an impact on both the likelihood that she worked outside the home, as well as the types of opportunities that were available to her. While there were exceptions, for many minority women, work outside the home was not a cultural statement but rather a financial necessity (or both), and physically demanding, low-paying domestic service work continued to be the most common job type. Young, working class white women were joining the workforce more frequently, too, but often in order to help support their struggling mothers and fathers.

For young, middle-class, white women—those most likely to fit the image of the carefree flapper—the most common workplace was the office. These predominantly single women increasingly became clerks, jobs that had been primarily “male” earlier in the century. But here, too, there was a clear ceiling. While entry-level clerk jobs became increasingly feminized, jobs at a higher, more lucrative level remained dominated by men. Further, rather than changing the culture of the workplace, the entrance of women into the lower-level jobs primarily changed the coding of the jobs themselves. Such positions simply became “women’s work.”

The frivolity, decadence, and obliviousness of the 1920s was embodied in the image of the flapper, the stereotyped carefree and indulgent woman of the Roaring Twenties depicted by Russell Patterson’s drawing. Russell Patterson, artist, “Where there's smoke there's fire,” c. 1920s. Library of Congress, http://www.loc.gov/pictures/item/2009616115/.

Finally, as these same women grew older and married, social changes became even subtler. Married women were, for the most part, expected to remain in the domestic sphere. And while new patterns of consumption gave them more power and, arguably, more autonomy, new household technologies and philosophies of marriage and child-rearing increased expectations, further tying these women to the home—a paradox that becomes clear in advertisements such as the one in the Chicago Tribune. Of course, the number of women in the workplace cannot exclusively measure changes in sex and gender norms. Attitudes towards sex, for example, continued to change in the 1920s, as well, a process that had begun decades before. This, too, had significantly different impacts on different social groups. But for many women—particularly young, college-educated white women—an attempt to rebel against what they saw as a repressive “Victorian” notion of sexuality led to an increase in premarital sexual activity strong enough that it became, in the words of one historian, “almost a matter of conformity.” ((Nancy Cott, The Grounding of Modern Feminism (New Haven: Yale University Press, 1987), 150.))

In the homosexual community, meanwhile, a vibrant gay culture grew, especially in urban centers such as New York. While gay males had to contend with increased policing of the gay lifestyle (especially later in the decade), in general they lived more openly in New York in the 1920s than they would be able to for many decades following World War II. ((George Chauncey, Gay New York: Gender, Urban Culture, and the Makings of the Gay Male World, 1890-1940 (New York: Basic Books, 1994).)) At the same time, for many lesbians in the decade, the increased sexualization of women brought new scrutiny to same-sex female relationships previously dismissed as harmless. ((Cott, Grounding, 160.))

Ultimately, the most enduring symbol of the changing notions of gender in the 1920s remains the flapper. And indeed, that image was a “new” available representation of womanhood in the 1920s. But it is just that: a representation of womanhood of the 1920s. There were many women in the decade of differing races, classes, ethnicities, and experiences, just as there were many men with different experiences. For some women, the 1920s were a time of reorganization, new representations and new opportunities. For others, it was a decade of confusion, contradiction, new pressures and struggles new and old.

 

VI. “The New Negro”

Just as cultural limits loosened across the nation, the 1920s represented a period of serious self-reflection among African Americans, most especially those in northern ghettos. New York City was a popular destination for American blacks during the Great Migration. The city’s black population grew 257%, from 91,709 in 1910 to 327,706 by 1930 (the white population grew only 20%). ((Mark R. Schneider, “We Return Fighting”: The Civil Rights Movement in the Jazz Age (Boston: Northeastern University Press, 2002), 21.)) Moreover, by 1930, some 98,620 foreign-born blacks had migrated to the U.S. Nearly half made their home in Manhattan’s Harlem district. ((Philip Kasinitz, Caribbean New York: Black Immigrants and the Politics of Race (Ithaca: Cornell University Press, 1992), 25.))

Harlem originally lay between Fifth Avenue to Eighth Avenue and 130th Street to 145th Street. By 1930, the district had expanded to 155th Street and was home to 164,000 people, mostly African Americans. Continuous relocation to “the greatest Negro City in the world” exacerbated problems with crime, health, housing, and unemployment. ((Alain Locke, ed., The New Negro: Voices of the Harlem Renaissance (New York: Simon & Schuster, 1997; originally published 1925), 301.)) Nevertheless, it importantly brought together a mass of black people energized by the population’s World War I military service, the urban environment, and for many, ideas of Pan-Africanism or Garveyism. Out of the area’s cultural ferment emerged the Harlem Renaissance, or what was then termed the “New Negro Movement.” While this stirring in self-consciousness and racial pride was not confined to Harlem, the district was truly, as James Weldon Johnson described it, “The Culture Capital.” ((Ibid.)) The Harlem Renaissance became a key component in African Americans’ long history of cultural and intellectual achievements.

Alain Locke did not coin the term “New Negro,” but he did much to popularize it. In his 1925 anthology The New Negro, Locke proclaimed that the generation of subservience was no more—“we are achieving something like a spiritual emancipation.” Bringing together writings by men and women, young and old, black and white, Locke produced an anthology that was of African Americans, rather than only about them. The book joined many others. Popular Harlem Renaissance writers published some twenty-six novels, ten volumes of poetry, and countless short stories between 1922 and 1935. ((Joan Marter, editor, The Grove Encyclopedia of American Art, Volume 1 (Oxford: Oxford University Press, 2011), 448.)) Alongside the well-known Langston Hughes and Claude McKay, women writers like Jessie Redmon Fauset and Zora Neale Hurston published nearly one-third of these novels. While themes varied, the literature frequently explored and countered pervading stereotypes and forms of American racial prejudice.

The Harlem Renaissance was also manifested in theatre, art, and music. For the first time, Broadway presented black actors in serious roles. The 1924 production Dixie to Broadway was the first all-black show with mainstream showings. ((James F. Wilson, Bulldaggers, Pansies, and Chocolate Babies: Performance, Race, and Sexuality in the Harlem Renaissance (Ann Arbor: University of Michigan Press, 2010), 116.)) In art, Meta Vaux Warrick Fuller, Aaron Douglas, and Palmer Hayden showcased black cultural heritage as well as captured the population’s current experience. In music, jazz rocketed in popularity. Eager to hear “real jazz,” whites journeyed to Harlem’s Cotton Club and Smalls. Next to Greenwich Village, Harlem’s nightclubs and speakeasies (illicit venues where alcohol was sold and consumed) presented a place where sexual freedom and gay life thrived. Unfortunately, while headliners like Duke Ellington were hired to entertain at Harlem’s venues, the surrounding black community was usually excluded. Furthermore, black performers were often restricted from restroom use and relegated to service door entry. As the Renaissance faded to a close, several Harlem Renaissance artists went on to produce important works indicating that this movement was but one component in African Americans’ long history of cultural and intellectual achievements. ((Cary D. Wintz and Paul Finkelman, Encyclopedia of the Harlem Renaissance, Volume 2 (New York: Routledge, 2004), 910-11.))

Marcus Garvey inspired black American activists disappointed with the lack of progress since emancipation to create a world-wide community to fight injustice. One of the many forms of social activism in the 1920s, Garveyism was seen by some as too radical to engender any real change. Yet Garveyism formed a substantial following, and was a major stimulus for later black nationalistic movements like the Black Panthers. Photograph of Marcus Garvey, August 5, 1924. Library of Congress, http://www.loc.gov/pictures/item/2003653533/.

The explosion of African American self-expression found multiple outlets in politics. In the 1910s and 1920s, perhaps no one so attracted disaffected black activists as Marcus Garvey. Garvey was a Jamaican publisher and labor organizer who arrived in New York City in 1916. Within just a few years of his arrival, he built the largest black nationalist organization in the world, the Universal Negro Improvement Association (UNIA). ((For Garvey, see Colin Grant, Negro with a Hat: The Rise and Fall of Marcus Garvey (New York: Oxford University Press, 2008); Judith Stein, The World of Marcus Garvey: Race and Class in Modern Society (Baton Rouge: Louisiana State University Press, 1986); Ula Yvette Taylor, The Veiled Garvey: The Life and Times of Amy Jacques Garvey (Chapel Hill: University of North Carolina Press, 2002).)) Inspired by Pan-Africanism and Booker T. Washington’s model of industrial education, and critical of what he saw as W. E. B. Du Bois’s strategies in service of black elites, Garvey sought to promote racial pride, encourage black economic independence, and root out racial oppression in Africa and the Diaspora. ((Winston James, Holding Aloft the Banner of Ethiopia: Caribbean Radicalism in Early Twentieth-Century America (London: Verso, 1998).))

Headquartered in Harlem, the UNIA published a newspaper, Negro World, and organized elaborate parades in which members, “Garveyites,” dressed in ornate, militaristic regalia and marched down city streets. The organization criticized the slow pace of the judicial focus of the National Association for the Advancement of Colored People (NAACP), as well as its acceptance of memberships and funds from whites. “For the Negro to depend on the ballot and his industrial progress alone,” Garvey opined, “will be hopeless as it does not help him when he is lynched, burned, jim-crowed, and segregated.” In 1919, the UNIA announced plans to develop a shipping company called the Black Star Line as part of a plan that pushed for blacks to reject the political system and to “return to Africa” instead. Most of the investments came in the form of shares purchased by UNIA members, many of whom heard Garvey give rousing speeches across the country about the importance of establishing commercial ventures between African Americans, Afro-Caribbeans, and Africans. ((Grant; Stein; Taylor.))

Garvey’s detractors disparaged these public displays and poorly managed business ventures, and they criticized Garvey for peddling empty gestures in place of measures that addressed the material concerns of African Americans. NAACP leaders depicted Garvey’s plan as one that simply said, “Give up! Surrender! The struggle is useless.” Inflamed by his aggressive attacks on other black activists and his radical ideas of racial independence, many African American and Afro-Caribbean leaders worked with government officials and launched the “Garvey Must Go” campaign, which culminated in his 1922 indictment and 1925 imprisonment and subsequent deportation for “using the mails for fraudulent purposes.” The UNIA never recovered its popularity or financial support, even after Garvey’s pardon in 1927, but his movement made a lasting impact on black consciousness in the United States and abroad. He inspired the likes of Malcolm X, whose parents were Garveyites, and Kwame Nkrumah, the first president of Ghana. Garvey’s message, perhaps best captured by his rallying cry, “Up, you mighty race,” resonated with African Americans who found in Garveyism a dignity not granted them in their everyday lives. In that sense, it was all too typical of the Harlem Renaissance. ((Ibid.))

 

VII. Culture War

For all of its cultural ferment, however, the 1920s were also a difficult time for radicals, immigrants, and anything “modern.” Fear of foreign radicals led to the executions of Nicola Sacco and Bartolomeo Vanzetti, two Italian anarchists, in 1927. In May 1920, the two had been arrested for a robbery and murder at a Massachusetts factory. Their guilty verdicts were appealed for years, as the evidence surrounding their convictions was slim. One eyewitness claimed that Vanzetti drove the getaway car, for instance, but other accounts described a different person altogether. Nevertheless, despite worldwide lobbying by radicals and a respectable movement among middle-class Italian organizations in the United States, the two men were executed on August 23, 1927. Vanzetti perhaps provided the most succinct explanation for his death, saying, “This is what I say . . . . I am suffering because I am a radical and indeed I am a radical; I have suffered because I was an Italian, and indeed I am an Italian.” ((Nicola Sacco and Bartolomeo Vanzetti, The Letters of Sacco and Vanzetti (New York: Octagon Books, 1971 [1928]), 272.))

Many Americans expressed anxieties about the changes that had remade the United States and, seeking scapegoats, many middle-class white Americans pointed to Eastern European and Latin American immigrants (Asian immigration had already been almost completely prohibited) or to African Americans, who pushed harder for civil rights after migrating out of the American South to northern cities as part of the Great Migration, the mass exodus that carried nearly half a million black southerners out of the South between 1910 and 1920. Protestants, meanwhile, continued to denounce the Roman Catholic Church and charged that American Catholics gave their allegiance to the Pope and not to their country.

In 1921, Congress passed the Emergency Immigration Act as a stopgap measure, and then, three years later, permanently established country-of-origin quotas through the National Origins Act. The number of immigrants annually admitted to the United States from each nation was restricted to two percent of the number of people from that country who had resided in the United States in 1890. (By reaching back three decades, past the recent waves of “new” immigrants from Southern and Eastern Europe, Latin America, and Asia, the law made it extremely difficult for immigrants from outside Northern Europe to legally enter the United States.) The act also explicitly excluded all Asians, though, to satisfy southern and western growers, it temporarily omitted restrictions on Mexican immigrants. The Sacco and Vanzetti trial and sweeping immigration restrictions pointed to a rampant nativism. A great number of Americans worried that the nation no longer resembled the country of times past. Many wrote of an America riven by a culture war.

 

VIII. Fundamentalist Christianity

In addition to alarms over immigration and the growing presence of Catholicism and Judaism, a new core of Christian fundamentalists was deeply concerned about relaxed sexual mores and increased social freedoms, especially as found in city centers. Although never a centralized group, most fundamentalists lashed out against what they saw as a sagging public morality, a world in which Protestantism seemed challenged by Catholicism, women exercised ever greater sexual freedoms, public amusements encouraged selfish and empty pleasures, and critics mocked prohibition through bootlegging and speakeasies.

Christian Fundamentalism arose most directly from a doctrinal dispute among Protestant leaders. Liberal theologians sought to intertwine religion with science and secular culture. These “Modernists,” influenced by the Biblical scholarship of nineteenth-century German academics, argued that Christian doctrines about the miraculous might be best understood metaphorically. The church, they said, needed to adapt itself to the world. According to the Baptist pastor Harry Emerson Fosdick, the “coming of Christ” might occur “slowly…but surely, [as] His will and principles [are] worked out by God’s grace in human life and institutions.” ((Harry Emerson Fosdick, “Shall the Fundamentalists Win?” Christian Work 102 (June 10, 1922): 716–722.)) The social gospel, which encouraged Christians to build the Kingdom of God on earth by working against social and economic inequality, was very much tied to liberal theology.

During the 1910s, funding from oil barons Lyman and Milton Stewart enabled the evangelist A. C. Dixon to commission some ninety essays to combat religious liberalism. The collection, known as The Fundamentals, became the foundational documents of Christian fundamentalism, from which the movement’s name is drawn. Contributors agreed that Christian faith rested upon literal truths, that Jesus, for instance, would physically return to earth at the end of time to redeem the righteous and damn the wicked. Some of the essays put forth that human endeavor would not build the Kingdom of God, while others covered such subjects as the virgin birth and biblical inerrancy. American Fundamentalists spanned Protestant denominations and borrowed from diverse philosophies and theologies, most notably the holiness movement, the larger revivalism of the nineteenth century, and the new dispensationalist theology (in which history proceeded, and would end, through “dispensations” by God). They did, however, all agree that modernism was the enemy and the Bible was the inerrant word of God. It was a fluid movement often without clear boundaries, but it featured many prominent clergymen, including the well-established and extremely vocal John Roach Straton (New York), J. Frank Norris (Texas), and William Bell Riley (Minnesota). ((George Marsden, Fundamentalism and American Culture (New York: Oxford University Press, 1980).))

In July 1925, in a tiny courtroom in Dayton, Tennessee, fundamentalists and their critics gathered to tackle the issues of creation and evolution. A young biology teacher, John T. Scopes, was being tried for teaching his students evolutionary theory in violation of the Butler Act, a state law that barred evolutionary theory, or any theory denying “the Divine Creation of man as taught in the Bible,” from publicly funded Tennessee classrooms. Seeing the act as a threat to personal liberty, the American Civil Liberties Union (ACLU) immediately sought a volunteer for a “test” case, hoping that a conviction and subsequent appeals would lead to a day in the Supreme Court, testing the constitutionality of the law. It was then that Scopes, a part-time teacher and coach, stepped up and voluntarily admitted to teaching evolution (Scopes’ violation of the law was never in question). Thus the stage was set for the pivotal courtroom showdown—“the trial of the century”—between the champions and opponents of evolution, a key moment in an enduring American “culture war.” ((Edward J. Larson, Summer for the Gods: The Scopes Trial and America’s Continuing Debate over Science and Religion (Cambridge: Harvard University Press, 1997).))

The case became a public spectacle. Clarence Darrow, an agnostic attorney and a keen liberal mind from Chicago, volunteered to aid the defense and came up against William Jennings Bryan. Bryan, the “Great Commoner,” was the three-time presidential candidate who in his younger days had led the political crusade against corporate greed. He had done so then with a firm belief in the righteousness of his cause, and now he defended biblical literalism in similar terms. The theory of evolution, Bryan said, with its emphasis on the survival of the fittest, “would eliminate love and carry man back to a struggle of tooth and claw.” ((Leslie H. Allen, editor, Bryan and Darrow at Dayton: The Record and Documents of the “Bible-Evolution” Trial (New York: Arthur Lee, 1925).))

The Scopes trial signified a pivotal moment when science and religion became diametrically opposed. The Scopes defense team, three of whom are seen in this photograph, argued against the literal interpretation of the Bible that remains popular in many fundamentalist Christian circles today. “Dudley Field Malone, Dr. John R. Neal, and Clarence Darrow in Chicago, Illinois.” The Clarence Darrow Digital Collection, http://darrow.law.umn.edu/photo.php?pid=874.


Newspapermen and spectators flooded the small town of Dayton. Across the nation, Americans tuned their radios to the national broadcasts of a trial that dealt with questions of religious liberty, academic freedom, parental rights, and the moral responsibility of education. For six days in July, the men and women of America were captivated as Bryan presented his argument on the morally corrupting influence of evolutionary theory (and pointed out that Darrow made a similar argument about the corruptive potential of education during his defense of the famed killers Nathan Leopold and Richard Loeb a year before). Darrow eloquently fought for academic freedom. ((Larson.))

At the request of the defense, Bryan took the stand as an “expert witness” on the Bible. At his age, he was no match for Darrow’s famous skills as a trial lawyer and his answers came across as blundering and incoherent, particularly as he was not in fact a literal believer in all of the Genesis account (believing—as many anti-evolutionists did—that the meaning of the word “day” in the book of Genesis could be taken as allegory) and only hesitantly admitted as much, not wishing to alienate his fundamentalist followers. Additionally, Darrow posed a series of unanswerable questions: Was the “great fish” that swallowed the prophet Jonah created for that specific purpose? What precisely happened astronomically when God made the sun stand still? Bryan, of course, could cite only his faith in miracles. Tied into logical contradictions, Bryan’s testimony was a public relations disaster, although his statements were expunged from the record the next day and no further experts were allowed—Scopes’ guilt being established, the jury delivered a guilty verdict in minutes. The case was later thrown out on a technicality. But few cared about the verdict. Darrow had, in many ways, at least to his defenders, already won: the fundamentalists seemed to have taken a beating in the national limelight. Journalist and satirist H. L. Mencken characterized the “circus in Tennessee” as an embarrassment for fundamentalism, and modernists remembered the “Monkey Trial” as a smashing victory. If fundamentalists retreated from the public sphere, they did not disappear entirely. Instead, they went local, built a vibrant subculture, and emerged many decades later stronger than ever. ((Ibid.))

 

IX. Rebirth of the Ku Klux Klan (KKK)

Suspicion of immigrants, Catholics, and modernists fueled a string of reactionary organizations. None so captured the imagination of the country as the reborn Ku Klux Klan (KKK), a white supremacist organization that expanded beyond its Reconstruction Era anti-black politics to claim to protect American values and the American way of life from blacks, feminists (and other radicals), immigrants, Catholics, Jews, atheists, bootleggers, and a host of other imagined moral enemies.

Two events in 1915 are widely credited with inspiring the rebirth of the Klan: the lynching of Leo Frank and the release of The Birth of a Nation, a popular and groundbreaking film that valorized the Reconstruction Era Klan as a protector of feminine virtue and white racial purity. Taking advantage of this sudden surge of popularity, Colonel William Joseph Simmons organized what is often called the “second” Ku Klux Klan in Georgia in late 1915. This new Klan, modeled after other fraternal organizations with elaborate rituals and a hierarchy, remained largely confined to Georgia and Alabama until 1920, when Simmons began a professional recruiting effort that resulted in individual chapters being formed across the country and membership rising to an estimated five million. ((Nancy MacLean, Behind the Mask of Chivalry: The Making of the Second Ku Klux Klan (New York: Oxford University Press, 1994).))

Partly in response to the migration of Southern blacks to Northern cities during World War I, the KKK expanded north of the Mason-Dixon Line. Membership soared in Philadelphia, Detroit, Chicago, and Portland, while Klan-endorsed mayoral candidates won in Indianapolis, Denver, and Atlanta. ((Kenneth T. Jackson, The Ku Klux Klan in the City, 1915–1930 (New York: Oxford University Press, 1967).)) The Klan often recruited through fraternal organizations such as the Freemasons and through various Protestant churches. In many areas, local Klansmen would visit churches of which they approved and bestow a gift of money upon the presiding minister, often during services. The Klan also enticed people to join through large picnics, parades, rallies, and ceremonies. The Klan established a women’s auxiliary in 1923 headquartered in Little Rock, Arkansas. The Women of the Ku Klux Klan mirrored the KKK in practice and ideology and soon had chapters in all forty-eight states, often attracting women who were already part of the prohibition movement, the defense of which was a centerpiece of Klan activism. ((MacLean.))

Contrary to its perception as a primarily Southern and lower-class phenomenon, the second Klan had a national reach composed largely of middle-class people. Sociologist Rory McVeigh surveyed the KKK newspaper Imperial Night-Hawk for the years 1923 and 1924, at the organization’s peak, and found the largest number of Klan-related activities to have occurred in Texas, Pennsylvania, Indiana, Illinois, and Georgia. The Klan was even present in Canada, where it was a powerful force within Saskatchewan’s Conservative Party. In many states and localities, the Klan dominated politics to such a level that one could not be elected without the support of the KKK. For example, in 1924, the Klan supported William Lee Cazort for governor of Arkansas, leading his opponent in the Democratic Party primary, Thomas Terral, to seek honorary membership through a Louisiana klavern so as not to be tagged as the anti-Klan candidate. In 1922, Texans elected Earle B. Mayfield, an avowed Klansman who ran openly as that year’s “klandidate,” to the United States Senate. At its peak the Klan claimed between four and five million members. ((George Brown Tindall, The Emergence of the New South: 1913-1945 (Baton Rouge: Louisiana State University Press, 1967).))

Despite the breadth of its political activism, the Klan is today remembered largely as a violent vigilante group—and not without reason. Members of the Klan and affiliated organizations often carried out acts of lynching and “nightriding”—the physical harassment of bootleggers, union activists, civil rights workers, or any others deemed “immoral” (such as suspected adulterers) under the cover of darkness or while wearing their hoods and robes. In fact, Klan violence was extensive enough in Oklahoma that Governor John C. Walton placed the entire state under martial law in 1923. Witnesses testifying before the military court disclosed accounts of Klan violence ranging from the flogging of clandestine brewers to the disfiguring of a prominent black Tulsan for registering African Americans to vote. In Houston, Texas, the Klan maintained an extensive system of surveillance that included tapping telephone lines and putting spies into the local post office in order to root out “undesirables.” A mob organized and led by Klan members in Aiken, South Carolina, lynched Bertha Lowman and her two brothers in 1926, but no one was ever prosecuted: the sheriff, deputies, city attorney, and state representative all belonged to the Klan. ((MacLean; Wyn Craig Wade, The Fiery Cross: The Ku Klux Klan in America (New York: Oxford University Press, 1998).))

The Klan dwindled in the face of scandal and diminished energy over the last years of the 1920s. By 1930, the Klan had only about 30,000 members and was largely spent as a national force; it would reappear, much diminished, during the civil rights movement of the 1950s and 1960s.

 

X. Conclusion

In his inaugural address in 1929, Herbert Hoover told Americans that the Republican Party had brought prosperity. Even setting aside stubbornly high rates of poverty and unparalleled levels of inequality, he could not see the weaknesses lurking behind the decade’s economy. Even as the new culture of consumption promoted new freedoms, it also promoted new insecurities. An economy built on credit exposed the nation to tremendous risk. Flailing European economies, high tariffs, wealth inequality, a construction bubble, and an ever-more crowded consumer market loomed dangerously until the Roaring Twenties ground to a halt. In a moment, the nation’s glitz and glamour seemed to give way to decay and despair. For farmers, racial minorities, unionized workers, and other populations that did not share in 1920s prosperity, the veneer of the Jazz Age and a booming economy had always been a fiction. But for them, as for millions of Americans, the end of an era was close. The Great Depression loomed.

 

Contributors

This chapter was edited by Brandy Thomas Wells, with content contributions by Micah Childress, Mari Crabtree, Maggie Flamingo, Guy Lancaster, Emily Remus, Colin Reynolds, Kristopher Shields, and Brandy Thomas Wells.

 

Recommended Reading

  1. Chauncey, George. Gay New York: Gender, Urban Culture, and the Making of the Gay Male World, 1890–1940. New York: Basic Books, 1995.
  2. Cohen, Lizabeth. Making a New Deal: Industrial Workers in Chicago, 1919–1939. New York: Cambridge University Press, 1990.
  3. Douglas, Ann. Terrible Honesty: Mongrel Manhattan in the 1920s. New York: Farrar Straus and Giroux, 1995.
  4. Fox, Richard Wightman, and T. J. Jackson Lears, eds. The Culture of Consumption: Critical Essays in American History, 1880–1980. New York: Pantheon, 1983.
  5. Grant, Colin. Negro with a Hat: The Rise and Fall of Marcus Garvey. New York: Oxford University Press, 2008.
  6. Huggins, Nathan. Harlem Renaissance. New York: Oxford University Press, 1971.
  7. Larson, Edward. Summer for the Gods: The Scopes Trial and America’s Continuing Debate over Science and Religion. Cambridge: Harvard University Press, 1997.
  8. MacLean, Nancy. Behind the Mask of Chivalry: The Making of the Second Ku Klux Klan. New York: Oxford University Press, 1994.
  9. Marsden, George M. Fundamentalism and American Culture: The Shaping of Twentieth-Century Evangelicalism, 1870–1925. New York: Oxford University Press, 1980.
  10. Montgomery, David. The Fall of the House of Labor: The Workplace, the State, and American Labor Activism, 1865-1925. Cambridge University Press, 1988.
  11. Ngai, Mae M. Impossible Subjects: Illegal Aliens and the Making of Modern America. Princeton: Princeton University Press, 2004.
  12. Okrent, Daniel. Last Call: The Rise and Fall of Prohibition. New York: Scribner, 2010.
  13. Sanchez, George. Becoming Mexican American: Ethnicity, Culture, and Identity in Chicano Los Angeles, 1900-1945. New York: Oxford University Press, 1993.
  14. Tindall, George Brown. The Emergence of the New South, 1913-1945. Baton Rouge: Louisiana State University Press, 1967.
  15. Wilkerson, Isabel. The Warmth of Other Suns: The Epic Story of America’s Great Migration. New York: Vintage, 2010.

 

Notes