
30. The Recent Past

New York City, before September 11, 2001, via Library of Congress.



I. Introduction

Time marches forever on. The present becomes the past and the past becomes history. But, as William Faulkner wrote, “The past is never dead. It’s not even past.” The last several decades of American history have culminated in the present, an era of innovation and advancement but also of stark partisan division, sluggish economic growth, widening inequalities, widespread military interventions, and pervasive anxieties about the present and future of the United States. Through boom and bust, national tragedy, foreign wars, and the maturation of a new generation, a new chapter of American history awaits.

 

II. American Politics from George H.W. Bush to September 11, 2001

The conservative “Reagan Revolution” lingered over an open field of candidates from both parties as voters approached the presidential election of 1988. At stake was the legacy of a newly empowered conservative movement, a movement that would move forward with Reagan’s vice president, George H. W. Bush, who triumphed over Massachusetts Governor Michael Dukakis with a promise to continue the conservative work that had commenced in the 1980s.

George H. W. Bush was one of the most experienced men ever to rise to the presidency. His father, Prescott Bush, had been a United States Senator from Connecticut. George H. W. Bush himself had been elected to the House of Representatives from his district in Texas and had served as chair of the Republican National Committee and Director of the Central Intelligence Agency. He was elected vice president in 1980 and president eight years later. His election signaled Americans’ continued embrace of Reagan’s conservative program.

The dissolution of the Soviet Union left the United States as the world’s only remaining superpower. Global capitalism seemed triumphant. The 1990s brought the development of new markets in Southeast Asia and Eastern Europe. Observers wondered if some final stage of history had been reached, if the old battles had ended and a new global consensus, built on peace and open markets, would reign forever.

The post-Cold War world was not without international conflicts. Congress granted President Bush approval to intervene in Kuwait in Operation Desert Shield and Operation Desert Storm, commonly referred to as the first Gulf War. With the memories of Vietnam still fresh, many Americans were hesitant to support military action that could expand into a protracted war or long-term commitment of troops. But the war was a swift victory for the United States. President Bush and his advisers opted not to pursue the war into Baghdad and risk an occupation and insurgency. And so the war was won. Many wondered if the “ghosts of Vietnam” had been exorcised. Bush won enormous popularity. Gallup polls showed a job approval rating as high as 89% in the weeks after the end of the war.

The Iraqi military set fire to Kuwait’s oil fields during the Gulf War, many of which burned for months and caused massive pollution. Photograph of oil well fires outside Kuwait City, March 21, 1991. Wikimedia, http://commons.wikimedia.org/wiki/File:Operation_Desert_Storm_22.jpg.


President Bush’s popularity seemed to suggest an easy reelection in 1992. Bush faced a primary challenge from political commentator Patrick Buchanan, a former Reagan and Nixon White House adviser, who cast Bush as a moderate, an unworthy steward of the conservative movement who was unwilling to fight for conservative Americans in the nation’s ongoing “culture war.” Buchanan did not defeat Bush in the Republican primaries, but he inflicted enough damage to weaken his candidacy.

The Democratic Party nominated a relative unknown, Arkansas Governor Bill Clinton. Dogged by charges of marital infidelity and draft-dodging during the Vietnam War, Clinton was a consummate politician and had both enormous charisma and a skilled political team. He framed himself as a “New Democrat,” a centrist open to free trade, tax cuts, and welfare reform. Twenty-two years younger than Bush, and the first Baby Boomer to make a serious run at the presidency, Clinton presented the campaign as a generational choice. During the campaign he appeared on MTV. He played the saxophone on the Arsenio Hall Show. And he told voters that he could offer the United States a new way forward.

Bush ran on his experience and against Clinton’s moral failings. The GOP convention in Houston that summer featured speeches from Pat Buchanan and religious leader Pat Robertson decrying the moral decay plaguing American life. Clinton was denounced as a social liberal who would weaken the American family with his policies and his moral character. But Clinton was able to convince voters that his moderated Southern brand of liberalism would be more effective than the moderate conservatism of George Bush. Bush’s candidacy, of course, was most crippled by a sudden economic recession. “It’s the economy, stupid,” Clinton’s political team reminded the country.

Clinton would win the election, but the Reagan Revolution still reigned. Clinton and his running mate, Tennessee Senator Albert Gore, Jr., both moderate southerners, promised a path away from the old liberalism of the 1970s and 1980s. They were Democrats, but ran conservatively.

In his first term Clinton set out an ambitious agenda that included an economic stimulus package, universal health insurance, a continuation of the Middle East peace talks initiated by Bush’s Secretary of State James Baker, welfare reform, and a completion of the North American Free Trade Agreement (NAFTA) to abolish trade barriers between the U.S., Mexico, and Canada.

With NAFTA, Clinton reversed decades of Democratic opposition to free trade and opened the nation’s northern and southern borders to the free flow of capital and goods. Critics, particularly in the Midwest’s Rust Belt, blasted the agreement for exposing American workers to competition from low-paid foreign labor. Many American factories did relocate by setting up shops–maquilas–in northern Mexico that took advantage of Mexico’s low wages. Thousands of Mexicans rushed to the maquilas. Thousands more continued on past the border.

If NAFTA opened American borders to goods and services, people still navigated strict legal barriers to immigration. Policymakers believed that free trade would create jobs and wealth that would incentivize Mexican workers to stay home, and yet multitudes continued to leave for opportunities in el norte. The 1990s proved that prohibiting illegal migration was, if not impossible, exceedingly difficult. Poverty, political corruption, violence, and hopes for a better life in the United States–or simply higher wages–continued to lure immigrants across the border. Between 1990 and 2000, the proportion of foreign-born individuals in the United States grew from 7.9 percent to 12.9 percent, and the number of undocumented immigrants tripled from 3.5 million to 11.2 million during the same period. While large numbers continued to migrate to traditional immigrant destinations—California, Texas, New York, Florida, New Jersey, and Illinois—the 1990s also witnessed unprecedented migration to the American South. Among the fastest-growing immigrant destination states were Kentucky, Tennessee, Arkansas, Georgia, and North Carolina, all of which had immigration growth rates in excess of 100% during the decade.

In response to the continued influx of immigrants and the vocal complaints of anti-immigration activists, policymakers launched initiatives such as Operations Gatekeeper and Hold the Line, which attempted to make crossing the border more difficult. By strengthening physical barriers and beefing up the Border Patrol presence in border cities and towns, officials “funneled” immigrants toward dangerous and remote crossing areas, hoping the brutal natural landscape would serve as a deterrent.

In his first weeks in office, Clinton reviewed Department of Defense policies that barred homosexuals from serving in the armed forces. He pushed through a compromise plan, “Don’t Ask, Don’t Tell,” that removed any questions about sexual preference from induction interviews but also required that gay servicemen and women keep their sexuality private. Social conservatives were outraged, and Clinton’s credentials as a moderate southerner suffered.

In his first term Clinton put forward universal health care as a major policy goal and put First Lady Hillary Rodham Clinton in charge of the initiative. But the push for a national healthcare law collapsed on itself. Conservatives revolted, the health care industry flooded the airwaves with attack ads, and voters bristled.

The mid-term elections of 1994 were a disaster for the Democrats, who lost the House of Representatives for the first time since 1952. Congressional Republicans, led by Georgia Congressman Newt Gingrich and Texas Congressman Dick Armey, offered a new “Contract with America.” Republican candidates from around the nation gathered on the steps of the Capitol to pledge their commitment to a conservative legislative blueprint to be enacted if the GOP won control of the House. The strategy worked.

Social conservatives were mobilized by an energized group of religious activists, especially the Christian Coalition, led by Pat Robertson and Ralph Reed. Robertson was a television minister and entrepreneur whose 1988 long shot run for the Republican presidential nomination brought him a massive mailing list and network of religiously motivated voters around the country. From that mailing list, the Christian Coalition organized around the country, seeking to influence politics on the local and national level.

In 1996 the generational contest played out again when the Republicans nominated another aging war hero, Senator Bob Dole of Kansas, but Clinton again won the election, becoming the first Democrat to serve back-to-back terms since Franklin Roosevelt.

Clinton presided over a booming economy fueled by emergent computing technologies. Sales of personal computers skyrocketed and the internet became a mass phenomenon. Communication and commerce were never again the same. The 1990s saw robust innovation and entrepreneurship as investors scrambled to find the next Microsoft or Apple, the suddenly massive computing companies. But it was the internet, “the world wide web,” that sparked a bonanza. The “dot-com boom” fueled enormous economic growth and substantial financial speculation in the search for the next Google or Amazon.

Republicans, defeated at the polls in 1996 and again in the 1998 midterms, looked for other ways to sink Clinton’s presidency. Political polarization seemed unprecedented as the Republican Congress spent millions on investigations hoping to uncover some shred of damning evidence, whether in real estate deals, White House staffing, or adultery. Rumors of sexual misconduct had always swirled around Clinton, and congressional investigations targeted the allegations. Called to testify before a grand jury and in a statement to the American public, Clinton denied having “sexual relations” with Monica Lewinsky. Republicans used the testimony to allege perjury, and the House of Representatives voted to impeach the president. It was a radical and wildly unpopular step. On a vote that fell mostly along party lines, Clinton was acquitted by the Senate.

The 2000 election pitted Vice President Albert Gore, Jr. against George W. Bush, the former president’s son, who had twice been elected governor of Texas. Gore, wary of Clinton’s recent impeachment despite Clinton’s enduring approval ratings, distanced himself from the president and eight years of relative prosperity and ran as a pragmatic, moderate liberal.

Bush, too, ran as a moderate, distancing himself from the cruelties of past Republican candidates by claiming to represent a “compassionate conservatism” and a new faith-based politics. Bush was an outspoken evangelical. In a presidential debate, he declared Jesus Christ his favorite political philosopher. He promised to bring church leaders into government, and his campaign appealed to churches and clergy to get out the vote. Moreover, he promised to bring honor, dignity, and integrity to the Oval Office, a clear reference to Clinton. Utterly lacking the political charisma that had propelled Clinton, Gore withered under Bush’s attacks. Instead of trumpeting the Clinton presidency, Gore found himself answering the media’s questions about whether he was sufficiently an “alpha male” and whether he had “invented the internet.”

Few elections have been as close and contentious as the 2000 election, which ended in a deadlock. Gore had won the popular vote by 500,000 votes, but the Electoral College math seemed to have failed him. On election night the media called Florida for Gore, but as Bush made gains news organizations backpedaled and declared the state for Bush—and Bush the probable president-elect. Gore conceded privately to Bush, then retracted his concession as the counts edged back toward him yet again. When the nation awoke the next day, it was unclear who had been elected president. The close Florida vote triggered an automatic recount.

Lawyers descended on Florida. The Gore campaign called for manual recounts in several counties. Local election boards, Florida Secretary of State Katherine Harris, and the Florida Supreme Court all weighed in until the United States Supreme Court stepped in and, in an unprecedented 5-4 decision in Bush v. Gore, ruled that the recount had to end. Bush was awarded Florida by a margin of 537 votes, enough to win him the state, a majority in the Electoral College, and the presidency.

In his first months in office, Bush fought to push forward enormous tax cuts skewed toward America’s highest earners and struggled with an economy burdened by the bursting of the dot-com bubble. Old fights seemed ready to be fought, and then everything changed.

 

III. September 11 and the War on Terror

On the morning of September 11, 2001, 19 operatives of the al-Qaeda terrorist organization hijacked four passenger planes on the East Coast. American Airlines Flight 11 crashed into the North Tower of the World Trade Center in New York City at 8:46 a.m. EDT. United Airlines Flight 175 crashed into the South Tower at 9:03. American Airlines Flight 77 crashed into the western façade of the Pentagon at 9:37. At 9:59, the South Tower of the World Trade Center collapsed. At 10:03, United Airlines Flight 93 crashed in a field outside of Shanksville, Pennsylvania, likely brought down by passengers who had received news of the earlier hijackings. And at 10:28, the North Tower collapsed. In less than two hours, nearly 3,000 Americans had been killed.

Six days after the September 11th attacks, the ruins of the World Trade Center still smoldered and many remained unaccounted for. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/3b/September_17_2001.jpg.


The attacks shocked Americans. Bush addressed the nation and assured the country that “The search is underway for those who are behind these evil acts.” At Ground Zero three days later, Bush thanked the first responders. A worker said he couldn’t hear him. “I can hear you,” Bush shouted back, “The rest of the world hears you. And the people who knocked these buildings down will hear all of us soon.”

American intelligence agencies quickly identified the radical Islamic militant group al-Qaeda, led by the wealthy Saudi Osama Bin Laden, as the perpetrators of the attack. Sheltered in Afghanistan by the Taliban, the country’s Islamic government, al-Qaeda was responsible for a 1993 bombing of the World Trade Center and a string of attacks at U.S. embassies and military bases across the world. Bin Laden’s Islamic radicalism and his anti-American aggression attracted supporters across the region and, by 2001, al-Qaeda was active in over sixty countries.

 

The War on Terror

Although in his campaign Bush had denounced foreign “nation-building,” his administration was populated by “neoconservatives,” firm believers in the expansion of American democracy and American interests abroad. Bush advanced what was sometimes called the Bush Doctrine, a policy under which the United States claimed the right to unilaterally and preemptively make war upon any regime or terrorist organization that posed a threat to the United States or to United States citizens. It would lead the United States into protracted conflicts in Afghanistan and Iraq and entangle the nation in countries across the world.

 

The United States and Afghanistan

The United States had a history in Afghanistan. When the Soviet Union invaded Afghanistan in December 1979 to quell an insurrection that threatened to topple Kabul’s communist government, the United States financed and armed the anti-Soviet insurgents, the mujahedeen. In 1981, the Reagan Administration authorized the Central Intelligence Agency (CIA) to provide the mujahedeen with weapons and training to strengthen the insurgency. An independently wealthy young Saudi, Osama bin Laden, also fought with and funded the mujahedeen. The insurgents began to win. Afghanistan bled the Soviet Union dry. The costs of the war, coupled with growing instability at home, convinced the Soviets to withdraw from Afghanistan in 1989.

Osama bin Laden relocated al-Qaeda to Afghanistan after the country fell to the Taliban in 1996. The United States under Bill Clinton had launched cruise missiles at al-Qaeda camps in Afghanistan in retaliation for al-Qaeda’s bombings of American embassies in Africa.

Then, after September 11, with a broad authorization of military force, Bush administration officials made plans for military action against al-Qaeda and the Taliban. What would become the longest war in American history began with the launching of Operation Enduring Freedom in October 2001. Air and missile strikes hit targets across Afghanistan. U.S. Special Forces joined with fighters in the anti-Taliban Northern Alliance. Major Afghan cities fell in quick succession. The capital, Kabul, fell on November 13. Bin Laden and Al-Qaeda operatives retreated into the rugged mountains along the border of Pakistan in eastern Afghanistan. The United States military settled in.

 

The United States and Iraq

After the conclusion of the Gulf War in 1991, American officials established economic sanctions, weapons inspections, and “no-fly zones” in Iraq. By mid-1991, American warplanes were routinely patrolling Iraqi skies, where they periodically came under fire from Iraqi missile batteries. The overall cost to the United States of maintaining the two no-fly zones over Iraq was roughly $1 billion a year. Related military activities in the region added about another half billion dollars to the annual bill. On the ground in Iraq, meanwhile, Iraqi authorities clashed with U.N. weapons inspectors. Iraq had suspended its program for weapons of mass destruction, but Saddam Hussein fostered ambiguity about the weapons in the minds of regional leaders to forestall any possible attacks against Iraq.

In 1998, a standoff between Hussein and the United Nations over weapons inspections led President Bill Clinton to launch punitive strikes aimed at debilitating what was thought to be a fairly developed chemical weapons program. Attacks began on December 16, 1998, when more than 200 cruise missiles fired from U.S. Navy warships and Air Force B-52 bombers struck Iraq, targeting suspected chemical weapons storage facilities, missile batteries, and command centers. Airstrikes continued for three more days, unleashing in total 415 cruise missiles and 600 bombs against 97 targets. The number of cruise missiles fired was nearly double the number used in the entire 1991 conflict.

The United States and Iraq remained at odds throughout the 1990s and early 2000s, when Bush administration officials began considering “regime change.” The Bush Administration began publicly denouncing Saddam Hussein’s regime and its alleged weapons of mass destruction. It was alleged that Hussein was trying to acquire uranium and that his regime had obtained aluminum tubes to be used for nuclear centrifuges. George W. Bush said in October 2002, “Facing clear evidence of peril, we cannot wait for the final proof—the smoking gun—that could come in the form of a mushroom cloud.” The United States Congress passed the Authorization for Use of Military Force Against Iraq Resolution, giving Bush the power to make war in Iraq.

In late 2002 Iraq began cooperating with U.N. weapons inspectors. But the Bush administration pressed on. On February 5, 2003, Secretary of State Colin Powell, who had risen to public prominence as an Army general during the Persian Gulf War in 1991, presented allegations of a robust Iraqi weapons program to the United Nations.

The first American bombs hit Baghdad on March 20, 2003. Several hundred thousand troops moved into Iraq and Hussein’s regime quickly collapsed. Baghdad fell on April 9. On May 1, 2003, aboard the USS Abraham Lincoln, beneath a banner reading “Mission Accomplished,” George W. Bush announced that “Major combat operations in Iraq have ended.” No evidence of weapons of mass destruction had been found or would be found. And combat operations had not ended, not really. The insurgency had begun, and the United States would spend the next ten years struggling to contain it.

Despite the president’s declaration aboard the USS Abraham Lincoln, combat operations in Iraq continued for years. Although combat troops were withdrawn from Iraq by December 2011, President Obama announced the use of airstrikes against Iraqi militants in August 2014. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/5/50/USS_Abraham_Lincoln_%28CVN-72%29_Mission_Accomplished.jpg.


Efforts by various intelligence gathering agencies led to the capture of Saddam Hussein, hidden in an underground compartment near his hometown, on December 13, 2003. The new Iraqi government found him guilty of crimes against humanity and he was hanged on December 30, 2006.

 

IV. The End of the Bush Years

The War on Terror was a centerpiece of the race for the White House in 2004. The Democratic ticket was headed by Massachusetts Senator John F. Kerry, a Vietnam War hero who had first entered the public consciousness through his subsequent testimony against that war. Kerry attacked Bush for the ongoing inability to contain the Iraqi insurgency or to find weapons of mass destruction, for the revelation, backed by photographic evidence, that American soldiers had abused prisoners at the Abu Ghraib prison outside of Baghdad, and for the failure to find Osama bin Laden. Moreover, many who had been captured in Iraq and Afghanistan were “detained” indefinitely at a military prison at Guantanamo Bay in Cuba. “Gitmo” became infamous for its harsh treatment, indefinite detentions, and torture of prisoners. Bush defended the War on Terror, and his allies attacked critics for failing to support the troops. Kerry, moreover, had voted for the war: he had to attack what he had authorized. Bush won a close but clear victory.

The second Bush term saw the continued deterioration of the wars in Iraq and Afghanistan, but Bush’s presidency would take a bigger hit from his perceived failure to respond to the domestic tragedy that followed Hurricane Katrina’s devastating strike on the Gulf Coast. Katrina had been a category 5 hurricane, what New Orleans Mayor Ray Nagin called “the one we always feared.”

New Orleans suffered a direct hit, the levees broke, and the bulk of the city flooded. Thousands of refugees flocked to the Superdome, where supplies and medical treatment and evacuation were slow to come. Individuals were dying in the heat. Bodies wasted away. Americans saw poor black Americans abandoned. Katrina became a symbol of a broken administrative system, a devastated coastline, and irreparable social structures that allowed escape and recovery for some, and not for others. Critics charged that Bush had staffed his administration with incompetent supporters and had further ignored the displaced poor and black residents of New Orleans.

Hurricane Katrina was one of the deadliest and most destructive hurricanes in U.S. history. It nearly destroyed New Orleans, Louisiana, as well as cities, towns, and rural areas across the Gulf Coast, and it sent hundreds of thousands of refugees to nearby cities like Houston, Texas, where they temporarily resided in massive structures like the Astrodome. Photograph, September 1, 2005. Wikimedia, http://commons.wikimedia.org/wiki/File:Katrina-14461.jpg.


Immigration had become an increasingly potent political issue. The Clinton Administration had overseen the implementation of several anti-immigration policies on the border, but hunger and poverty were stronger incentives than border enforcement was a deterrent. Illegal immigration continued, often at great human cost, and it fanned widespread anti-immigration sentiment among many American conservatives. Many immigrants and their supporters, however, fought back. In 2006, waves of massive protests swept the country: hundreds of thousands marched in Chicago, New York, and Los Angeles, and tens of thousands marched in smaller cities around the country. Legal change, however, went nowhere. Moderate conservatives feared upsetting business interests’ demand for cheap, exploitable labor and alienating large voting blocs by stifling immigration, while moderate liberals feared upsetting anti-immigrant groups by pushing too hard for the liberalization of immigration laws.

Afghanistan and Iraq, meanwhile, continued to deteriorate. In 2006, the Taliban reemerged, as the Afghan Government proved both highly corrupt and highly incapable of providing social services or security for its citizens. Iraq only descended further into chaos.

In 2007, 27,000 additional United States troops deployed to Iraq under the command of General David Petraeus. The effort, known as “the surge,” employed more sophisticated counterinsurgency strategies and, combined with Sunni tribes turning against the insurgency, pacified many of Iraq’s cities and provided cover for the withdrawal of American forces. On December 4, 2008, the Iraqi government approved the U.S.-Iraq Status of Forces Agreement, and United States combat forces withdrew from Iraqi cities by June 30, 2009. The last U.S. combat forces left Iraq on December 18, 2011. Violence and instability continued to rock the country.

Opened in 2005, this beautiful new mosque at the Islamic Center of America in Dearborn, Michigan, is the largest such religious structure in the United States. Muslims in Dearborn have faced religious and racial prejudice, but the suburb of Detroit continues to be a central meeting-place for American Muslims. Photograph July 8, 2008. Wikimedia, http://commons.wikimedia.org/wiki/File:Islamic_Center_of_America.jpg.


V. The Great Recession

The Great Recession began, as most American economic catastrophes begin, with the bursting of a speculative bubble. Throughout the 1990s and into the new millennium, home prices continued to climb, and financial services firms looked to cash in on what seemed to be a safe but lucrative investment. Especially after the dot-com bubble burst, investors searched for a secure investment rooted in clear value rather than trendy technological speculation. And what could be more secure than real estate? But mortgage companies began writing increasingly risky loans and then bundling them together and selling them over and over again, sometimes so quickly that it became difficult to determine exactly who owned what. Decades of lax regulation had again enabled risky business practices to dominate the world of American finance. When American homeowners began to default on their loans, the whole system tumbled quickly. Seemingly solid financial services firms disappeared almost overnight. In order to prevent the crisis from spreading, the federal government poured billions of dollars into the industry, propping up hobbled banks. Massive giveaways to bankers created shock waves of resentment throughout the rest of the country. On the Right, conservative members of the Tea Party decried the cronyism of an Obama administration filled with former Wall Street executives. The same energies also motivated the Occupy Wall Street movement, in which mostly young, left-leaning New Yorkers protested an American economy that seemed overwhelmingly tilted toward “the one percent.”

The Great Recession only magnified already rising income and wealth inequalities. According to the Chief Investment Officer at JPMorgan Chase, the largest bank in the United States, “profit margins have reached levels not seen in decades,” and “reductions in wages and benefits explain the majority of the net improvement.” A study from the Congressional Budget Office found that since the late 1970s the after-tax income of the wealthiest 1% had grown by over 300%, while the “average” American’s income had grown 35%. Economic trends have disproportionately benefited the wealthiest Americans. Still, despite some political rhetoric, American frustration has not generated anything like the social unrest of the early twentieth century. A weakened labor movement and a strong conservative base continue to stymie serious attempts at redistributing wealth. Occupy Wall Street managed to generate a fair number of headlines and shift public discussion away from budget cuts and toward inequality, but its membership amounted to only a fraction of the far more influential and money-driven Tea Party. Its presence on the public stage was fleeting.
The Great Recession, however, was not. While American banks quickly recovered and recaptured their steady profits, and the American stock market climbed again to new heights, American workers continued to lag. Job growth would remain minuscule and unemployment rates would remain stubbornly high. Wages froze, meanwhile, and well-paying full-time jobs that were lost were too often replaced by low-paying, part-time work. A generation of workers coming of age within the crisis, moreover, had been savaged by the economic collapse. Unemployment among young Americans hovered for years at rates nearly double the national average.

VI. The Obama Presidency

By the 2008 election, with Iraq still in chaos, Democrats were ready to embrace the anti-war position and sought a candidate who had consistently opposed military action in Iraq. Senator Barack Obama of Illinois had been a member of the Illinois state senate when Congress debated the war, but he had publicly denounced it, predicting the sectarian violence that would ensue, and he remained critical of the invasion through his 2004 campaign for the U.S. Senate. He began running for president almost immediately after arriving in Washington.

A former law professor and community activist, Obama became the first black candidate ever to capture the nomination of a major political party. During the election, Obama won the support of an increasingly anti-war electorate. He was already riding a wave of support when Bush’s fragile economy finally collapsed in 2007 and 2008. Bush’s policies were widely blamed, and Obama’s opponent, John McCain, was tied to them. Obama won a convincing victory in the fall and became the nation’s first African American president.

President Obama’s first term was marked by domestic affairs, especially his efforts to combat the Great Recession and to pass a national healthcare law. Obama came into office as the economy continued to deteriorate. He managed the bank bailout begun under his predecessor and launched a limited economic stimulus plan to provide countercyclical government spending to spare the country from the worst of the downturn.

Obama’s most substantive legislative achievement proved to be a national healthcare law, the Patient Protection and Affordable Care Act, typically called “Obamacare” by opponents and supporters alike. The plan, narrowly passed by Congress, required all Americans to provide proof of a health insurance plan that measured up to government-established standards. Those who did not purchase a plan would pay a penalty tax, and those who could not afford insurance would be eligible for federal subsidies.

Nationally, as prejudices against homosexuality receded and support for gay marriage reached a majority of the population, the Obama administration moved tentatively. Refusing to push for national intervention on the gay marriage front, Obama did, however, direct a review of Defense Department policies that culminated in the repeal of “Don’t Ask, Don’t Tell” in 2011.

In 2009, President Barack Obama deployed 17,000 additional troops to Afghanistan as part of a counterinsurgency campaign that aimed to “disrupt, dismantle, and defeat” al-Qaeda and the Taliban. Meanwhile, U.S. Special Forces and CIA drones targeted al-Qaeda and Taliban leaders. In May 2011, U.S. Navy SEALs conducted a raid deep into Pakistan that led to the killing of Osama bin Laden. The United States and NATO began a phased withdrawal from Afghanistan in 2011, with an aim of removing all combat troops by 2014. Although weak militarily, the Taliban remained politically influential in southern and eastern Afghanistan. Al-Qaeda remained active in Pakistan but shifted its bases to Yemen and the Horn of Africa. As of December 2013, the war in Afghanistan had claimed the lives of 3,397 U.S. service members.

These former Taliban fighters surrendered their arms to the government of the Islamic Republic of Afghanistan during a reintegration ceremony at the provincial governor’s compound in May 2012. Wikimedia, http://commons.wikimedia.org/wiki/File:Former_Taliban_fighters_return_arms.jpg.


Climate change, the role of government, gay marriage, the legalization of marijuana, the rise of China, inequality, surveillance, a stagnant economy, and a host of other issues have confronted recent Americans with sustained urgency.

In 2012, Barack Obama won a second term by defeating Republican Mitt Romney, the former governor of Massachusetts. But the ascendancy of Tea Party Republicans in Congress effectively shut down bipartisan cooperation and stunted the passage of meaningful legislation; Obama was a lame duck before he ever won reelection. Half-hearted efforts to address climate change, for instance, went nowhere. The economy continued its halting recovery: corporate profits climbed while employment and wages continued to sag. The Obama administration offered little to address the crisis and accomplished even less.

 

VII. New Horizons

Much public commentary in the early twenty-first century concerned the “millennials,” the new generation that had come of age in the new millennium. Commentators, demographers, and political prognosticators continue to ask what the new generation will bring. Pollsters have found certain features that distinguish the millennials from older Americans. They are, the pollsters say, more diverse, more liberal, less religious, and wracked by economic insecurity.

Millennial attitudes toward homosexuality and gay marriage reflect one of the most dramatic shifts in popular attitudes in recent years. After decades of advocacy, attitudes over the past two decades have changed rapidly. Gay characters–and characters with depth and complexity–can be found across the cultural landscape and, while many national politicians have refused to advocate for it, a majority of Americans now favor the legalization of gay marriage.

Even as anti-immigrant initiatives like California’s Proposition 187 (1994) and Arizona’s SB1070 (2010) reflected the anxieties of many, younger Americans proved far more comfortable with immigration and diversity–which makes sense, given that they are the most diverse American generation in living memory. Since Lyndon Johnson’s Great Society liberalized immigration laws, the demographics of the United States have been transformed. In 2012, nearly one-quarter of all Americans were immigrants or the sons and daughters of immigrants. Half came from Latin America. The ongoing “Hispanicization” of the United States and the ever shrinking proportion of non-Hispanic whites have been the most talked about trends among demographic observers. By 2013, 17% of the nation was Hispanic. In 2014, Latinos surpassed non-Latino whites to become the largest ethnic group in California. In Texas, the image of a white cowboy hardly captures the demographics of a “minority-majority” state in which Hispanic Texans will soon become the largest ethnic group. In Texas’s Rio Grande Valley, for instance, home to nearly 1.5 million people, a majority of residents speak Spanish at home and a full three-fourths of the population is bilingual. Political commentators often wonder what political transformations these populations will bring about when they come of age and begin voting in larger numbers.

Younger Americans are also more concerned about the environment and climate change, and yet, on that front, little has changed. In the 1970s and 1980s, experts substantiated the theory of anthropogenic (human-caused) global warming. Eventually, the most influential of these panels, the UN’s Intergovernmental Panel on Climate Change (IPCC), concluded in 1995 that there was a “discernible human influence on global climate.” This conclusion, though stated conservatively, was by that point essentially a scientific consensus. By 2007, the IPCC considered the evidence “unequivocal” and warned that “unmitigated climate change would, in the long term, be likely to exceed the capacity of natural, managed and human systems to adapt.”

Climate change became a permanent and major topic of public discussion and policy in the twenty-first century. Fueled by popular coverage, most notably, perhaps, the documentary An Inconvenient Truth, based on Al Gore’s book and presentations of the same name, climate change became a defining concern for much of the American left. And yet American public opinion and political action still lagged far behind the scientific consensus on the dangers of global warming. Conservative politicians, conservative think tanks, and energy companies waged a campaign to sow doubt in the minds of Americans, who remain divided on the question, as on so many others.

Much of the resistance to addressing climate change is economic. As Americans look over their shoulders at China, many refuse to sacrifice immediate economic growth for long-term environmental security. Twenty-first-century relations with China are characterized by contradictions and interdependence. After the collapse of the Soviet Union, China reinvigorated its efforts to modernize its economy. By liberalizing and subsidizing much of its economy and drawing enormous foreign investments, China has posted enormous growth rates during the last several decades. Enormous cities rise by the day. In 2000, China had a gross domestic product around an eighth the size of the United States’. Based on growth rates and trends, analysts suggest that China’s economy will soon surpass that of the United States. American concerns about China’s political system have persisted, but money often matters more to Americans. China has become one of the country’s leading trade partners. Cultural exchange has increased, and more and more Americans visit China each year, with many settling down to work and study. Conflict between the two societies is not inevitable, but managing bilateral relations will be one of the great challenges of the next decade. It is but one of several aspects of the world confronting Americans of the twenty-first century.

 

VIII. Conclusion

The collapse of the Soviet Union brought neither global peace nor stability, and the attacks of September 11, 2001 plunged the United States into interminable conflicts around the world. At home, economic recession, entrenched joblessness, and general pessimism infected American life as contentious politics and cultural divisions poisoned social harmony. But trends shift, things change, and history turns. A new generation of Americans looks to the future with uncertainty.

 

This chapter was edited by Michael Hammond, with content contributions by Eladio Bobadilla, Andrew Chadwick, Zach Fredman, Leif Fredrickson, Michael Hammond, Richara Hayward, Joseph Locke, Mark Kukis, Shaul Mitelpunkt, Michelle Reeves, Elizabeth Skilton, Bill Speer, and Ben Wright.


29. The Rise of the Right

Activist Phyllis Schlafly campaigns against the Equal Rights Amendment in 1978. Bettmann/Corbis.



I. Introduction

Speaking to Detroit autoworkers in October of 1980, Republican presidential candidate Ronald Reagan described what he saw as the American Dream under Democratic President Jimmy Carter. The family garage may have still held two cars, cracked Reagan, but they were “both Japanese and they’re out of gas.” The charismatic former governor of California suggested that a once-proud nation was running on empty, but he held out hope for redemption. Stressing the theme of “national decline,” Reagan nevertheless promised to make the United States once again a “shining city upon a hill.” His vision of a dark present and a bright future triumphed.

Reagan stood at the head of a powerful political movement often referred to as the New Right, in contrast to their more moderate conservative predecessors. During the 1970s and 1980s the New Right evolved into the most influential wing of the Republican Party and contributed to its stunning electoral success. During the last quarter of the twentieth century, after decades of liberal dominance, conservative leaders and grassroots activists wrenched the country fully onto a new rightward course. The conservative ascendency built upon the steady unraveling of the New Deal political order during the previous decade and drew in new “Reagan Democrats,” blue-collar voters who lost faith in the old liberal creed, and the emergent religious right, a coalition of conservative religious activists. All the while, enduring conflicts over race, economic policy, gender and sexual politics, and foreign affairs fatally fractured the liberal consensus that had dominated American politics since the presidency of Franklin Roosevelt.

The rise of the right affected Americans’ everyday lives in numerous ways. The Reagan administration embraced “free market” economic theory, dispensing with the principles of income redistribution and social welfare that had animated the New Deal and Great Society. Conservative policymakers tilted the regulatory and legal landscape of the United States toward corporations and wealthy individuals while weakening the “rights” framework that had undergirded advancements by African Americans, Mexican Americans, women, lesbians and gays, and other marginalized groups.

In many ways, however, the rise of the Right promised more than it delivered. Battered but intact, the programs of the New Deal and Great Society survived the 1980s. Despite Republican vows of fiscal discipline, both the federal government and the national debt ballooned. Conservative Christians viewed popular culture as more vulgar and hostile to their values than ever before. In the near term, the New Right registered only partial victories on a range of public policies and cultural issues. Yet, from a long-term perspective conservatives achieved a subtler and more enduring transformation of American politics. In the words of one historian, the conservative movement successfully “changed the terms of debate and placed its opponents on the defensive.” Liberals and their programs and their policies did not disappear, but they increasingly fought battles on terrain chosen by the New Right.

 

II. Conservative Ascendance

The “Reagan Revolution” marked the culmination of a long process of political mobilization on the American right. In the first two decades after World War II the New Deal seemed firmly embedded in American electoral politics and public policy. Even two-term Republican President Dwight D. Eisenhower declined to roll back the welfare state. To be sure, National Review founder William F. Buckley tapped into a deep vein of elite conservatism in 1955 with his call to “stand athwart history yelling ‘stop.’” Senator Joseph McCarthy and John Birch Society founder Robert Welch stirred anti-communist fervor. But in general, the far right lacked organizational cohesion. Following Lyndon Johnson’s resounding defeat of Republican Barry Goldwater in the 1964 presidential election, many observers declared American conservatism finished. New York Times columnist James Reston wrote that Goldwater had “wrecked his party for a long time to come.”

The conservative insurgency occurred within both major political parties, but the New Right gradually coalesced under the Republican tent. The heightened appeal of conservatism had several causes. The expansive social and economic agenda of Johnson’s Great Society reminded anti-communists of Soviet-style central planning and inflamed fiscal conservatives worried about deficits. Race also drove the creation of the New Right. The civil rights movement, along with the Civil Rights Act and the Voting Rights Act, upended the racial hierarchy of the Jim Crow South. All of these occurred under Democratic leadership, pushing the South toward the Republican Party. In the late 1960s and early 1970s, Black Power, affirmative action, and court-ordered busing of children between schools to achieve racial balance brought “white backlash” to the North, often in cities previously known for political liberalism. To many ordinary Americans, the urban rebellions, antiwar protests, and student uprisings of the late 1960s unleashed social chaos. At the same time, declining wages, rising prices, and growing tax burdens brought economic vulnerability to many working- and middle-class citizens. Liberalism no longer seemed to offer ordinary Americans a roadmap to prosperity, so they searched for new political solutions.

Former Alabama governor and conservative Democrat George Wallace masterfully exploited the racial, cultural, and economic resentments of working-class whites during presidential runs in 1968 and 1972. Wallace’s record as a staunch segregationist made him a hero in the Deep South, where he won five states as a third-party candidate in the 1968 general election. Wallace’s populist message also resonated with blue-collar voters in the industrial North who felt left behind by the rights revolution. On the campaign stump, the fiery candidate lambasted hippies, anti-war protestors, and government bureaucrats. He assailed female welfare recipients for “breeding children as a cash crop” and ridiculed “over-educated, ivory-tower” intellectuals who “don’t know how to park a bicycle straight.” Yet, Wallace also advanced progressive proposals for federal job training programs, a minimum wage hike, and legal protections for collective bargaining. Running as a Democrat in 1972 (and with anti-busing rhetoric as a new arrow in his quiver), Wallace captured the Michigan primary and polled second in Wisconsin, Pennsylvania, and Indiana. In May 1972 an assassin’s bullet left Wallace paralyzed and ended his campaign. Nevertheless, his amalgamation of older, New Deal-style proposals and conservative populism emblemized the rapid re-ordering of party loyalties in the late ’60s and early ’70s. Richard Nixon similarly harnessed the New Right’s sense of grievance through his rhetoric about “law and order” and the “silent majority.” But the New Right remained restive under Nixon and his Republican successor, Gerald Ford.

Religious conservatives also felt themselves under siege from liberalism. In the early 1960s, the Supreme Court decisions prohibiting teacher-led prayer (Engel v. Vitale) and Bible reading in public schools (Abington v. Schempp) led some on the right to conclude that a liberal judicial system threatened Christian values. In the following years, the counterculture’s celebration of sex and drugs, along with relaxed obscenity and pornography laws, intensified the conviction that “permissive” liberalism encouraged immorality in private life. Evangelical Protestants—Christians who professed a personal relationship with Jesus Christ, upheld the Bible as an infallible source of truth, and felt a duty to convert, or evangelize, nonbelievers—comprised the core of the so-called “religious right.” The movement also drew energy from devout Catholics. The development of the religious right was not inevitable for several reasons. First, many evangelicals had for decades eschewed politics in favor of spiritual matters; moreover, evangelicalism did not necessarily lead to conservative politics (Democrat Jimmy Carter was an evangelical). Second, Roman Catholics had a long record of loyalty to the Democratic Party. Third, the alliance between evangelicals and Catholics had to overcome decades of mutual antagonism. Only the common enemy of liberalism brought the groups together.

Beginning in the early 1970s the religious right mobilized to protect the “traditional” family. Women comprised a striking number of the religious right’s foot soldiers. Catholic activist Phyllis Schlafly marshaled opposition to the Equal Rights Amendment, while evangelical pop singer Anita Bryant drew national headlines for her successful fight to repeal Miami’s gay rights ordinance in 1977. In 1979, Beverly LaHaye (whose husband Tim—an evangelical pastor in San Diego—would later co-author the Left Behind novels) founded Concerned Women for America, which linked small groups of local activists opposed to the ERA, abortion, homosexuality, and no-fault divorce.

Activists like Schlafly and LaHaye valorized motherhood as the highest calling of all women, and abortion therefore struck at the core of female identity. More than perhaps any other issue, abortion drew different segments of the religious right—Catholics and Protestants, women and men—together. The Supreme Court’s 1973 Roe v. Wade ruling outraged many devout Catholics, including Long Island housewife and political novice Ellen McCormack. In 1976, McCormack entered the Democratic presidential primaries in an unsuccessful attempt to steer the party to a pro-life position. Roe v. Wade also intensified anti-abortion sentiment among evangelicals (who had been less universally opposed to the procedure than their Catholic counterparts). Christian author Francis Schaeffer cultivated evangelical opposition to abortion through the 1979 documentary film Whatever Happened to the Human Race?, arguing that the “fate of the unborn is the fate of the human race.” With the procedure framed in stark, existential terms, many evangelicals felt compelled to combat abortion through political action.

This book cover succinctly demonstrates the mindset of many conservatives in the Reagan era: what happened to the human race? http://users.aber.ac.uk/jrl/scan0001.jpg.


Grassroots passion drove anti-abortion activism, but a set of religious and secular institutions turned the various strands of the New Right into a sophisticated movement.

In 1979 Jerry Falwell—a Baptist minister and religious broadcaster from Lynchburg, Virginia—founded the Moral Majority, an explicitly political organization dedicated to advancing a “pro-life, pro-family, pro-morality, and pro-American” agenda. Business-oriented institutions also joined the attack on liberalism, fueled by stagflation and by the federal government’s creation of new regulatory agencies like the Environmental Protection Agency and the Occupational Safety and Health Administration. Conservative business leaders bankrolled new “think tanks” like the Heritage Foundation and the Cato Institute. These organizations provided grassroots activists with ready-made policy prescriptions. Other business leaders took a more direct approach by hiring Washington lobbyists and creating Political Action Committees (PACs) to press their agendas in the halls of Congress and federal agencies. Between 1976 and 1980 the number of corporate PACS rose from under 300 to over 1200.

Grassroots activists and business leaders received unlikely support from a circle of “neoconservatives”—disillusioned intellectuals who had rejected liberalism and become Republicans. Irving Kristol, a former Marxist who championed free-market capitalism as a Wall Street Journal columnist, defined a neoconservative as a “liberal who has been mugged by reality.” Neoconservative journals like Commentary and Public Interest argued that the Great Society had proven counterproductive, perpetuating the poverty and racial segregation that it aimed to cure. By the middle of the 1970s, neoconservatives felt mugged by foreign affairs as well. As ardent Cold Warriors, they argued that Nixon’s policy of détente left the United States vulnerable to the Soviet Union.

In sum, several streams of conservative political mobilization converged in the late 1970s. Each wing of the burgeoning conservative movement—disaffected blue-collar workers, white Southerners, evangelicals and devout Catholics, business leaders, disillusioned intellectuals, and Cold War hawks—turned to the Republican Party as the most effective vehicle for their political counter-assault on liberalism and the New Deal political order. After years of mobilization, the domestic and foreign policy catastrophes of the Carter administration provided the final push that carried the conservative movement to power.

 

III. The Conservatism of the Carter Years

The election of Jimmy Carter in 1976 brought a Democrat to the White House for the first time since 1969. Large Democratic majorities in Congress provided the new president with an opportunity to move aggressively on the legislative front. With the infighting of the early 1970s behind them, many Democrats hoped the Carter administration would update and expand the New Deal. But Carter won the presidency on a wave of post-Watergate disillusionment with government that did not translate into support for liberal ideas. Events outside Carter’s control helped discredit liberalism, but the president’s own policies also pushed national politics further to the right. In his 1978 State of the Union address, Carter lectured Americans that “[g]overnment cannot solve our problems…it cannot eliminate poverty, or provide a bountiful economy, or reduce inflation, or save our cities, or cure illiteracy, or provide energy.” The statement neatly captured the ideological transformation of the country. Rather than leading a resurgence of American liberalism, Carter became, as one historian put it, “the first president to govern in a post-New Deal framework.”

In its early days the Carter administration embraced several policies backed by liberals. It pushed an economic stimulus package containing $4 billion in public works, extended food stamp benefits to 2.5 million new recipients, enlarged the Earned Income Tax Credit for low-income households, and expanded the Nixon-era Comprehensive Employment and Training Act (CETA). But the White House quickly realized that Democratic control of Congress did not guarantee support for the administration’s left-leaning economic proposals. Many of the Democrats elected to Congress in the aftermath of Watergate were more moderate than their predecessors who had been catechized in the New Deal gospel. These conservative Democrats sometimes partnered with Congressional Republicans to oppose Carter, most notably in response to the administration’s proposal for a federal office of consumer protection.

At a deeper level, Carter’s own temperamental and philosophical conservatism hamstrung the administration. Early in his first term, Carter began to worry about the size of the federal deficit and killed a tax rebate he had proposed and Congressional Democrats had embraced. The president’s comprehensive national urban policy veered to the right by transferring many programs to state and local governments, relying on privatization, and endorsing voluntarism and self-help. Organized labor felt abandoned by Carter, who remained cool to several of its highest legislative priorities. The president offered tepid support for a national health insurance proposal and declined to lobby aggressively for a package of modest labor law reforms. The business community rallied to defeat the latter measure, in what AFL-CIO chief George Meany described as “an attack by every anti-union group in America to kill the labor movement.” In 1977 and 1978 liberal Democrats rallied behind the Humphrey-Hawkins Full Employment Act, which promised to achieve full employment through government planning. The bill aimed not only to guarantee a job to every American but also to reunite the interracial, working-class Democratic coalition that had been fractured by deindustrialization and affirmative action. “We must create a climate of shared interests between the needs, the hopes, and the fears of the minorities, and the needs, the hopes, and the fears of the majority,” wrote Senator Hubert Humphrey, Lyndon Johnson’s vice president and the bill’s co-sponsor. Carter’s lack of enthusiasm for the proposal allowed conservatives from both parties to water down the bill to a purely symbolic gesture. Liberals, like labor leaders, came to regard the president as an unreliable ally.

Carter also came under fire from Republicans, especially the religious right. His administration incurred the wrath of evangelicals in 1978 when the Internal Revenue Service established new rules revoking the tax-exempt status of racially segregated, private Christian schools. The rules only strengthened a policy instituted by the Nixon administration; however, the religious right accused Carter of singling out Christian institutions. Republican activist Richard Viguerie described the IRS controversy as the “spark that ignited the religious right’s involvement in real politics.” Race sat just below the surface of the IRS fight. After all, many of the schools had been founded to circumvent court-ordered desegregation. But the IRS ruling allowed the New Right to rain down fire on big government interference while downplaying the practice of racial exclusion at the heart of the case.

While the IRS controversy flared, economic crises multiplied. Unemployment, which had fallen in Carter’s first years in office, rose above 7% by 1980. The rate of inflation averaged 11.3% in 1979, sending prices upward. In another bad omen, the iconic Chrysler Corporation appeared close to bankruptcy. The administration responded to these challenges in fundamentally conservative ways. First, Carter proposed a tax cut for the upper-middle class, which Congress passed in 1978. Second, the White House embraced a long-time goal of the conservative movement by deregulating the airline and trucking industries in 1978 and 1980, respectively. Third, Carter proposed balancing the federal budget—much to the dismay of liberals, who would have preferred that he use deficit spending to finance a new New Deal. Finally, to halt inflation, Carter turned to Paul Volcker, his appointee as Chair of the Federal Reserve. Volcker raised interest rates and tightened the money supply—policies designed to reduce inflation in the long run but which increased unemployment in the short run. Liberalism was on the run.

The “energy crisis” in particular brought out the Southern Baptist moralist in Carter. On July 15, 1979, the president delivered a nationally televised speech on energy policy in which he attributed the country’s economic woes to a “crisis of confidence.” Carter lamented that “too many of us now tend to worship self-indulgence and consumption.” The president’s push to reduce energy consumption was reasonable, and the country’s initial response to the speech was favorable. Yet Carter’s emphasis on discipline and sacrifice, his spiritual diagnosis of economic hardship, sidestepped deeper questions of large-scale economic change and downplayed the harsh toll of inflation on regular Americans.

 

IV. The Election of 1980

These domestic challenges, combined with the Soviet invasion of Afghanistan and the hostage crisis in Iran, hobbled Carter heading into his 1980 reelection campaign. Many Democrats were dismayed by his policies. The president of the International Association of Machinists dismissed Carter as “the best Republican President since Herbert Hoover.” Angered by the White House’s refusal to back national health insurance, Massachusetts Senator Ted Kennedy challenged Carter in the Democratic primaries. Running as the party’s liberal standard-bearer and heir to the legacy of his slain older brothers, Kennedy garnered support from key labor unions and left-wing Democrats. He won the Michigan and Pennsylvania primaries—states where Democrats had embraced George Wallace eight years earlier. Carter ultimately vanquished Kennedy, but the close primary tally betrayed the president’s vulnerability.

Carter’s opponent in the general election was Ronald Reagan, who ran as a staunch fiscal conservative and a Cold War hawk. He vowed to reduce government spending and shrink the federal bureaucracy while eliminating the departments of Energy and Education that Carter had created. As in his 1976 primary challenge to Gerald Ford, Reagan accused his opponent of failing to confront the Soviet Union. The GOP candidate vowed steep increases in defense spending and hammered Carter for canceling the B-1 bomber and signing the Panama Canal and SALT II treaties. Carter responded by labeling Reagan a warmonger, but events in Afghanistan and Iran discredited Carter’s foreign policy in the eyes of many Americans.

The incumbent fared no better on domestic affairs. Unemployment reached 7.8% in May 1980, while the Fed’s anti-inflation measures pushed interest rates to an unheard-of 18.5%. On the campaign trail Reagan brought down the house by proclaiming: “A recession is when your neighbor loses his job, and a depression is when you lose your job.” Reagan would pause before concluding, “And a recovery is when Jimmy Carter loses his job.” Carter reminded voters that his opponent opposed the creation of Medicare in 1965 and warned that Reagan would slash popular programs if elected. But the anemic economy prevented Carter’s blows from landing.

Social and cultural issues presented yet another challenge for the president. Despite Carter’s background as a “born-again” Christian and Sunday School teacher, he struggled to court the religious right. Carter scandalized devout Christians by admitting to lustful thoughts during an interview with Playboy magazine in 1976, telling the reporter, “I’ve committed adultery in my heart many times.” Although Reagan was only a nominal Christian and rarely attended church, the religious right embraced him. Reverend Jerry Falwell directed the full weight of the Moral Majority behind Reagan. The organization registered an estimated 2 million new voters in 1980. Ellen McCormack, the New York Catholic who ran for president as a Democrat on an anti-abortion platform in 1976, moved over to the GOP in 1980. Reagan also cultivated the religious right by denouncing abortion and endorsing prayer in school. The IRS tax exemption issue resurfaced as well, with the 1980 Republican platform vowing to “halt the unconstitutional regulatory vendetta launched by Mr. Carter’s IRS commissioner against independent schools.” Early in the primary season, Reagan condemned the policy during a speech at South Carolina’s Bob Jones University, which had recently sued the IRS after losing its tax-exempt status because of the fundamentalist institution’s ban on interracial dating.

Jerry Falwell, the wildly popular TV evangelist, founded the Moral Majority political organization in the late 1970s. Decrying the demise of the nation’s morality, the organization gained a massive following, helping to cement the status of the New Christian Right in American politics. Photograph, date unknown. Wikimedia, http://commons.wikimedia.org/wiki/File:Jerry_Falwell_portrait.jpg.

Reagan’s campaign appealed subtly but unmistakably to the racial hostilities of white voters. The candidate held his first post-nominating convention rally at the Neshoba County Fair near Philadelphia, Mississippi, the town where three civil rights workers had been murdered in 1964. In his speech, Reagan championed the doctrine of states’ rights, which had been the rallying cry of segregationists in the 1950s and 1960s. In criticizing the welfare state, Reagan had long employed thinly veiled racial stereotypes about a “welfare queen” in Chicago who drove a Cadillac while defrauding the government or a “strapping young buck” purchasing T-bone steaks with food stamps. Like George Wallace before him, Reagan exploited the racial and cultural resentments of struggling white working-class voters. And like Wallace, he attracted blue-collar workers in droves.

With the wind at his back on almost every issue, Reagan only needed to blunt Carter’s characterization of him as an angry extremist. Reagan did so during their only debate by appearing calm and amiable. “Are you better off than you were four years ago?” he asked the American people at the conclusion of the debate. The answer was no. Reagan won the election with 51% of the popular vote to Carter’s 41%. (Independent John Anderson captured 7%.) Despite capturing only a slim majority, Reagan scored a decisive 489-49 victory in the Electoral College. Republicans gained control of the Senate for the first time since 1955 by winning 12 seats. Liberal Democrats George McGovern, Frank Church, and Birch Bayh went down in defeat, as did liberal Republican Jacob Javits. The GOP picked up 33 House seats, narrowing the Democratic advantage in the lower chamber. The New Right had arrived in Washington, DC.

 

V. The New Right in Power

Harkening back to Jeffersonian politics of limited government, a viewpoint that would only increase in popularity over the next three decades, Ronald Reagan launched his campaign by saying bluntly, “I believe in states’ rights.” Reagan secured the presidency by appealing to the growing conservatism of much of the country. Ronald Reagan and wife Nancy Reagan waving from the limousine during the Inaugural Parade in Washington, D.C. on Inauguration Day, 1981. Wikimedia, http://commons.wikimedia.org/wiki/File:The_Reagans_waving_from_the_limousine_during_the_Inaugural_Parade_1981.jpg.

In his first inaugural address Reagan proclaimed that “government is not the solution to our problem; government is the problem.” In reality, Reagan focused less on eliminating government than on redirecting government to serve new ends. In line with that goal, his administration embraced “supply-side” economic theories that had recently gained popularity among the New Right. While the postwar gospel of Keynesian economics had focused on stimulating consumer demand, supply-side economics held that lower personal and corporate tax rates would encourage greater private investment and production. The resulting wealth would “trickle down” to lower-income groups through job creation and higher wages. Conservative economist Arthur Laffer predicted that lower tax rates would generate so much economic activity that federal tax revenues would actually increase. The administration touted the so-called “Laffer Curve” as justification for the tax cut plan that served as the cornerstone of Reagan’s first year in office. Keynesian logic warned that such deep tax cuts would prove inflationary, ultimately stifling the economy. But Republican Congressman Jack Kemp, an early supply-side advocate and co-sponsor of Reagan’s tax bill, promised that it would unleash the “creative genius that has always invigorated America.”

The Iranian hostage crisis ended just as President Reagan delivered his inaugural address. By a coincidence of timing, then, the Reagan administration received credit for ending the conflict. This group photograph shows the former hostages in the hospital before being released back to the U.S. Johnson Babela, Photograph, 1981. Wikimedia, http://commons.wikimedia.org/wiki/File:DF-SN-82-06759.jpg.

The tax cut faced early skepticism from Democrats and even some Republicans. Vice President George H.W. Bush had belittled supply-side theory as “voodoo economics” during the 1980 Republican primaries. But a combination of skill and serendipity pushed the bill over the top. Reagan aggressively and effectively lobbied individual members of Congress for support on the measure. Then on March 30, 1981, Reagan survived an assassination attempt by John Hinckley. Public support swelled for the hospitalized president. Congress ultimately approved a $675-billion tax cut in July 1981 with significant Democratic support. The bill reduced overall federal taxes by more than one quarter and lowered the top marginal rate from 70% to 50%, with the bottom rate dropping from 14% to 11%. It also slashed the rate on capital gains from 28% to 20%.

The next month, Reagan scored another political triumph in response to a strike called by the Professional Air Traffic Controllers Organization (PATCO). During the 1980 campaign, Reagan had wooed organized labor, describing himself as “an old union man” (he had led the Screen Actors Guild from 1947 to 1952) who still held Franklin Roosevelt in high regard. PATCO had been one of the few labor unions to endorse Reagan. Nevertheless, the president ordered the union’s striking air traffic controllers back to work and fired more than 11,000 who refused. Reagan’s actions crippled PATCO and left the American labor movement reeling. For the rest of the 1980s the economic terrain of the United States—already unfavorable to union organizing—shifted decisively in favor of employers. The unionized portion of the private-sector workforce fell from 20% in 1980 to 12% in 1990. Reagan’s defeat of PATCO and his tax bill enhanced the economic power of corporations and high-income households; the conflicts confirmed that a conservative age had dawned in American politics.

The new administration appeared to be flying high in the fall of 1981, but other developments challenged the rosy economic forecasts emanating from the White House. As Reagan ratcheted up tension with the Soviet Union, Congress approved his request for $1.2 trillion in new military spending. Contrary to the assurances of David Stockman—the young supply-side disciple who headed the Office of Management and Budget—the combination of lower taxes and higher defense budgets caused the national debt to balloon. (By the end of Reagan’s first term it equaled 53% of GDP, as opposed to 33% in 1981.) Meanwhile, Federal Reserve Chairman Paul Volcker continued his policy from the Carter years of combating inflation by maintaining high interest rates—they surpassed 20% in June 1981. The Fed’s action increased the cost of borrowing money and stifled economic activity.

As a result, the United States experienced a severe economic recession in 1981 and 1982. Unemployment rose to nearly 11%, the highest figure since the Great Depression. Reductions in social welfare spending heightened the impact of the recession on ordinary people. Congress had followed Reagan’s lead by reducing funding for food stamps and Aid to Families with Dependent Children, eliminating the CETA program and its 300,000 jobs, and removing a half-million people from the Social Security disability rolls. The cuts exacted an especially harsh toll on low-income communities of color. The head of the NAACP declared the administration’s budget cuts had rekindled “war, pestilence, famine, and death.” Reagan also received bipartisan rebuke in 1981 after proposing cuts to Social Security benefits for early retirees. The Senate voted unanimously to condemn the plan, and Democrats framed it as a heartless attack on the elderly. Confronted with recession and harsh public criticism, a chastened White House worked with Democratic House Speaker Tip O’Neill in 1982 on a bill that rolled back $98 billion of the previous year’s tax cuts. Despite compromising with the administration on taxes, Democrats railed against the so-called “Reagan Recession,” arguing that the president’s economic policies favored the most fortunate Americans. This appeal, which Democrats termed the “fairness issue,” helped them win 26 House seats in the autumn Congressional races. The New Right appeared to be in trouble.

 

VI. Morning in America

Reagan nimbly adjusted to the political setbacks of 1982. Following the rejection of his Social Security proposals, Reagan appointed a bipartisan panel to consider changes to the program. In early 1983, the commission recommended a one-time delay in cost-of-living increases, a new requirement that government employees pay into the system, and a gradual increase in the retirement age from 65 to 67. The commission also proposed raising state and federal payroll taxes, with the new revenue poured into a trust fund that would transform Social Security from a pay-as-you-go system to one with significant reserves. Congress quickly passed the recommendations into law, allowing Reagan to take credit for strengthening a program cherished by most Americans. The president also benefited from an economic rebound. Real disposable income rose 2.5% in 1983 and 5.8% the following year. Unemployment dropped to 7.5% in 1984. Meanwhile, the “harsh medicine” of high interest rates helped lower inflation to 3.5%.

While campaigning for reelection in 1984, Reagan pointed to the improving economy as evidence that it was “morning again in America.” His personal popularity soared. Most conservatives ignored the debt increase and tax hikes of the previous two years. Reagan’s Democratic opponent in 1984 was Walter Mondale, Jimmy Carter’s vice president and a staunch ally of organized labor. In the Democratic primaries Mondale had faced civil rights activist Jesse Jackson and Colorado Senator Gary Hart, who rose to prominence in 1972 as George McGovern’s campaign manager and took office two years later as one of the “Watergate babies.” Jackson offered a thoroughly progressive program but won only two states. Hart’s platform—economically moderate but socially liberal—inverted the political formula of Mondale’s New Deal liberalism. Throughout the primaries, Hart contrasted his “new ideas” with Mondale’s “old-fashioned” labor-liberalism. Mondale eventually secured his party’s nomination but suffered a crushing defeat in the general election. Reagan captured 49 of 50 states, winning 58.8% of the popular vote.

Mondale’s loss demoralized Democrats. The future of their party belonged to post-New Deal liberals like Hart and to the constituency that supported him in the primaries: upwardly mobile professionals and suburbanites. In February 1985, a group of moderates and centrists formed the Democratic Leadership Council (DLC) as a vehicle for distancing the party from organized labor and cultivating the business community. Jesse Jackson dismissed the DLC as “Democrats for the Leisure Class,” but the organization included many of the party’s future leaders, including Arkansas Governor Bill Clinton. The formation of the DLC illustrated the degree to which the New Right had transformed American politics.

Reagan entered his second term with a much stronger mandate than in 1981, but the GOP makeover of Washington, DC stalled, especially after Democrats regained control of the Senate in 1986. Democratic opposition prevented Reagan from eliminating means-tested social welfare programs; however, Congress declined to increase benefit levels or raise the minimum wage, allowing inflation to erode their real value. Democrats and Republicans occasionally fashioned legislative compromises, as with the Tax Reform Act of 1986. The bill lowered the top corporate tax rate from 46% to 34% and reduced the highest marginal income tax rate from 50% to 28%, while also simplifying the tax code and eliminating numerous loopholes. Both parties—as well as the White House—claimed credit for the bargain, but it made virtually no net change to federal revenues. In 1986, Reagan also signed into law the Immigration Reform and Control Act. American policymakers hoped to accomplish two things: to deal with the millions of undocumented immigrants already in the United States and to choke off future unsanctioned migration. The former goal was achieved (nearly three million undocumented workers were granted legal status), but the latter proved elusive.

One of Reagan’s most far-reaching victories occurred through judicial appointments. He named 368 district and federal appeals court judges during his two terms. Observers noted that almost all of the appointees were white men. (Seven were African American, fifteen were Latino, and two were Asian American.) Reagan also appointed three Supreme Court justices: Sandra Day O’Connor, who to the dismay of the religious right turned out to be a moderate; Anthony Kennedy, a solidly conservative Catholic who occasionally sided with the court’s liberal wing; and arch-conservative Antonin Scalia. The New Right’s transformation of the judiciary had limits. In 1987, Reagan nominated Robert Bork to fill a vacancy on the Supreme Court. Bork, a federal judge and former Yale University law professor, was a staunch conservative. He had opposed the 1964 Civil Rights Act, affirmative action, and the Roe v. Wade decision. After acrimonious confirmation hearings, the Senate rejected Bork’s nomination by a vote of 58-42. African Americans read the nomination as another signal of the conservative movement’s hostility to their social, economic, and political aspirations.

 

VII. African American Life in Reagan’s America

Ronald Reagan’s America presented African Americans with a series of contradictions. Blacks achieved significant advances in politics, culture, and socio-economic status. African Americans continued a trend from the late 1960s and 1970s by gaining control of municipal governments during the 1980s. In 1983, voters in Philadelphia and Chicago elected Wilson Goode and Harold Washington, respectively, as their cities’ first black mayors. At the national level, civil rights leader Jesse Jackson became the first African American man to mount a national campaign for the presidency when he sought the Democratic Party’s nomination in 1984 and 1988. Propelled by chants of “Run, Jesse, Run,” Jackson achieved notable success in 1988, winning nine state primaries and finishing second with 29% of the vote.

Jesse Jackson was only the second African American to mount a national campaign for the presidency. His work as a civil rights activist and Baptist minister garnered him a significant following in the African American community, but never enough to secure the Democratic nomination. Warren K. Leffler, “IVU w/ [i.e., interview with] Rev. Jesse Jackson,” July 1, 1983. Library of Congress, http://www.loc.gov/pictures/item/2003688127/.

The excitement created by Jackson’s campaign mirrored the acclaim received by a few prominent African Americans in media and entertainment. Comedian Eddie Murphy rose to stardom on television’s Saturday Night Live and achieved box office success with movies like 48 Hours and Beverly Hills Cop. In 1982 pop singer Michael Jackson released Thriller, the best-selling album of all time. Oprah Winfrey began her phenomenally successful nationally syndicated talk show in 1985. Comedian Bill Cosby’s sitcom about an African American doctor and lawyer raising their five children drew the highest ratings on television for most of the decade. The popularity of The Cosby Show revealed how class informed perceptions of race in the 1980s. Cosby’s fictional TV family represented the growing number of black middle-class professionals in the United States. Indeed, income for the top fifth of African American households increased faster than that of white households for most of the decade.

Middle-class African Americans found new doors open to them in the 1980s, but poor and working-class blacks faced continued challenges. During Reagan’s last year in office the African American poverty rate stood at 31.6%, as opposed to 10.1% for whites. Black unemployment remained double that of whites throughout the decade. By 1990, the median income for black families was $21,423, 42% below that of white households. The Reagan administration did little to address such disparities and in many ways intensified them. Furthermore, the New Right threatened the legal principles and federal policies of the rights revolution and the Great Society. Reagan appointed conservative opponents of affirmative action to lead the Equal Employment Opportunity Commission (future Supreme Court Justice Clarence Thomas) and the Civil Rights Commission while sharply reducing both agencies’ funding and staffing levels. Federal spending cuts disproportionately affected the AFDC, Medicaid, food stamp, school lunch, and job training programs that provided crucial support to African American households. In 1982 the National Urban League’s annual “State of Black America” report concluded that “[n]ever [since the first report in 1976]…has the state of Black America been more vulnerable. Never in that time have black economic rights been under such powerful attack.”

The stigma of violent crime also hung over African American communities during the Reagan years. Homicide was the leading cause of death for black males between 15 and 24, occurring at a rate six times higher than for other Americans. Nonetheless, sensationalistic media reports encouraged widespread anxiety about black-on-white crime in big cities. Ironically, such fear could itself spark violence. In December 1984 a thirty-seven-year-old white engineer, Bernard Goetz, shot and seriously wounded four black teenagers on a New York City subway car. The so-called “Subway Vigilante” suspected the young men—armed with screwdrivers—planned to rob him. Pollsters found that 90% of white New Yorkers sympathized with Goetz. Race relations often seemed more polarized than ever during the 1980s.

The attempts by the Reagan administration to roll back affirmative action and shrink welfare programs did not always succeed. By the end of the decade, “diversity” programs were firmly entrenched in private-sector employment. Nonetheless, Reagan’s policies and rhetoric had altered the course of racial politics in the United States. Full economic and social equality remained elusive for African Americans in the 1980s.

VIII. Bad Times and Good Times

Working- and middle-class Americans, especially those of color, struggled to maintain economic equilibrium during the Reagan years. The growing national debt generated fresh economic pain. The federal government borrowed heavily to finance the debt, offering higher interest rates to heighten the appeal of government bonds. Foreign money poured into the United States, raising the value of the dollar and attracting an influx of goods from overseas. The imbalance between American imports and exports grew from $36 billion in 1980 to $170 billion in 1987. Foreign competition battered the already anemic manufacturing sector. The appeal of government bonds likewise drew investment away from American industry.

Continuing a recent trend, many steel and automobile factories in the industrial Northeast and Midwest closed or moved overseas during the 1980s. Bruce Springsteen, the bard of blue-collar America, offered eulogies to Rust Belt cities in songs like “Youngstown” and “My Hometown,” in which the narrator laments that his “foreman says these jobs are going boys/and they ain’t coming back.” Meanwhile, a “farm crisis” gripped the rural United States. Expanded world production meant new competition for American farmers, while soaring interest rates caused the already sizable debt held by family farms to mushroom. Farm foreclosures skyrocketed during Reagan’s tenure. In September 1985 prominent musicians including Neil Young and Willie Nelson organized “Farm Aid,” a benefit concert at the University of Illinois’s football stadium designed to raise money for struggling farmers.

At the other end of the economic spectrum, wealthy Americans thrived thanks to the policies of the New Right. The financial industry found new ways to earn staggering profits during the Reagan years. Wall Street brokers like “junk bond king” Michael Milken reaped fortunes selling high-risk, high-yield securities. Reckless speculation helped drive the stock market steadily upward until the crash of October 19, 1987. On “Black Monday,” the market plunged 508 points, erasing almost a quarter of its value. Investors lost more than $500 billion. An additional financial crisis loomed in the savings and loan industry, and Reagan’s deregulatory policies bore significant responsibility. In 1982 Reagan signed a bill increasing the amount of federal insurance available to savings and loan depositors, making those financial institutions more popular with consumers. The bill also allowed “S&Ls” to engage in high-risk loans and investments for the first time. Many such deals failed catastrophically, while some S&L managers brazenly stole from their institutions. In the late 1980s, S&Ls failed with regularity, and ordinary Americans lost precious savings. The 1982 law left the government responsible for bailing out failed S&Ls at an eventual cost of $132 billion.

 

IX. Culture Wars of the 1980s

Popular culture of the 1980s offered another venue in which conservatives and liberals waged a battle of ideas. Reagan’s militarism and patriotism pervaded movies like Top Gun and the Rambo series, starring Sylvester Stallone as a Vietnam War veteran haunted by his country’s failure to pursue victory in Southeast Asia. In contrast, director Oliver Stone offered searing condemnations of the war in Platoon and Born on the Fourth of July. Television shows like Dynasty and Dallas celebrated wealth and glamour, reflecting the pride in conspicuous consumption that emanated from the White House and corporate boardrooms during the decade. At the same time, films like Wall Street and novels like Tom Wolfe’s Bonfire of the Vanities satirized the excesses of the rich. Yet the most significant aspect of 1980s popular culture was its lack of politics altogether. Rather, Steven Spielberg’s E.T.: The Extra-Terrestrial and his Indiana Jones adventure trilogy topped the box office. Cinematic escapism replaced the serious social examinations of 1970s film. Quintessential Hollywood leftist Jane Fonda appeared frequently on television but only to peddle exercise videos.

New forms of media changed the ways in which people experienced popular culture. In many cases, this new media contributed to the privatization of life, as people shifted focus from public spaces to their own homes. Movie theaters faced competition from the video cassette recorder (VCR), which allowed people to watch films (or exercise with Jane Fonda) in the privacy of their living room. Arcades gave way to home video game systems. Personal computers proliferated, a trend spearheaded by Apple Computer and its Apple II. Television viewership—once dominated by the “big three” networks of NBC, ABC, and CBS—fragmented with the rise of cable channels that catered to particular tastes. Few cable channels so captured the popular imagination as MTV, which debuted in 1981. Telegenic artists like Madonna, Prince, and Michael Jackson skillfully used MTV to boost their reputations and album sales. Conservatives condemned music videos for corrupting young people with vulgar, anti-authoritarian messages, but the medium only grew in stature. Critics of MTV targeted Madonna in particular. Her 1989 video “Like a Prayer” drew protests for what some people viewed as sexually suggestive and blasphemous scenes. The religious right increasingly perceived popular culture as hostile to Christian values.

The Apple II computer, introduced in 1977, was the first successful mass-produced microcomputer meant for home use. Rather clunky-looking to our twenty-first-century eyes, this 1984 version of the Apple II was the smallest and sleekest model yet introduced. Indeed, it revolutionized both the substance and design of personal computers. Photograph of the Apple iicb. Wikimedia, http://commons.wikimedia.org/wiki/File:Apple_iicb.jpg.

Cultural battles were even more heated in the realm of gender and sexual politics. Abortion became an increasingly divisive issue in the 1980s. Pro-life Democrats and pro-choice Republicans grew rare, as the National Abortion Rights Action League enforced pro-choice orthodoxy on the left and the National Right to Life Committee did the same with pro-life orthodoxy on the right. Religious conservatives took advantage of the Republican takeover of the White House and Senate in 1980 to push for new restrictions on abortion—with limited success. Senators Jesse Helms of North Carolina and Orrin Hatch of Utah introduced versions of a “Human Life Amendment” to the U.S. Constitution that defined life as beginning at conception; their efforts failed, though in 1983 Hatch’s amendment came within 18 votes of passage in the Senate. Reagan, more interested in economic issues than social ones, provided only lukewarm support for these efforts. He further outraged anti-abortion activists by appointing Sandra Day O’Connor, a supporter of abortion rights, to the Supreme Court. Despite these setbacks, anti-abortion forces succeeded in defunding some abortion providers. The 1976 Hyde Amendment prohibited the use of federal funds to pay for abortions; by 1990 almost every state had its own version of the Hyde Amendment. Yet some anti-abortion activists demanded more. In 1988 evangelical activist Randall Terry founded Operation Rescue, an organization that targeted abortion clinics and pro-choice politicians with confrontational—and sometimes violent—tactics. Operation Rescue demonstrated that the fight over abortion would grow only more heated in the 1990s.

The emergence of a deadly new illness, Acquired Immunodeficiency Syndrome (AIDS), simultaneously devastated, stigmatized, and energized the nation’s homosexual community. When AIDS appeared in the early 1980s, most of its victims were gay men. For a time the disease was known as GRID—Gay-Related Immunodeficiency Disorder. The epidemic rekindled older pseudo-scientific ideas about the inherently diseased nature of homosexual bodies.

The Reagan administration met the issue with indifference, leading Congressman Henry Waxman to rage that “if the same disease had appeared among Americans of Norwegian descent…rather than among gay males, the response of both the government and the medical community would be different.” Some religious figures seemed to relish the opportunity to condemn homosexual activity; Catholic columnist Patrick Buchanan remarked that “the sexual revolution has begun to devour its children.” Homosexuals were left to forge their own response to the crisis. Some turned to confrontation—like New York playwright Larry Kramer. Kramer founded the Gay Men’s Health Crisis, which demanded a more proactive response to the epidemic. Others sought to humanize AIDS victims; this was the goal of the AIDS Memorial Quilt, a commemorative project begun in 1985. By the middle of the decade the federal government began to address the issue haltingly. Surgeon General C. Everett Koop, an evangelical Christian, called for more federal funding for AIDS-related research, much to the dismay of critics on the religious right. By 1987 government spending on AIDS-related research reached $500 million—still only 25% of what experts advocated. In 1987 Reagan convened a presidential commission on AIDS; the commission’s report called for anti-discrimination laws to protect AIDS victims and for more federal spending on AIDS research. The shift encouraged activists. Nevertheless, on issues of abortion and gay rights—as with the push for racial equality—activists spent the 1980s preserving the status quo rather than building on previous gains. This amounted to a significant victory for the New Right.

The AIDS epidemic hit the gay and African American communities particularly hard in the 1980s, prompting awareness campaigns by celebrities like Patti LaBelle. Poster, c. 1980s. Wikimedia, http://commons.wikimedia.org/wiki/File:%22Don%27t_listen_to_rumors_about_AIDS,_get_the_facts!%22_Patti_LaBelle.A025218.jpg.

 

X. The New Right Abroad

If the conservative movement recovered lost ground on the field of gender and sexual politics, it captured the battlefield of American foreign policy in the 1980s—for a time, at least. Ronald Reagan entered office a committed Cold Warrior. He held the Soviet Union in contempt, denouncing it in a 1983 speech as an “evil empire.” And he never doubted that the Soviet Union would end up “on the ash heap of history,” as he said in a 1982 speech to the British Parliament. Indeed, Reagan believed it was the duty of the United States to speed the Soviet Union to its inevitable demise. His “Reagan Doctrine” declared that the United States would supply aid to anti-communist forces everywhere in the world. To give this doctrine force, Reagan oversaw an enormous expansion of the defense budget. Federal spending on defense rose from $171 billion in 1981 to $229 billion in 1985, the highest level since the Vietnam War. He described this as a policy of “peace through strength,” a phrase that appealed to Americans who, during the 1970s, feared that the United States was losing its status as the world’s most powerful nation. Yet the irony is that Reagan, for all his militarism, helped bring the Cold War to an end. He achieved it not through nuclear weapons but through negotiation, a tactic he had once scorned.

Reagan’s election came at a time when many Americans feared their country was in an irreversible decline. American forces withdrew in disarray from South Vietnam in 1975. The United States agreed in 1978 to return control of the Panama Canal to Panama, despite protests from conservatives. Pro-American dictators were toppled in Iran and Nicaragua in 1979. The Soviet Union invaded Afghanistan that same year, leading conservatives to warn about American weakness in the face of Soviet expansion. Such warnings were commonplace in the 1970s. “Team B,” a group of intellectuals commissioned by the CIA to examine Soviet capabilities, released a report in 1976 stating that “all evidence points to an undeviating Soviet commitment to…global Soviet hegemony.” The Committee on the Present Danger, an organization of conservative foreign policy experts, issued similar statements. When Reagan warned, as he did in 1976, that “this nation has become Number Two in a world where it is dangerous—if not fatal—to be second best,” he was speaking to these fears of decline.

Margaret Thatcher and Ronald Reagan, leaders of two of the world’s most powerful countries, formed an alliance that benefited both throughout their tenures in office. Photograph of Margaret Thatcher with Ronald Reagan at Camp David, December 22, 1984. Wikimedia, http://commons.wikimedia.org/wiki/File:Thatcher_Reagan_Camp_David_sofa_1984.jpg.

The Reagan administration made Latin America a showcase for its newly assertive policies. Jimmy Carter had sought to promote human rights in the region, but Reagan and his advisers scrapped this approach and instead focused on fighting communism—a term they applied to all Latin American left-wing movements. Reagan justified American intervention by pointing out Latin America’s proximity to the United States: “San Salvador [in El Salvador] is closer to Houston, Texas, than Houston is to Washington, DC,” he said in one speech, adding, “Central America is America.” And so when communists with ties to Cuba overthrew the government of the Caribbean nation of Grenada in October 1983, Reagan dispatched the United States Marines to the island. Dubbed “Operation Urgent Fury,” the Grenada invasion overthrew the leftist government after less than a week of fighting. Despite the relatively minor nature of the mission, its success gave victory-hungry Americans something to cheer about after the military debacles of the previous two decades.

Operation Urgent Fury, as the U.S. invasion of Grenada came to be called, was broadly supported by the U.S. public, even though it was a violation of international law. This support was in large part due to incorrect intelligence disseminated by the U.S. government. This photograph shows the deployment of U.S. Army Rangers into Grenada. Photograph, October 25, 1983. Wikimedia, http://commons.wikimedia.org/wiki/File:US_Army_Rangers_parachute_into_Grenada_during_Operation_Urgent_Fury.jpg.

Grenada was the only time Reagan deployed the American military in Latin America, but the United States also influenced the region by supporting right-wing, anti-communist movements there. From 1981 to 1990, the United States gave more than $4 billion to the government of El Salvador in a largely futile effort to defeat the guerillas of the Farabundo Martí National Liberation Front (FMLN). Salvadoran security forces equipped with American weapons committed numerous atrocities, including the slaughter of almost 1,000 civilians at the village of El Mozote in December 1981. The United States also supported the contras, a right-wing insurgency fighting the leftist Sandinista government in Nicaragua. Reagan, overlooking the contras’ brutal tactics, hailed them as the “moral equivalent of the Founding Fathers.”

The Reagan administration took a more cautious approach in the Middle East, where its policy was determined by a mix of anti-communism and hostility to the Islamic government of Iran. When Iraq invaded Iran in 1980, the United States supplied Iraqi dictator Saddam Hussein with military intelligence and business credits—even after it became clear that Iraqi forces were using chemical weapons. Reagan’s greatest setback in the Middle East came in 1982, when, shortly after Israel invaded Lebanon, he dispatched Marines to the Lebanese city of Beirut to serve as a peacekeeping force. On October 23, 1983, a suicide bomber killed 241 American service members, most of them Marines, at their barracks in Beirut. Congressional pressure and anger from the American public forced Reagan to recall the Marines from Lebanon in March 1984. The decision demonstrated that, for all his talk of restoring American power, Reagan took a pragmatic approach to foreign policy. He was unwilling to risk another Vietnam by committing American troops to Lebanon.

Though Reagan’s policies toward Central America and the Middle East aroused protest, it was his policy on nuclear weapons that generated the most controversy. Initially Reagan followed the examples of presidents Nixon, Ford, and Carter by pursuing arms limitation talks with the Soviet Union. American officials participated in the Intermediate-Range Nuclear Forces (INF) talks that began in 1981 and the Strategic Arms Reduction Talks (START) that began in 1982. But the breakdown of these talks in 1983 led Reagan to proceed with plans to place Pershing II nuclear missiles in Western Europe to counter Soviet SS-20 missiles in Eastern Europe. Reagan went a step further in March 1983, when he announced plans for a “Strategic Defense Initiative,” a space-based system that could shoot down incoming Soviet missiles. Critics derided the program as a “Star Wars” fantasy, and even Reagan’s advisors harbored doubts. “We don’t have the technology to say this,” Secretary of State George Shultz told aides. These aggressive policies fed a growing “nuclear freeze” movement throughout the world. In the United States, organizations like the Committee for a Sane Nuclear Policy organized protests that culminated in a June 1982 rally that drew almost a million people to New York City’s Central Park.

President Reagan proposed space- and ground-based systems to protect the United States from nuclear missiles in his 1983 Strategic Defense Initiative (SDI). Scientists argued it was unrealistic or impossible with contemporary technology, and it was lambasted in the media as “Star Wars.” Indeed, as this artist’s representation of SDI shows, it was rather ridiculous. Created October 18, 1984. Wikimedia, http://commons.wikimedia.org/wiki/File:Space_Laser_Satellite_Defense_System_Concept.jpg.

Protests in the streets were echoed by opposition in Congress. Congressional Democrats opposed Reagan’s policies on the merits; congressional Republicans, though they supported Reagan’s anti-communism, were wary of the administration’s fondness for circumventing Congress. In 1982 the House voted 411-0 to approve the Boland Amendment, which barred the United States from supplying funds to overthrow Nicaragua’s Sandinista government. A second Boland Amendment in 1984 prohibited any funding for the anti-Sandinista contra movement. The Reagan administration’s determination to flout these amendments led to a scandal that almost destroyed Reagan’s presidency. Robert McFarlane, the president’s National Security Advisor, and Oliver North, a member of the National Security Council, raised money to support the contras by selling American missiles to Iran and funneling the money to Nicaragua. When their scheme was revealed in 1986, it was hugely embarrassing for Reagan. The president’s underlings had not only violated the Boland Amendments but had also, by selling arms to Iran, made a mockery of Reagan’s declaration that “America will never make concessions to the terrorists.” But while the Iran-Contra affair generated comparisons to the Watergate scandal, investigators were never able to prove Reagan knew about the operation. Without such a “smoking gun,” talk of impeaching Reagan remained talk.

Though the Iran-Contra scandal tarnished the Reagan administration’s image, it did not derail Reagan’s most significant achievement: easing tensions with the Soviet Union. This would have seemed impossible in Reagan’s first term, when the president exchanged harsh words with a succession of Soviet leaders—Leonid Brezhnev, Yuri Andropov, and Konstantin Chernenko. In 1985, however, Chernenko’s death handed leadership of the Soviet Union to Mikhail Gorbachev. Gorbachev, a true believer in socialism, nonetheless realized that the Soviet Union desperately needed reform. He instituted a program of perestroika, which referred to the restructuring of the Soviet system, and of glasnost, which meant greater transparency in government. Gorbachev also reached out to Reagan in hopes of negotiating an end to the arms race that was bankrupting the Soviet Union. Reagan and Gorbachev met in Geneva, Switzerland in 1985 and Reykjavik, Iceland in 1986, where, although they could not agree on anything concrete—thanks to Reagan’s refusal to limit the Strategic Defense Initiative—they developed a rapprochement unprecedented in the history of US-Soviet relations. This trust made possible the Intermediate-Range Nuclear Forces (INF) Treaty of 1987, which committed both sides to a sharp reduction in their nuclear arsenals.

By the late 1980s the Soviet empire was crumbling. Some credit must go to Reagan, who successfully combined anti-communist rhetoric—such as his 1987 speech at the Berlin Wall, where he declared, “General Secretary Gorbachev, if you seek peace…tear down this wall!”—with a willingness to negotiate with Soviet leadership. But the real causes of collapse lay within the Soviet empire itself. Soviet-allied governments in Eastern Europe tottered under pressure from dissident organizations like Poland’s Solidarity and East Germany’s Neues Forum; some of these countries were also pressured from within by the Roman Catholic Church, which had turned toward active anti-communism under Pope John Paul II. When Gorbachev made it clear that he would not send the Soviet military to prop up these regimes, they collapsed one by one in 1989—in Poland, Hungary, Czechoslovakia, Romania, Bulgaria, and East Germany. Within the Soviet Union, Gorbachev’s proposed reforms, rather than bring stability, instead unraveled the decaying Soviet system. By 1991 the Soviet Union itself had vanished, dissolving into a “Commonwealth of Independent States.”

 

XI. Conclusion

Reagan left office with the Cold War waning and the economy booming. Unemployment had dipped to 5% by 1988. Between 1981 and 1986, gas prices fell from $1.38 per gallon to 95¢. The stock market recovered from the crash, and the Dow Jones Industrial Average—which stood at 950 in 1981—reached 2,239 by the end of Reagan’s second term. Yet, the economic gains of the decade were unequally distributed. The top fifth of households enjoyed rising incomes while the rest stagnated or declined. In constant dollars, annual CEO pay rose from $3 million in 1980 to roughly $12 million during Reagan’s last year in the White House. Between 1985 and 1989 the number of Americans living in poverty remained steady at 33 million. Real per capita money income grew at only 2% per year, a rate roughly equal to the Carter years. The American economy saw more jobs created than lost during the 1980s, but half of the jobs eliminated were in high-paying industries. Furthermore, half of the new jobs failed to pay wages above the poverty line. The economic divide was most acute for African Americans and Latinos, one-third of whom qualified as poor. Trickle-down economics, it seemed, rarely trickled down.

The conservative triumph of the Reagan years proved incomplete. The number of government employees actually increased under Reagan. With more than 80% of the federal budget committed to defense, entitlement programs, and interest on the national debt, the right’s goal of deficit elimination foundered for lack of substantial areas to cut. Between 1980 and 1989 the national debt rose from $914 billion to $2.7 trillion. Despite steep tax cuts for corporations and the wealthy, the overall tax burden of the American public remained essentially unchanged. Moreover, regressive taxes on payroll and certain goods increased the tax burden on low- and middle-income Americans. Finally, Reagan slowed but failed to vanquish the five-decade legacy of economic liberalism. Most New Deal and Great Society programs proved durable. Government still offered its neediest citizens a safety net, albeit a continually shrinking one.

Yet the discourse of American politics had irrevocably changed. The preeminence of conservative political ideas grew ever more pronounced, even when Democrats controlled Congress or the White House. Indeed, the Democratic Party adapted its own message in response to the conservative mood of the country. The United States was on a rightward path.

 

This chapter was edited by Richard Anderson and William J. Schultz, with content contributions by Richard Anderson, Laila Ballout, Marsha Barrett, Seth Bartee, Eladio Bobadilla, Kyle Burke, Andrew Chadwick, Jennifer Donnally, Leif Fredrickson, Kori Graves, Karissa A. Haugeberg, Jonathan Hunt, Stephen Koeth, Colin Reynolds, William J. Schultz, and Daniel Spillman.


28. The Unraveling

Abandoned Packard Automotive Plant in Detroit, Michigan. Via Wikimedia.


I. Introduction

Like many young Americans in 1969, Meredith Hunter was a fan of rock ‘n’ roll. When news spread that the Rolling Stones were playing a massive free concert at California’s Altamont Motor Speedway, Hunter, who was black, made plans to attend with his white girlfriend. But his sister, Dixie, protested. She later recalled, “It was a time when black men and white women were not supposed to be together.” Their home, Berkeley, was more tolerant but, she explained, “things [were] different in Berkeley than the outskirts of town.” She feared what might happen.

Meredith went anyway. He joined 300,000 others eager to hear classic sixties bands—Jefferson Airplane, the Grateful Dead, and, of course, the Rolling Stones—for free. Altamont was to be the climax of the Stones’ first American tour in three years and would feature in the documentary (later released as Gimme Shelter) recording it, but the concert was a disorganized disaster. Inadequate sanitation, a horrid sound system, and tainted drugs contributed to a tense and uneasy atmosphere. The Hell’s Angels biker gang were paid $500 worth of beer to serve as the show’s “security team.”

High on dope and armed with sawed-off pool cues, the Angels indiscriminately beat concert-goers who tried to come on the stage. One of those was Meredith Hunter. High on methamphetamines, Hunter approached the stage multiple times and, growing agitated, brandished a revolver. He was promptly stabbed to death by an Angel and his lifeless body was stomped into the ground. The Stones, unaware of the murder just a few feet away, continued jamming “Sympathy for the Devil.”

If the more famous Woodstock music festival typified an idyllic sixties youth culture, Altamont revealed a darker side of American culture, one in which drugs and music were associated not with peace and love but with violence, anger, and death. While many Americans continued to celebrate the political and cultural achievements of the 1960s, a more anxious, conservative mood settled over the country. For some, the United States had not gone nearly far enough to promote greater social equality. For others, the nation had gone too far, unfairly trampling the rights of one group to promote the selfish needs of others. Onto these brewing dissatisfactions the 1970s dumped the divisive remnants of a failed war, the country’s greatest political scandal, and an intractable economic crisis. To many, it seemed as if the nation stood ready to unravel.

 

II. Vietnam

Frank Wolfe, Vietnam War protestors at the March on the Pentagon, Lyndon B. Johnson Library via Wikimedia, http://commons.wikimedia.org/wiki/File:Vietnam_War_protestors_at_the_March_on_the_Pentagon.jpg.

Perhaps no single issue contributed more to public disillusionment than the Vietnam War. The “domino theory”—the idea that if a country fell to communism, then neighboring states would soon follow—governed American foreign policy. After the communist takeover of China in 1949, the United States financially supported the French military’s effort to retain control over its colonies in Vietnam, Cambodia and Laos. But the French were defeated in 1954 and Vietnam was divided into the communist North and anti-communist South.

The American public remained largely unaware of Vietnam in the early 1960s, even as President John F. Kennedy deployed over sixteen thousand military advisers to help South Vietnam suppress a domestic communist insurgency. This all changed in 1964, when Congress passed the Gulf of Tonkin Resolution after a minor episode involving American and North Vietnamese naval forces. The Johnson administration distorted the incident to provide a pretext for escalating American involvement in Vietnam. The resolution authorized the president to send bombs and troops into Vietnam. Only two senators opposed the resolution.

The first combat troops arrived in South Vietnam in 1965 and, as the situation deteriorated, the Johnson administration escalated American involvement. Soon hundreds of thousands of troops were deployed. Stalemate, body counts, hazy war aims, and the draft all catalyzed the anti-war movement and triggered protests throughout the United States and Europe. With no end in sight, protesters burned their draft cards, refused to pay income taxes, occupied government buildings, and delayed trains loaded with war materials. By 1967, anti-war demonstrations drew crowds in the hundreds of thousands. In one protest, hundreds were arrested after surrounding the Pentagon.

Vietnam was the first “living room war.” Television, print media, and liberal access to the battlefield provided unprecedented coverage of the war’s brutality. Americans confronted grisly images of casualties and atrocities. In 1965, CBS Evening News aired a segment in which United States Marines burned the South Vietnamese village of Cam Ne with little apparent regard for the lives of its occupants, who had been accused of aiding Viet Cong guerrillas. President Johnson berated the head of CBS, yelling “Your boys just shat on the American flag.”

While the U. S. government imposed no formal censorship on the press during Vietnam, the White House and military nevertheless used press briefings and interviews to paint a positive image of the war effort. The United States was winning the war, officials claimed. They cited numbers of enemies killed, villages secured, and South Vietnamese troops trained. American journalists in Vietnam, however, quickly realized the hollowness of such claims (the press referred to the afternoon press briefings in Saigon as “the Five O’Clock Follies”). Editors frequently toned down their reporters’ pessimism, often citing conflicting information received from their own sources, who were typically government officials. But the evidence of a stalemate mounted. American troop levels climbed yet victory remained elusive. Stories like CBS’s Cam Ne piece exposed the “credibility gap,” the yawning chasm between the claims of official sources and the reality on the ground in Vietnam.

Nothing did more to expose this gap than the 1968 Tet Offensive. In January, communist forces engaged in a coordinated attack on more than one hundred American and South Vietnamese sites throughout South Vietnam, including the American embassy in Saigon. While U.S. forces repulsed the attack and inflicted heavy casualties on the Viet Cong, Tet demonstrated that, despite repeated claims by administration officials, after years of war the enemy could still strike at will anywhere in the country. Subsequent stories and images eroded public trust even further. In 1969, investigative reporter Seymour Hersh revealed that U.S. troops had massacred hundreds of civilians in the village of My Lai. Three years later, Americans cringed at Nick Ut’s wrenching photograph of a naked Vietnamese child fleeing an American napalm attack. More and more American voices came out against the war.

Reeling from the war’s growing unpopularity, on March 31, 1968, President Johnson announced on national television that he would not seek reelection. Eugene McCarthy and Robert F. Kennedy unsuccessfully battled against Johnson’s vice president, Hubert Humphrey, for the Democratic Party nomination (Kennedy was assassinated in June). At the Democratic Party’s national convention in Chicago, local police brutally assaulted protestors on national television. Republican challenger Richard Nixon ran on a platform of “law and order” and a vague plan to end the war. Well aware of domestic pressure to wind down the war, Nixon sought, on the one hand, to appease anti-war sentiment by promising to phase out the draft, train South Vietnamese troops, and gradually withdraw American troops. He called it “Vietnamization.” At the same time, however, Nixon appealed to the so-called “silent majority” of Americans who still supported the war and opposed the anti-war movement by calling for an “honorable” end to the war (he later called it “peace with honor”). He narrowly edged Humphrey in the fall’s election.

“Tragedy at Kent,” May 15, 1970, Life Magazine, http://life.tumblr.com/post/50507601384/on-this-day-in-life-may-15-1970-tragedy-at.

Public assurances of American withdrawal, however, masked a dramatic escalation of conflict. Looking to incentivize peace talks, Nixon pursued a “madman strategy” of attacking communist supply lines across Laos and Cambodia, hoping to convince the North Vietnamese that he would do anything to stop the war. Conducted without public knowledge or congressional approval, the bombings failed to spur the peace process, and talks stalled before the American-imposed November 1969 deadline. News of the attacks renewed anti-war demonstrations. Police and National Guard troops killed six students in separate protests at Jackson State University in Mississippi and, more famously, at Kent State University in Ohio in 1970.

Another three years passed—and another 20,000 American troops died—before an agreement was reached. After Nixon threatened to withdraw all aid and guaranteed to enforce a treaty militarily, the North and South Vietnamese governments signed the Paris Peace Accords in January of 1973, marking the official end of U. S. force commitment to the Vietnam War. Peace was tenuous, and when war resumed North Vietnamese troops quickly overwhelmed Southern forces. By 1975, despite nearly a decade of direct American military engagement, Vietnam was united under a communist government.

The fate of South Vietnam illustrates Nixon’s ambivalent legacy in American foreign policy. By holding out for “peace with honor,” Nixon lengthened the war and widened its impact. Nixon and other Republicans later blamed the media for America’s defeat, arguing that negative reporting undermined public support for the war. In 1971, the Nixon administration tried unsuccessfully to sue the New York Times and the Washington Post to prevent the publication of the Pentagon Papers, a confidential and damning history of U. S. involvement in Vietnam that was commissioned by the Defense Department and later leaked. Nixon faced a rising tide of congressional opposition to the war, led by prominent senators such as William Fulbright. Congress asserted unprecedented oversight of American war spending. And in 1973, Congress passed the War Powers Resolution, which dramatically reduced the president’s ability to wage war without congressional consent.

The Vietnam War profoundly shaped domestic politics. Moreover, it poisoned Americans’ perceptions of their government and its role in the world. And yet, while the anti-war demonstrations attracted considerable media attention and stand as a hallmark of the sixties counterculture so popularly remembered today, many Americans nevertheless continued to regard the war as just. Wary of the rapid social changes that reshaped American society in the 1960s and worried that anti-war protests further threatened an already tenuous civil order, a growing number of Americans criticized the protests and moved closer to a resurgent American conservatism that brewed throughout the 1970s.

 

III. The Politics of Love, Sex, and Gender

Warren K. Leffler, Demonstrators opposed to the ERA in front of the White House, 1977, via Library of Congress, http://www.loc.gov/item/2002712194/.

Many looked optimistically at what the seventies might offer. Some hoped, like George Clinton’s funk band Funkadelic, that Americans might dance together under a disco glitter ball as “one nation under a groove.” Many Americans—feminists, gay men, lesbians, and straight married couples alike—carried the sexual revolution further. Whether or not they rejected the monogamy and rigid gender roles at the heart of the nuclear family, American women had fewer children, cohabitation without marriage spiked, straight couples married later (if at all), and divorce levels climbed. Sexuality, decoupled from marriage and procreation, was transformed into a source of personal fulfillment and a worthy political cause.

At the turn of the decade, sexuality was considered a private matter, but one closely linked to civil rights. American law defined legitimate sexual expression within the confines of patriarchal, procreative, middle-class marriage. Interracial marriage was illegal in many states until 1967 and remained largely taboo throughout the 1970s, while government-led sterilization programs threatened the reproductive freedom of poor women of color. Same-sex intercourse and cross-dressing were criminalized in most states, and gay men, lesbians, and transgender people were vulnerable to violent police enforcement as well as discrimination in housing and employment.

Two landmark legal rulings in 1973 established the battle lines for the “sex wars” of the 1970s. First, the Supreme Court’s 7-2 ruling in Roe v. Wade struck down a Texas law that prohibited abortion in all cases when a mother’s life was not in danger. The Court’s decision built upon precedent from a 1965 ruling that, in striking down a Connecticut law prohibiting married couples from using birth control, recognized a constitutional “right to privacy.” In Roe, the Court reasoned that “this right of privacy . . . is broad enough to encompass a woman’s decision whether or not to terminate her pregnancy.” The Court held that states could not interfere with a woman’s right to an abortion during the first trimester of pregnancy and could only fully prohibit abortions during the third trimester. Other Supreme Court rulings, however, held that sexual privacy could be sacrificed for the sake of the “community” good. Another 1973 decision, Miller v. California, held that the First Amendment did not protect “obscene” material, defined by the Court as anything with sexual appeal that lacked “serious literary, artistic, political, or scientific value.” The ruling expanded states’ abilities to pass laws prohibiting materials like hardcore pornography. State laws were unevenly enforced, however, and pornographic theaters and sex shops proliferated. Americans debated whether these were immoral atrocities, the “vanguard of a new ‘pansexual’ utopia,” as one bathhouse owner called it, or “the ultimate conclusion of sexist logic,” as poet and lesbian feminist Rita Mae Brown charged.

Furthermore, new laws prohibiting employment discrimination increased opportunities for women to make a living outside of the home and marriage. Women—haltingly and with significant disparities—advanced into traditional male occupations, into politics, and into corporate management.

The seventies saw the reform of divorce law. Between 1959 and 1979 the American divorce rate doubled, and close to half of all marriages formed in the 1970s ended in divorce. The stigma attached to divorce evaporated and American culture encouraged individuals to leave abusive or unfulfilling marriages. Before 1969, most states required one spouse to prove that the other was guilty of a specific offense, such as adultery. The difficulty of getting a divorce under this system encouraged widespread lying in divorce courts. Even couples desiring an amicable split were sometimes forced to claim that one spouse had cheated on the other even if neither (or both) had. Other couples temporarily relocated to states with more lenient divorce laws, such as Nevada. Widespread recognition of such practices prompted reforms. In 1969, California adopted the first no-fault divorce law. By the end of the 1970s, almost every state had adopted some form of no-fault divorce. The new laws allowed for divorce on the basis of “irreconcilable differences,” even if only one party felt that he or she could not stay in the marriage.

As straight couples eased the bonds of matrimony, gay men and women negotiated a harsh world that stigmatized homosexuality as a mental illness or depraved immoral act. Building upon postwar efforts by gay rights organizations to bring homosexuality into the mainstream of American culture, young gay activists of the late sixties and seventies began to challenge what they saw as the conservative gradualism of the “homophile” movement. Inspired by the burgeoning radicalism of the Black Power movement, the New Left protests of the Vietnam War, and the counterculture movement for sexual freedom, gay and lesbian activists agitated for a broader set of sexual rights that emphasized an assertive notion of “liberation” rooted not in mainstream assimilation, but in pride of sexual difference.

Perhaps no single incident did more to galvanize gay and lesbian activism than the 1969 uprising at the Stonewall Inn in New York City’s Greenwich Village. Police regularly raided gay bars and hangouts. But when police raided the Stonewall in June 1969, the bar patrons protested and sparked a multi-day street battle that catalyzed a national movement for gay liberation. Seemingly overnight, calls for homophile respectability were replaced with chants of “Gay Power!”

The window under the Stonewall sign reads: “We homosexuals plead with our people to please help maintain peaceful and quiet conduct on the streets of the Village–Mattachine.” Stonewall Inn 1969, Wikimedia, http://commons.wikimedia.org/wiki/File:Stonewall_Inn_1969.jpg.

In the seventies, gay activists attacked a popular culture that demanded they keep their sexuality hidden. Activists urged gay Americans to “come out,” and gay rights organizations cited statistics showing that secrecy contributed to stigma and that “coming out” could reduce suicide rates. All movements, however, proceed haltingly. Transgender people were often banned from participating in Gay Pride rallies and lesbian feminist conferences, and they, in turn, mobilized to fight the high incidence of rape, abuse, and murder of transgender people. Activists now declared “all power to Trans Liberation.”

In the following years, gay Americans gained unprecedented access to private and public spaces. A step toward the “normalization” of homosexuality occurred in 1973, when the American Psychiatric Association stopped classifying homosexuality as a mental illness. Pressure mounted on politicians. In 1982, Wisconsin became the first state to ban discrimination based on sexual orientation, and more than eighty cities and nine states followed suit over the next decade. But progress proceeded unevenly, and gay Americans continued to suffer hardships from a hostile culture.

As events in the 1970s broadened sexual freedoms and promoted greater gender equality, so too did they generate sustained and organized opposition. Evangelical Christians and other moral conservatives, for instance, mobilized to reverse gay victories. In 1977, activists in Dade County, Florida, campaigning under the slogan “Save Our Children,” won a referendum that overturned a county ordinance banning discrimination based on sexual orientation. A leader of the brewing religious right, Jerry Falwell, said in 1980 that “It is now time to take a stand on certain moral issues …. We must stand against the Equal Rights Amendment, the feminist revolution, and the homosexual revolution. We must have a revival in this country.”

The most stunning conservative counterattack of the seventies was the defeat of the Equal Rights Amendment (ERA). Versions of the Amendment, which declared, “Equality of rights under the law shall not be denied or abridged by the United States or any state on account of sex,” had been introduced in Congress nearly every year since 1923. It finally passed amid the revolutions of the sixties and seventies and went to the states for ratification in March 1972. With high approval ratings, the ERA seemed destined to pass swiftly through the state legislatures and become the Twenty-Seventh Amendment. Hawaii ratified the Amendment the same day it passed Congress, and within a year thirty states had done likewise. But then the Amendment stalled, and further ratifications came only slowly. In 1977, Indiana became the thirty-fifth and last state to ratify.

By 1977, anti-ERA forces had gathered and deployed their strength against the Amendment. At a time when many women shared Betty Friedan’s frustration that society seemed to confine women to the role of homemaker, Phyllis Schlafly’s STOP ERA organization (“Stop Taking Our Privileges”) trumpeted the value and advantages of homemakers and mothers. Schlafly worked tirelessly to stifle the ERA. She lobbied legislators and organized counter-rallies to ensure that Americans heard “from the millions of happily married women who believe in the laws which protect the family and require the husband to support his wife and children.” The Amendment had needed only three more states for ratification. It never got them. In 1982 the ratification crusade expired.

The failed battle for the ERA uncovered the limits of the feminist crusade. And it illustrated the women’s movement’s inherent incapacity to represent fully the views of fifty percent of the country’s population, a population riven by class differences, racial disparities, and cultural and religious divisions.

 

IV. Race and Social and Cultural Anxieties

Los Angeles police hustle rioter into car, August 13, 1965, Wikimedia, http://commons.wikimedia.org/wiki/File:Wattsriots-policearrest-loc.jpg.

The lines of race and class and culture ruptured American life throughout the 1970s. Americans grew disenchanted with the pace of social change: it was insufficient, some said; it was excessive, said others. The idealism of the 1960s died. Alienation took its place.

As the monolith of American culture shattered—a monolith pilloried in the fifties and sixties as exclusively white, male-dominated, conservative, and stifling—the culture seemed to fracture and Americans retreated into tribal subcultures. Mass culture became segmented. Marketers targeted particular products to ever smaller pieces of the population, including previously neglected groups such as African Americans, who, despite continuing inequality, acquired more disposable income. Subcultures often revolved around certain musical styles, whether pop, disco, hard rock, punk rock, country, or hip-hop. Styles of dress and physical appearance likewise aligned with cultures of choice.

If the popular rock acts of the sixties appealed to a new counterculture, the seventies witnessed the resurgence of cultural forms that appealed to a white working class confronting the social and political upheavals of the 1960s. Country hits such as Merle Haggard’s “Okie from Muskogee” evoked simpler times and places where people “still wave Old Glory down at the courthouse” and “don’t let our hair grow long and shaggy like the hippies out in San Francisco.” A popular television sitcom, All in the Family, became an unexpected hit among “middle America.” Its main character, Archie Bunker, was designed to mock reactionary middle-aged white men. “Isn’t anyone interested in upholding standards?” he lamented in an episode dealing with housing integration. “Our world is coming crumbling down. The coons are coming!”

CBS Television, All in the Family Cast 1973, Wikimedia, http://commons.wikimedia.org/wiki/File:All_In_the_Family_cast_1973.JPG.

As Bunker knew, African Americans were becoming much more visible in American culture. While black cultural forms had been prominent throughout American history, they assumed new popular forms in the 1970s. Disco offered a new, optimistic, racially integrated pop music. Behind the scenes, African American religious styles became an outsized influence on pop music. Musicians like Aretha Franklin, Andraé Crouch, and “fifth Beatle” Billy Preston brought their backgrounds in church performance to their own recordings as well as to the work of white artists like the Rolling Stones, with whom they collaborated. And by the end of the decade African American musical artists had introduced American society to one of the most significant musical innovations in decades: the Sugarhill Gang’s 1979 record, “Rapper’s Delight.” A lengthy paean to black machismo, it became the first rap single to reach the top 40.

Just as rap represented a hyper-masculine black cultural form, Hollywood popularized its white equivalent. Films such as 1971’s Dirty Harry captured a darker side of the national mood. Clint Eastwood’s titular character exacted violent justice on clear villains, working within the sort of brutally simplistic ethical standard that appealed to Americans anxious about a perceived breakdown in “law and order” (more than one critic slammed the film’s glorified “fascism”) and the need for violent reprisals.

Violence increasingly marked American race relations. No longer confined to the anti-black terrorism that struck the southern civil rights movement in the 1950s and 1960s, violence now broke out across the country among blacks in urban riots and among whites protesting new civil rights programs. In the mid-1970s, for instance, protests over the use of busing to integrate public schools in Boston erupted in violence among whites and blacks.

Racial violence in the nation’s cities tainted many white Americans’ perception of the civil rights movement and urban life in general. Civil unrest broke out across the country, but the riots in Watts/Los Angeles (1965), Newark (1967), and Detroit (1967) were the most shocking. In each, a physical altercation between white police officers and African Americans spiraled into days of chaos and destruction. Tens of thousands participated in urban riots. Many looted and destroyed white-owned businesses. There were dozens of deaths, tens of millions of dollars in property damage, and an exodus of white capital that only further isolated urban poverty.

In 1967, President Johnson appointed the Kerner Commission to investigate the causes of America’s riots. Their report became an unexpected bestseller. The Commission cited black frustration with the hopelessness of urban poverty. As the head of the black National Business League testified, “It is to be more than naïve—indeed, it is a little short of sheer madness—for anyone to expect the very poorest of the American poor to remain docile and content in their poverty when television constantly and eternally dangles the opulence of our affluent society before their hungry eyes.” A Newark rioter who looted several boxes of shirts and shoes put it more simply: “They tell us about that pie in the sky but that pie in the sky is too damn high.” But white conservatives blasted the conclusion that white racism and economic hopelessness were to blame for the violence. African Americans wantonly destroying private property, they said, was not a symptom of America’s intractable racial inequalities, but the logical outcome of a liberal culture of permissiveness that tolerated, even encouraged, nihilistic civil disobedience. Many moderates and liberals, meanwhile, saw the explosive violence as a sign African Americans had rejected the nonviolent strategies of the civil rights movement.

The unrest of the late ‘60s did, in fact, reflect a real and growing disillusionment among African Americans with the fate of the civil rights crusade. Political achievements such as the 1964 Civil Rights Act and the 1965 Voting Rights Act were indispensable legal preconditions for social and political equality, but the movement’s long (and now often forgotten) goal of economic justice proved as elusive as ever. In 1968, Martin Luther King Jr. organized the Poor People’s Campaign, a multi-racial struggle to uproot America’s entrenched poverty. “I worked to get these people the right to eat cheeseburgers,” King supposedly said to Bayard Rustin as they toured the devastation in Watts some years earlier, “and now I’ve got to do something…to help them get the money to buy it.” What good was the right to enter a store without money for purchases?

 

V. Deindustrialization and the Rise of the Sunbelt

Abandoned Youngstown factory, via Flickr user stu_spivack.

Though black leaders like King and Rustin denounced urban violence, they recognized the frustrations that fueled it. In the still-moldering ashes of Jim Crow, African Americans in Watts and similar communities across the country bore the burdens of lifetimes of legally sanctioned discrimination in housing, employment, and credit. The inner cities had become traps that too few could escape.

Segregation survived the legal dismantling of Jim Crow. The persistence of stark racial and economic segregation in nearly all American cities into the present day destroyed any simple distinction between southern “de jure” segregation and non-southern “de facto” segregation.

Meanwhile, whites and white-owned businesses fled the inner cities, depleted municipal tax bases, and left behind islands of poverty. This flight of people and capital was felt most acutely in the deindustrializing cities of the Northeast and Midwest. Few cases better illustrate these transformations than Detroit. As the automobile industry expanded and especially as the United States transitioned to a wartime economy during World War II, Detroit boomed. When auto manufacturers like Ford and General Motors converted their assembly lines to build machines for the American war effort, observers dubbed the city the “arsenal of democracy.” Newcomers from around the country flooded the city looking for work. Between 1940 and 1947, manufacturing employment increased by 40 percent, and between 1940 and 1943 the number of unemployed workers fell from 135,000 to a mere four thousand. Thanks to New Deal labor legislation and the demands of war, unionized workers in Detroit and elsewhere enjoyed secure employment and increased wages. A vast middle class populated a thriving city with beautiful public architecture, theaters, and libraries.

Workers made material gains throughout the 1940s and 1950s. During the so-called “Great Compression,” Americans of all classes shared in postwar prosperity. A highly progressive tax system and powerful unions lowered income inequality. Rich and poor advanced together. Working-class standards of living nearly doubled between 1947 and 1973, and unemployment continually fell.

But general prosperity masked deeper vulnerabilities. After the war automobile firms began closing urban factories and moving to outlying suburbs. Several factors fueled the process. Some cities partly deindustrialized themselves. Municipal governments in San Francisco, St. Louis, and Philadelphia banished light industry to make room for high-rise apartments and office buildings. Mechanization seemed to contribute to the decline of American labor. A manager at a newly automated Ford engine plant in postwar Cleveland captured the interconnections between these concerns when he glibly noted to United Automobile Workers (UAW) president Walter Reuther, “you are going to have trouble collecting union dues from all of these machines.” More importantly, however, manufacturing firms sought to lower labor costs by automating, downsizing, and relocating to areas with “business friendly” policies such as low tax rates, anti-union “right-to-work” laws, and low wages.

Detroit began to bleed industrial jobs. Between 1950 and 1958, Chrysler cut its Detroit production workforce in half. In the years between 1953 and 1959, East Detroit lost ten plants and over seventy-one thousand jobs. Detroit was a single-industry town, built upon the auto industry. Decisions made by the “Big Three” automakers therefore reverberated across the city’s industrial landscape. When auto companies mechanized or moved their operations, ancillary suppliers such as machine tool companies were cut out of the supply chain and likewise forced to cut their own workforces. Between 1947 and 1977, the number of manufacturing firms in the city dropped from 3,272 to fewer than two thousand. The labor force was gutted. Manufacturing jobs fell from 338,400 to 153,000 over the same three decades.

Industrial restructuring hurt workers of all races, and many middle-class blacks managed to move out of the city’s ghettoes, but deindustrialization fell heaviest on the city’s African Americans. By 1960, 19.7 percent of black autoworkers in Detroit were unemployed, compared to just 5.8 percent of whites. Overt discrimination in housing and employment had for decades confined blacks to segregated neighborhoods where they were forced to pay exorbitant rents for slum housing. Subject to residential intimidation and cut off from traditional sources of credit, few blacks could afford to follow industry as it left the city for the suburbs and other parts of the country. Detroit devolved into a mass of unemployment, crime, and crippled municipal resources. When riots rocked Detroit in 1967, 25 to 30 percent of blacks between the ages of eighteen and twenty-four were unemployed.

Deindustrialization went hand in hand with the long assault on unionization that had begun in the aftermath of World War II. Without the political support they had enjoyed during the New Deal years, unions such as the Congress of Industrial Organizations (CIO) and the United Auto Workers (UAW) shifted tactics and accepted labor-management accords in which cooperation, not agitation, was the strategic objective. This accord held mixed results for workers. On the one hand, management encouraged employee loyalty through privatized welfare systems that offered workers health benefits and pensions. Grievance arbitration and collective bargaining also allowed workers official channels in which to criticize and push for better conditions. At the same time, unions became increasingly weighed down by bureaucracy and corruption. Union management came to hold primary influence in what was ostensibly a “pluralistic” power relationship, and workers—though still willing to protest—by necessity pursued a more moderate agenda compared to the union workers of the 1930s and 40s.

The decline of labor coincided with ideological changes within American liberalism. Labor and its political concerns undergirded Roosevelt’s New Deal coalition, but by the 1960s many liberals had forsaken working class politics. More and more saw poverty as stemming not from structural flaws in the national economy, but from the failure of individuals to take full advantage of the American system. For instance, while Roosevelt’s New Deal might have attempted to rectify unemployment with government jobs, Johnson’s Great Society and its imitators funded government-sponsored job training, even in places without available jobs. Union leaders in the ‘50s and ‘60s typically supported such programs and philosophies.

Widely shared postwar prosperity leveled off and began to retreat by the mid-1970s. Growing international competition, technological inefficiency, and declining productivity gains stunted working- and middle-class wages. As the country entered recession, wages decreased and the pay gap between workers and management began its long widening. The tax code became less progressive and labor lost its foothold in the marketplace. Unions represented a third of the workforce in the 1950s, but only one in ten workers belonged to one by 2006.

Geography dictated much of labor’s fall. American firms fled pro-labor states in the 1970s and 1980s. Some went overseas in the wake of new trade treaties to exploit low-wage foreign workers, but others turned to the anti-union states in the South and West stretching from Virginia to Texas to southern California. Factories shuttered in the North and Midwest, and by the 1980s commentators had dubbed America’s former industrial heartland the “Rust Belt.”

Coined by journalist Kevin Phillips in 1969, the “Sun Belt” refers to the swath of southern and western states that saw unprecedented economic, industrial, and demographic growth after World War II. During the New Deal, President Franklin D. Roosevelt declared the American South “the nation’s No. 1 economic problem” and injected massive federal subsidies, investments, and military spending into the region. During the Cold War, Sun Belt politicians lobbied hard for military installations and government contracts for their states.

Meanwhile, the region’s hostility toward labor beckoned corporate leaders. The Taft-Hartley Act of 1947 facilitated southern states’ frontal assault on unions. Thereafter, cheap, nonunionized labor, low wages, and lax regulations stole northern industries away from the Rust Belt. Skilled northern workers followed the new jobs southward and westward, lured by cheap housing and a warm climate slowly made more tolerable by modern air conditioning.

The South attracted business but struggled to share the profits widely. Middle-class whites grew prosperous, but often these were recent transplants, not native southerners. As the cotton economy shed farmers and laborers, poor white and black southerners found themselves mostly excluded from the fruits of the Sun Belt. Public investments were scarce: white southern politicians channeled federal funding away from primary and secondary public education and toward high-tech industry and university-level research. The Sun Belt inverted Rust Belt realities: the South and West gained growing numbers of high-skill, high-wage jobs but lacked the social and educational infrastructure needed to supply the native poor and middle classes with those same jobs.

Although massive federal investments sparked the Sun Belt’s explosive growth, the New Right took its firmest hold there. The South ran rife with conservative religious ideas, which it exported westward. The leading figures of the nascent religious right rose to prominence in the Sun Belt. Moreover, business-friendly politicians successfully synthesized conservative Protestantism and free-market ideology, creating a potent new political force.

Sun Belt cities were automobile cities. They sprawled across the landscape, and public space was more limited than in older, denser cities. Politics often revolved around suburban life. Housewives organized reading groups in their homes, and from those reading groups sprouted new organized political activities. Prosperous and mobile, old and new suburbanites gravitated toward an individualistic vision of free enterprise espoused by the Republican Party. Some, especially those most vocally anti-communist, joined groups such as the Young Americans for Freedom and the John Birch Society. Less radical suburban voters, however, gravitated toward the more moderate brand of conservatism promoted by Richard Nixon.

 

VI. Nixon

Richard Nixon campaigns in Philadelphia during the 1968 presidential election. National Archives via Wikimedia.

Once installed in the White House, Richard Nixon focused his energies on shaping American foreign policy. He publicly announced the “Nixon Doctrine” in 1969. While asserting the supremacy of American democratic capitalism, and conceding that the U. S. would continue supporting its allies financially, he denounced previous administrations’ willingness to commit American forces to third world conflicts and warned other states to assume responsibility for their own defense. He was turning America away from the policy of active, anti-communist containment, and toward a new strategy of “détente.”

Promoted by national security advisor and eventual Secretary of State Henry Kissinger, détente sought to stabilize the international system by “thawing” relations with Cold War rivals and bilaterally freezing arms levels. Taking advantage of tensions between the People’s Republic of China (PRC) and the Soviet Union, Nixon pursued closer relations with both in order to de-escalate tensions and strengthen the United States’ position relative to each. The strategy seemed to work. In 1972, Nixon became the first American president to visit communist China and the first to travel to Moscow. Direct diplomacy and cultural exchange programs with both countries grew and culminated with the formal normalization of U. S.-Chinese relations and the signing of two U. S.-Soviet arms agreements: the Anti-Ballistic Missile (ABM) Treaty and the Strategic Arms Limitation Treaty (SALT I). By 1973, after almost thirty years of Cold War tension, peaceful coexistence suddenly seemed possible. Short-term gains, however, failed to translate into long-term stability. By the decade’s end, a fragile calm gave way once again to Cold War instability.

A brewing energy crisis interrupted Nixon’s presidency. In November 1973, Nixon appeared on television to inform Americans that energy had become “a serious national problem” and that the United States was “heading toward the most acute shortages of energy since World War II.” The previous month, Arab members of the Organization of Petroleum Exporting Countries (OPEC), a cartel of the world’s leading oil producers, had embargoed oil exports to the United States in retaliation for American support of Israel during the Yom Kippur War. The embargo caused an “oil shock” and launched the first energy crisis. By the end of 1973, the global price of oil had quadrupled. Drivers waited in line for hours to fill up their cars. Individual gas stations ran out of gas. American motorists worried that oil could run out at any moment. A Pennsylvania man died when his emergency stash of gasoline ignited in his trunk. OPEC rescinded its embargo in 1974, but the economic damage had been done and the energy crisis extended into the late 1970s.

Like the Vietnam War, the oil crisis showed that small countries could still inflict real damage on the United States. At a time of anxiety about the nation’s future, Vietnam and the energy crisis accelerated Americans’ disenchantment with the United States’ role in the world and with the efficacy and quality of its leaders. Furthermore, scandals in the 1970s and early ’80s sapped trust in America’s public institutions. Watergate, above all, catalyzed the disenchantment of the Unraveling.

On June 17, 1972, five men were arrested inside the offices of the Democratic National Committee (DNC) in the Watergate Complex in downtown Washington, D.C. After being tipped by a security guard, police found the men attempting to install sophisticated bugging equipment. One of those arrested was a former CIA employee then working as a security aide for the Nixon administration’s Committee to Reelect the President (lampooned as “CREEP”).

While there is no direct evidence that Richard Nixon ordered the Watergate break-in, Nixon had been recorded in conversation with his Chief of Staff requesting that the DNC chairman be illegally wiretapped to obtain the names of the committee’s financial supporters, which could then be given to the Justice Department and the IRS to conduct spurious investigations into their personal affairs. (Nixon was also recorded ordering his Chief of Staff to break into the offices of the Brookings Institution and take files relating to the war in Vietnam, saying, “Goddamnit, get in and get those files. Blow the safe and get it.”)

Whether or not the president ordered the Watergate break-in, the White House launched a massive cover-up. Administration officials ordered the CIA to halt the FBI investigation and paid hush money to the burglars and White House aides. Nixon distanced himself from the incident publicly and went on to win a landslide election victory in November 1972. But, thanks largely to two persistent journalists at the Washington Post, Bob Woodward and Carl Bernstein, information continued to surface that tied the burglaries ever closer to the CIA, the FBI, and the White House. The Senate held televised hearings. Nixon fired his Chief of Staff and appointed a special prosecutor to investigate the burglary, and then, when the investigation progressed too far for his liking, ordered the Attorney General to fire that same prosecutor. Citing “executive privilege,” Nixon refused to comply with orders to produce tapes from the White House’s secret recording system. In July 1974, the House Judiciary Committee approved articles of impeachment. Nixon resigned before the full House could vote on impeachment. He became the first and only American president to resign his office.

Vice President Gerald Ford was sworn in as his successor and a month later granted Nixon a full presidential pardon. Nixon disappeared from public life without ever publicly apologizing, accepting responsibility, or facing charges stemming from the scandal.


VII. Carter

Pumpkins carved in the likeness of President Jimmy Carter in Polk County, Florida, October 1980, State Library and Archives of Florida via Flickr.

Watergate weighed on voters’ minds. Nixon’s disgrace netted big congressional gains for the Democrats in the 1974 mid-term elections. President Ford, the presumptive Republican nominee in 1976, damaged his popularity by pardoning Nixon. Voters seemed to want a Washington outsider untainted by the Beltway politics of the previous decade.

A wide field of Democratic presidential hopefuls reflected the diversity and disunity of the party. According to late January Gallup polls, segregationist Alabama governor George Wallace and moderate former Vice President Hubert Humphrey led with eighteen and seventeen percent respectively. A distant third at five percent was conservative Washington Senator Henry “Scoop” Jackson. In fourth place, with four percent, was former Georgia governor Jimmy Carter, a peanut farmer and former naval officer who represented the rising generation of younger, racially liberal “New South” Democrats.

After the chaos of the 1968 Chicago convention, the Democrats reformed their party rules to bring in women, African Americans, young people, and Spanish speakers. One way the reformers sought to improve popular participation (as well as stifle backstage maneuvering and public bickering) was to increase the weight of caucuses and primaries in the presidential nomination process, reducing the machinations of party officials at the convention. Jimmy Carter and his energetic staff of Georgians understood the importance of these primaries and spent two years traveling the country, getting to know local Democrats and winning grassroots support.

Unlike his Democratic opponents—and unlike President Ford—Carter was a Washington outsider. He was identified with neither his party’s liberal wing nor its conservative wing. Indeed, his appeal was more personal and moral than political. He ran on no great political issues. Instead, crafting an optimistic campaign centered on the slogan “Why not the best?,” he let his background as a hardworking, honest, Southern Baptist navy man ingratiate him with voters around the country, especially in his native South, where support for Democrats had wavered in the wake of the civil rights movement. Carter’s wholesome image stood in direct contrast to the memory of Nixon and, by association, to Nixon’s vice president and the man who pardoned him, Gerald Ford. Carter sealed his party’s nomination in June and won a close victory in November.

When Carter took the oath of office on January 20, 1977, he became president of a nation in the midst of economic turmoil. Oil shocks, inflation, stagnant growth, unemployment, and sinking wages weighed down the nation’s economy. The age of affluence was over, and the unraveling had begun, exposing deeply rooted problems that had lain dormant during the long postwar prosperity.

The 1979 energy crisis prompted a panic among consumers who remembered the 1973 oil shortage, leading many Americans to buy gasoline in huge quantities. Long lines and high gas prices characterized 1979, and oil prices remained high until the mid-1980s. Warren K. Leffler, “Gasoline lines,” June 15, 1979. Library of Congress, http://www.loc.gov/pictures/item/2003677600/.

At the end of the Second World War, American leaders erected a complex system of trade policies to help rebuild the shattered economies of Western Europe and Asia. Amid the Cold War, American diplomats and politicians used trade relationships to win influence and allies around the globe, and they saw the economic health of their allies, particularly West Germany and Japan, as a crucial bulwark against the expansion of communism. Americans encouraged these nations to develop vibrant export-oriented economies and tolerated restrictions on U.S. imports. This came at great cost to the United States. As the American economy stalled, Japan and West Germany soared and became major forces in the global production of autos, steel, machine tools, and electrical products. By 1970, the United States began to run massive trade deficits. The value of American exports dropped and the prices of its imports skyrocketed. Coupled with the huge cost of the Vietnam War and the rise of oil-producing states in the Middle East, growing trade deficits sapped the United States’ dominant position in the global economy.

American leaders didn’t know how to respond. After a series of negotiations with leaders from France, Great Britain, West Germany, and Japan in 1970 and 1971, the Nixon administration allowed these rising industrial nations to continue flouting the principles of free trade by maintaining trade barriers that sheltered their domestic markets from foreign competition while at the same time exporting growing amounts of goods to the United States, which no longer maintained so comprehensive a tariff system. By 1974, in response to U. S. complaints and their own domestic economic problems, many of these industrial nations overhauled their protectionist practices but developed even subtler methods, such as state subsidies for key industries, to nurture their economies.

Carter, like Ford before him, presided over a hitherto unimagined economic dilemma: the simultaneous onset of inflation and economic stagnation, a combination popularized as “stagflation.” Neither Carter nor Ford had the means or the ambition to protect American jobs and goods from foreign competition. As firms and financial institutions invested, sold goods, and manufactured in rising economies such as Mexico, Taiwan, Japan, Brazil, and elsewhere, American politicians allowed them to sell their often less costly products in the United States.

As American officials institutionalized this new unfettered global trade, many struggling American manufacturers perceived only one viable path to sustained profitability: moving overseas, often by establishing foreign subsidiaries or partnering with foreign firms. Investment capital, especially in manufacturing, fled the U. S. in search of overseas opportunities, hastening the decline in the productivity of American industry while rising export-oriented industrial nations flooded the world market with their cheaply produced goods. Global competition swiftly undermined the dominance enjoyed by American firms. By the end of the decade, the United States ran perennial trade deficits, American industry had weakened, and many Americans suffered eroded job security and stagnating incomes.

As Carter failed to slow the unraveling of the American economy, he also struggled to shift American foreign policy away from blind anti-communism toward a human-rights based agenda. Carter was a one-term Georgia governor with little foreign policy experience and few knew what to expect from his presidency. Carter did not make human rights a central theme of his campaign. Only in May 1977 did the new president offer a definitive statement when, speaking before the graduating class at the University of Notre Dame, he declared his wish to move away from a foreign policy in which “an inordinate fear of communism” caused American leaders to “adopt the flawed and erroneous principles and tactics of our adversaries.” (Cold War foreign policy, he said, had resulted in the “profound moral crisis” of the Vietnam War.) Carter proposed instead “a policy based on constant decency in its values and on optimism in our historical vision.” Carter’s focus on human rights, mutual understanding, and peaceful solutions to international crises resulted in some successes. Under Carter, the U. S. either reduced aid to or ceased aiding altogether the American-supported right-wing dictators guilty of extreme human rights abuses in places such as South Korea, Argentina, and the Philippines. And despite intense domestic opposition, in September 1977, partly under the belief that such a treaty would signal a renewed American commitment to fairness and respect for all nations, Carter negotiated the return of the Panama Canal to Panamanian control.

Camp David, Menachem Begin, and Anwar Sadat, 1978, Wikimedia, http://commons.wikimedia.org/wiki/File:Camp_David,_Menachem_Begin,_Anwar_Sadat,_1978.jpg.

Arguably Carter’s greatest foreign policy achievement was the Camp David Accords. In September 1978, Carter negotiated a peace agreement between Israeli Prime Minister Menachem Begin and Egyptian President Anwar Sadat. After thirteen days of secret negotiations hosted by Carter at the presidency’s rural Maryland retreat, Camp David, two agreements were reached. The first established guidelines for Palestinian autonomy and a set of principles that would govern Israel’s relations with its Arab neighbors. The second provided the basis for Egyptian-Israeli peace by returning the Sinai Peninsula to Egyptian control and opening the Suez Canal to Israeli ships. The Accords, however, had significant limits. Though Sadat and Begin won a Nobel Peace Prize for their efforts, the Accords were as significant for what they left unresolved as for what they achieved. Though they represented the first time since the establishment of Israel that Palestinians were promised self-government and the first time that an Arab state fully recognized Israel as a nation, most of the Arab world rejected the Accords. The agreement ensured only limited individual rights for Palestinians and precluded territorial control or the possibility of statehood. Indeed, Palestine Liberation Organization (PLO) chairman Yasser Arafat later described the Accords’ version of Palestinian autonomy as “no more than managing the sewers.”

Carter, however, could not balance his insistence on human rights with the realities of the Cold War. While his administration reduced aid to some authoritarian states, the U.S. continued to provide military and financial support to allies it considered truly vital to American interests—most notably, the oil-rich nation of Iran. When the President and First Lady Rosalynn Carter visited Tehran in January 1978, the President praised the nation’s dictatorial ruler, Shah Mohammed Reza Pahlavi, and remarked on the “respect and the admiration and love” of Iranians for their leader. A year later, the Iranian Revolution deposed the Shah. In November 1979, revolutionary Iranians, irate over America’s interventions in Iranian affairs and its long support of the Shah, stormed the U. S. embassy in Tehran and took fifty-two Americans hostage. At the same time, Americans again felt the energy pinch when revolutionaries shut down Iranian oil fields, spiking the price of oil for the second time in a decade. Americans not only felt the nation’s weakness at the gas pump, they watched it every night on national television: for many Americans, the hostage crisis that stretched across the next 444 days became a source of both jingoistic unity and a constant reminder of the country’s new global impotence. The nation that had defeated the Nazis and the Empire of Japan in the Second World War found itself, thirty years later, humiliated by half of an obscure Southeast Asian country and by a relatively small and unstable Middle Eastern nation.

With his popularity plummeting, Carter ordered a secret rescue mission, Operation Eagle Claw, in April 1980, but it ended in disaster. A U. S. helicopter collided with a transport aircraft in the Iranian desert, killing eight servicemen and leading Carter to take responsibility for the losses and the continued inability to free the American hostages.

Moreover, Carter’s efforts to ease the Cold War by achieving a new nuclear arms control agreement (SALT II) disintegrated under domestic opposition led by conservative hawks such as Ronald Reagan. They accused Carter of weakness, and cited Soviet support for African leftist revolutionaries as evidence of Soviet duplicity. And then the Soviets invaded Afghanistan in December 1979, returning the Cold War to the forefront of U.S. foreign policy. A month later, a beleaguered Carter committed the United States to defending its “interests” in the Middle East against Soviet incursions, declaring that “an assault [would] be repelled by any means necessary, including military force.” Known as the “Carter Doctrine,” the President’s declaration signaled the administration’s ambivalent commitment to human rights and a renewed reliance on military force in its anti-communist foreign policy. The seeds of Ronald Reagan’s more aggressive foreign policy had been sown.

 

VII. Conclusion

Though American politics moved right after Lyndon Johnson left office, Nixon’s 1968 election marked no conservative counterrevolution. American politics and society remained in flux throughout the 1970s. American politicians on the right and the left pursued relatively moderate courses compared to those in the preceding and succeeding decades. But a groundswell of anxieties and angers brewed beneath the surface. The world’s greatest military power had floundered in Vietnam and an American president stood flustered by Middle Eastern revolutionaries. The cultural clashes of the 1960s persisted and accelerated. While cities burned, a more liberal sexuality permeated American culture. The economy crashed, leaving America’s cities vulnerable to poverty and crime and its working class gutted by deindustrialization and globalization. American weakness was everywhere. And so, by 1980, many Americans—especially white middle- and upper-class Americans—felt a nostalgic desire for simpler times and simpler answers to the frustratingly complex geopolitical, social, and economic problems crippling the nation. The appeal of Carter’s soft drawl and Christian humility had signaled this yearning, but his utter failure to stop the unraveling opened the way for a new movement, with new personalities and a new conservatism, which promised to undo the damage and restore the United States to its nostalgic image of itself.

 

This chapter was edited by Edwin Breeden, with content contributions by Seth Anziska, Jeremiah Bauer, Edwin Breeden, Kyle Burke, Alexandra Evans, Sean Fear, Anne Grey Fischer, Destin Jenkins, Matthew Kahn, Suzanne Kahn, Brooke Lamperd, Katherine McGarr, Matthew Pressman, Adam Parsons, Emily Prifogle, John Rosenberg, Brandy Thomas Wells, and Naomi R. Williams.


27. The Sixties

"Participants, some carrying American flags, marching in the civil rights march from Selma to Montgomery, Alabama in 1965," via Library of Congress.

“Participants, some carrying American flags, marching in the civil rights march from Selma to Montgomery, Alabama in 1965,” via Library of Congress.

*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

Perhaps no decade is so immortalized in American memory as the 1960s. Couched in the colorful rhetoric of peace and love, complemented by stirring images of the civil rights movement, and fondly remembered for its music, art, and activism, for many the decade brought hopes for a more inclusive, forward-thinking nation. But the decade was also plagued by strife, tragedy, and chaos. It was the decade of the Vietnam War, of inner-city riots, and assassinations that seemed to symbolize the death of a new generation’s idealistic ambitions. A decade of struggle and disillusionment rocked by social, cultural, and political upheaval, the 1960s are remembered because so much changed, and because so much did not.

 

II. The Civil Rights Movement Continues

So much of the energy and character of “the sixties” emerged from the civil rights movement, which won its greatest victories in the early years of the decade. The movement itself was changing. Many of the civil rights activists pushing for school desegregation in the 1950s were middle-class and middle-aged. In the 1960s, a new student movement arose whose members wanted swifter changes in the segregated South. Confrontational protests, marches, boycotts, and sit-ins accelerated.

The tone of the modern U.S. civil rights movement changed at a North Carolina department store in 1960, when four African American students participated in a “sit-in” at a whites-only lunch counter. The 1960 Greensboro sit-ins set the pattern: activists sat at segregated lunch counters in an act of defiance, refused to leave until they were served, and were willing to be ridiculed, attacked, and arrested if they were not. The Greensboro sit-in drew resistance but forced the desegregation of the Woolworth’s lunch counter and prompted copycat demonstrations across the South. The protests offered evidence that student-led direct action could enact social change and established the civil rights movement’s direction in the forthcoming years.

The following year, civil rights advocates attempted a bolder variation of the sit-in when they participated in the Freedom Rides. Activists organized interstate bus rides following a Supreme Court decision outlawing segregation on public buses and trains. The rides intended to test the court’s ruling, which many southern states had ignored. An interracial group of Freedom Riders boarded buses in Washington D.C. with the intention of sitting in integrated patterns on the buses as they traveled through the Deep South. On the initial rides in May 1961, the riders encountered fierce resistance in Alabama. Angry mobs composed of KKK members attacked riders in Birmingham, burning one of the buses and beating the activists who escaped. Although the first riders abandoned their trip and flew to their final destination, New Orleans, other civil rights activists pressed on. Additional Freedom Rides launched through the summer and generated national attention amid additional violent resistance. Ultimately, the Interstate Commerce Commission enforced integrated interstate buses and trains in November 1961.

In the fall of 1961, civil rights activists descended on Albany, a small city in southwest Georgia. A place known for entrenched segregation and racial violence, Albany seemed an unlikely place for black Americans to rally and demand civil rights gains. The activists there, however, formed the Albany Movement, a coalition of civil rights organizers that included members of the Student Nonviolent Coordinating Committee (SNCC, or “snick”), the Southern Christian Leadership Conference (SCLC), and the NAACP. But in Albany the movement was stymied by police chief Laurie Pritchett, who launched mass arrests but refused to engage in police brutality and bailed out jailed movement leaders to avoid negative media attention. It was a peculiar scene, and a lesson for southern activists.

Despite its defeat, Albany captured much of the energy of the civil rights movement. The Albany Movement included elements of the Christian commitment to social justice in its platform, with activists stating that all people were “of equal worth” in God’s family and that “no man may discriminate against or exploit another.” In many instances in the 1960s, black Christianity propelled civil rights advocates to action and demonstrated the significance of religion to the broader civil rights movement. King’s rise to prominence underscored the role that African American religious figures played in the 1960s civil rights movement. Protestors sang hymns and spirituals as they marched. Preachers rallied the people with messages of justice and hope. Churches hosted meetings, prayer vigils, and conferences on nonviolent resistance. The moral thrust of the movement strengthened African American activists while also confronting white society by framing segregation as a moral evil.

As the civil rights movement garnered more followers and more attention, white resistance stiffened. In October 1962, James Meredith became the first African American student to enroll at the University of Mississippi. Meredith’s enrollment sparked riots on the Oxford campus, prompting President John F. Kennedy to send in U.S. Marshals and National Guardsmen to maintain order. On an evening known infamously as the Battle of Ole Miss, segregationists clashed with troops in the middle of campus, resulting in two deaths and hundreds of injuries. The violence, which erupted despite federal intervention, served as a reminder of the strength of white resistance to the civil rights movement, particularly in the realm of education.

James Meredith, accompanied by U.S. Marshals, walks to class at the University of Mississippi in 1962. Meredith was the first African American student admitted to the still-segregated Ole Miss. Marion S. Trikosko, “Integration at Ole Miss[issippi] Univ[ersity],” 1962. Library of Congress, http://www.loc.gov/pictures/item/2003688159/.

The following year, 1963, was perhaps the decade’s most eventful year for civil rights. In April and May, the SCLC organized the Birmingham Campaign, a broad campaign of direct action aiming to topple segregation in Alabama’s largest city. Activists used business boycotts, sit-ins, and peaceful marches as part of the campaign. SCLC leader Martin Luther King Jr. was jailed, prompting his famous handwritten “Letter from Birmingham Jail,” in which he urged not only nonviolence but also the active, direct confrontation of injustice. The campaign further added to King’s national reputation and featured powerful photographs and video footage of white police officers using fire hoses and attack dogs on young African American protesters. It also yielded an agreement to desegregate public accommodations in the city: activists in Birmingham scored a victory for civil rights and drew international praise for their nonviolent approach in the face of police-sanctioned violence and bombings.

Images of police brutality against peaceful Civil Rights demonstrators shocked many Americans and helped increase support for the movement. Photograph. http://www.legacy.com/UserContent/ns/Photos/Fire%20hoses%20used%20against%20civil%20rights%20protesters%20in%20Birmingham%201963.jpg.

White resistance magnified. In June, Alabama Governor George Wallace famously stood in the door of a classroom building in a symbolic attempt to halt integration at the University of Alabama. President Kennedy addressed the nation that evening, criticizing Wallace and calling for a comprehensive civil rights bill. A day later, civil rights leader Medgar Evers was assassinated at his home in Jackson, Mississippi. Civil rights leaders gathered in August 1963 for the March on Washington. The march called for, among other things, civil rights legislation, school integration, an end to discrimination by public and private employers, job training for the unemployed, and a raise in the minimum wage. On the steps of the Lincoln Memorial, King delivered his famous “I Have a Dream” speech, an internationally renowned call for civil rights and against racism that raised the movement’s profile to unprecedented heights. The year would end on a somber note with the assassination of President Kennedy, a public figure considered an important ally of civil rights, but it did not halt the civil rights movement.

White activists increasingly joined African Americans in the Civil Rights Movement during the 1960s. This photograph shows Martin Luther King, Jr., and other black civil rights leaders arm-in-arm with leaders of the Jewish community. Photograph, August 28, 1963. Wikimedia, http://commons.wikimedia.org/wiki/File:March_on_washington_Aug_28_1963.jpg.

President Lyndon Johnson embraced the civil rights movement. The following summer he signed the Civil Rights Act of 1964, widely considered to be among the most important pieces of civil rights legislation in American history. The comprehensive act barred segregation in public accommodations and outlawed discrimination based on race, ethnicity, gender, and national or religious origin.

Lyndon B. Johnson sits with Civil Rights Leaders in the White House. One of Johnson’s greatest legacies would be his staunch support of civil rights legislation. Photograph, January 18, 1964. Wikimedia, http://commons.wikimedia.org/wiki/File:Lyndon_Johnson_meeting_with_civil_rights_leaders.jpg.

Lyndon B. Johnson was willing to use whatever means necessary to get his legislation passed. Yoichi R. Okamoto, Photograph of Lyndon B. Johnson pressuring Senator Richard Russell, December 17, 1963. Wikimedia, http://en.wikipedia.org/wiki/File:Lyndon_Johnson_and_Richard_Russell.jpg.

Direct action continued through the summer, as student-run organizations like SNCC and CORE (the Congress of Racial Equality) helped organize the Freedom Summer in Mississippi, a drive to register African American voters in a state with an ugly history of discrimination. Freedom Summer campaigners set up schools for African American children and endured intimidation tactics. Despite this progress, violent resistance to civil rights continued, particularly in regions with longstanding traditions of segregation.

Direct action and resistance to it continued in March 1965, when activists, with the support of prominent civil rights leaders, attempted to march from Selma to Montgomery, Alabama, on behalf of local African American voting rights. In a narrative that had become familiar, “Bloody Sunday” saw peaceful protesters attacked by white law enforcement with batons and tear gas. After marchers were turned back a second time, they finally completed the 54-mile trek to the state capitol later in the month. Coverage of the first march prompted President Johnson to introduce the bill that became the Voting Rights Act of 1965, an act that outlawed discriminatory voting practices in federal, state, and local elections with an eye toward African American enfranchisement in the South. In two consecutive years, landmark pieces of legislation had helped to weaken de jure segregation and disenfranchisement in America.

Five leaders of the Civil Rights Movement. From left: Bayard Rustin, Andrew Young, N.Y. Congressman William Ryan, James Farmer, and John Lewis in 1965. Stanley Wolfson, Photograph, 1965. Library of Congress, http://www.loc.gov/pictures/item/98515229/.

And then things began to stall. Days after the signing of the Voting Rights Act, race riots broke out in the Watts district of Los Angeles. Rioting in Watts stemmed from local African American frustrations with residential segregation, police brutality, and racial profiling. Waves of riots would rock American cities every summer thereafter. Particularly destructive riots occurred two summers later, in 1967, in Newark and Detroit. Each resulted in deaths, injuries, arrests, and millions of dollars in property damage. In spite of black achievements, inner-city problems persisted for many African Americans. The phenomenon of “white flight”—when whites in metropolitan areas fled city centers for the suburbs—often resulted in “re-segregated” residential patterns. Limited access to economic and social opportunities in urban areas bred discord. In addition to reminding the nation that the civil rights movement was a complex, ongoing event without a concrete endpoint, the unrest in northern cities reinforced the notion that the struggle did not occur solely in the South. Many Americans also viewed the riots as an indictment of the Great Society, President Johnson’s sweeping agenda of domestic programs that sought to remedy inner-city ills by offering better access to education, jobs, medical care, housing, and other forms of social welfare. The civil rights movement was never the same.

 

III. Beyond Civil Rights

As tension continued to mount in cities through the decade, the tone of the civil rights movement changed yet again. Activists became less conciliatory in their calls for civil rights progress, embracing the more militant message of the burgeoning Black Power movement and the late Malcolm X, a Nation of Islam (NOI) minister who had encouraged African Americans to pursue freedom, equality, and justice by “any means necessary.” Prior to his death, Malcolm X and the NOI emerged as the radical alternative to the racially integrated, largely Protestant approach of the Martin Luther King, Jr.-led civil rights movement. Malcolm advocated armed resistance in defense of the safety and well-being of black Americans, stating, “I don’t call it violence when it’s self-defense, I call it intelligence.” For their part, King and leaders from more mainstream organizations like the NAACP and the Urban League criticized both Malcolm X and the NOI for what they perceived to be racial demagoguery. King believed Malcolm’s speeches were a “great disservice” to black Americans, claiming that they lamented the problems of African Americans without offering solutions. The differences between Dr. King and Malcolm X represented a core ideological tension that would run through black political thought throughout the 1960s and 1970s.

Like Booker T. Washington and W.E.B. Du Bois before them, Martin Luther King, Jr., and Malcolm X represented two styles of racial uplift while maintaining the same general goal of ending racial discrimination. How they would get to that goal is where the men diverged. Marion S. Trikosko, “[Martin Luther King and Malcolm X waiting for press conference],” March 26, 1964. Library of Congress, http://www.loc.gov/pictures/item/92522562/.

By the late 1960s, the Student Nonviolent Coordinating Committee, led by figures such as Stokely Carmichael, had expelled its white members and shunned the interracial effort in the rural South, focusing instead on injustices in northern urban areas. After President Johnson refused to take up the cause of the black delegates in the Mississippi Freedom Democratic Party at the 1964 Democratic National Convention, SNCC activists became frustrated with institutional tactics and turned away from the organization’s founding principle of nonviolence over the course of the next year. This evolving, more aggressive movement called for African Americans to play a dominant role in cultivating black institutions and articulating black interests rather than relying on interracial, moderate approaches. At a June 1966 civil rights march, Carmichael told the crowd, “What we gonna start saying now is black power!” The slogan not only resonated with audiences, it also stood in direct contrast to King’s “Freedom Now!” campaign. The political slogan of black power could encompass many meanings, but at its core stood for the self-determination of blacks in political, economic, and social organizations.

The Black Panther Party used radical and incendiary tactics to bring attention to the continued oppression of blacks in America. Read the bottom paragraph on this rally poster carefully. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/e/e7/Black_Panther_DC_Rally_Revolutionary_People's_Constitutional_Convention_1970.jpg.

While Carmichael asserted that “black power meant black people coming together to form a political force,” to many it also meant violence. In 1966, Huey Newton and Bobby Seale formed the Black Panther Party in Oakland, California. The Black Panthers became the standard-bearers for direct action and self-defense, using the concept of “decolonization” in their drive to liberate black communities from white power structures. The revolutionary organization also sought reparations and exemptions for black men from the military draft. Citing police brutality and racist governmental policies, the Panthers aligned themselves with the “other people of color in the world” against whom America was fighting abroad. Although the Party was perhaps best known for its open display of weapons, military-style dress, and black nationalist beliefs, its Ten-Point Program also demanded employment, housing, and education. The Black Panthers worked in local communities to run “survival programs” that provided food, clothing, medical treatment, and drug rehabilitation. They focused on modes of resistance that empowered black activists on their own terms.

By 1968, the civil rights movement looked quite different from the one that had emerged out of the 1960 Greensboro sit-ins. The movement had never been monolithic, but prominent, competing ideologies had now fractured it significantly. King’s assassination on a Memphis motel balcony in April sparked another wave of riots in over 100 American cities and brought an abrupt, tragic end to the life of the movement’s most famous figure. Only a week after his assassination, President Johnson signed the Civil Rights Act of 1968, another significant piece of federal legislation, which outlawed housing discrimination. Two months later, on June 6, Robert Kennedy was gunned down in a Los Angeles hotel while campaigning to become the Democratic candidate for president. The assassinations of two national leaders in quick succession created a sense of national anger and disillusionment.

The frustration prompted dozens of national protest organizations to converge on the Democratic National Convention in Chicago at the end of August. A bitterly fractured Democratic Party gathered to assemble a passable platform and nominate a broadly acceptable presidential candidate. Outside the convention hall, numerous student and radical groups—the most prominent being Students for a Democratic Society and the Youth International Party—identified the convention as an ideal venue for demonstrations against the Vietnam War and planned massive protests in Chicago’s public spaces. Initial protests were peaceful, but the situation quickly soured as police issued stern threats and young people began to taunt and goad officers. Many of the assembled students had protest and sit-in experience only in the relative safe havens of college campuses and were unaccustomed to a heavily armed, big-city police force accompanied by National Guard troops in full riot gear. Attendees recounted vicious beatings at the hands of police and Guardsmen, but many young people—convinced that much public sympathy could be won via images of brutality against unarmed protesters—continued stoking the violence. Clashes spilled from the parks into city streets, and eventually the smell of tear gas penetrated the upper floors of the opulent hotels hosting Democratic delegates.

The ongoing police brutality against the protesters overshadowed the convention and culminated in an internationally televised standoff in front of the Hilton Hotel, where policemen beat protesters chanting, “the whole world is watching!” For many on both sides, the Chicago riots engendered a growing sense of the chaos rocking American life. The disparity in force between students and police frightened some radicals out of advocacy for revolutionary violence, while some officers began questioning the war and those who waged it. Many more, though, saw disorder and chaos where once they had seen idealism and progress. Ultimately, the violence of 1968 was not the death knell of a struggle simply for the end of black-white segregation, but rather a moment of transition that pointed to the continuation of past oppression and foreshadowed many of the challenges of the future. At decade’s end, civil rights advocates could take pride in significant gains while acknowledging that many of the nation’s racial issues remained unresolved.

IV. Culture and Activism

Epitomizing the folk music and protest culture of 1960s youth, Joan Baez and Bob Dylan are pictured here singing together at the March on Washington in 1963. Photograph, Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/33/Joan_Baez_Bob_Dylan.jpg.

The 1960s wrought enormous cultural change. The United States that entered the decade looked and sounded nothing like the one that left it. Popular culture often challenged norms from the supposedly hidebound 1950s, promoting rebellion and individualism and, in the process, bringing the counterculture into the mainstream. Native Americans, Chicanos, women, and environmentalists all participated in movements demonstrating that “rights” activism also applied to ethnicity, gender, and the nation’s natural resources. Even established religious institutions like the Catholic Church underwent transformation that reflected an emerging emphasis on freedom and tolerance. In each instance, the decade brought about substantial progress with a reminder that the activism in each cultural realm remained fluid and unfinished.

At the dawn of the 1960s, trends from the 1950s still flourished. While only half of American households owned a television in the mid-1950s, for example, nearly 90 percent of homes had a set by 1962. With the increasing popularity of rock and roll, established white musicians like Elvis Presley continued to imitate and adapt black musical genres. Newcomers also adopted this tactic: the Beatles’ first album featured two covers of popular songs by the Shirelles.

Advertisers continued to appeal to teenagers and the expanding youth market. What differed in the 1960s, perhaps, was the commodification of the counterculture. Popular culture and popular advertising in the 1950s had promoted an ethos of “fitting in” and buying products to conform. The new counterculture ethos, however, touted individuality and rebellion. Some advertisers used this ethos subtly; advertisements for Volkswagens openly acknowledged the flaws of their cars and emphasized their strange look. One ad read, “Presenting America’s slowest fastback,” which “won’t go over 72 mph even though the speedometer shows a wildly optimistic speed of 90.” Another stated, “And if you run out of gas, it’s easy to push.” By marketing the car’s flaws and reframing them as positive qualities, the advertisers commercialized young people’s resistance to commercialism and positioned the VW as a car for those who didn’t mind standing out in a crowd. A more obviously countercultural ad for the VW Bug showed two cars: one black and one painted multi-color in the hippie style; the contrasting captions read, “We do our thing,” and “You do yours.”

The Volkswagen Beetle became an icon of 1960s culture and a paradigm of a new advertising age. This tongue-in-cheek advertisement attracted laughs and attention from the public and business world. http://www.videosurrey.com/wp-content/uploads/2013/03/beetle-coccinelle-volkswagen-vw-publicite-vintage-03.jpg.

Companies marketed their products as countercultural in and of themselves. One of the more obvious examples was a 1968 ad from Columbia Records, a hugely successful record label since the 1920s. The ad pictured a group of stock rebellious characters—a shaggy-haired white hippie, a buttoned up Beat, two biker types, and a black jazz man sporting an afro—in a jail cell. The counterculture had been busted, the ad states, but “the man can’t bust our music.” Merely buying records from Columbia was an act of rebellion, one that brought the buyer closer to the counterculture figures portrayed in the ad.

Even when pop culture in the 1960s was not tied to counterculture, it still stood in contrast to a more conservative past. The dominant style of women’s fashion in the 1950s was the poodle skirt and the sweater, tight-waisted and buttoned up. The 1960s, however, ushered in an era of much less restrictive clothing. Capri pants became popular casual wear. Skirts became shorter. When Mary Quant invented the miniskirt in 1964, she said it was a garment “in which you could move, in which you could run and jump.” By the late 1960s, the hippies’ more androgynous look had become trendy. Such fashion trends bespoke the overall popular ethos of the 1960s: freedom, rebellion, and individuality.

Fashion can sometimes capture a generation’s world view. Miniskirts – one of the most radical and popular fashions of the 1960s – demonstrated the new sexual openness of young women. Photograph of young woman in Eugene, Oregon, 1966. Wikimedia, http://commons.wikimedia.org/wiki/File:1960s_fashions_(1709303069).jpg.

In a decade plagued by social and political instability, the American counterculture also turned to psychedelic drugs as a remedy for alienation. For young, middle-class whites, society had become stagnant and bureaucratic. Psychedelic drug use arose as an alternate form of activism. LSD began its life as a drug used primarily in psychological research before it trickled down onto college campuses and out into society at large. The counterculture’s notion that American stagnation could be remedied by a spiritual-psychedelic experience was drawn almost entirely from psychologists and sociologists.

The irony, of course, was that LSD’s popularity outside of science eventually led to its demise within labs. By 1966, enough incidents had been connected to LSD to spur a Senate hearing on the drug; newspapers reported that hundreds of LSD users had been admitted to psychiatric wards. While many of these reports were sensationalistic or altogether untrue, LSD’s uses did become increasingly bizarre and even dangerous throughout the late 1960s. The 1967 Summer of Love failed to live up to its mantra as an idyllic, psychedelic retreat, and the summer was instead characterized by housing shortages and deadly inner-city riots. Similarly, while 1969’s Woodstock embodied the countercultural ethos of creativity and community, the Altamont Free Concert held the same year resulted in riots and deadly violence.

The turmoil and growing grassroots activism in the 1960s among American youth and university students, including Native Americans, created an atmosphere for reform in both Congress and the courts. In the summer of 1961, Native American university students founded a new organization, the National Indian Youth Council (NIYC). While the Council shared many of its core values and goals with the National Congress of American Indians (NCAI), including sovereignty, self-determination, treaty rights, and cultural preservation, the NIYC employed direct action tactics and more combative rhetoric.

The NIYC came from a tradition of student clubs and organizations. The 1944 GI Bill opened the door for many Native Americans to university education, and the increased presence of Native students at universities led to the establishment of Native college clubs and organizations, where members discussed major problems in Indian Country, such as termination policy, treaty rights, and poverty. Many also benefited from summer workshops on American Indian Affairs, designed to prepare Indian youth for future leadership roles. Participants in the workshops overwhelmingly embraced the principles of self-determination and tribal sovereignty. They recognized that regardless of tribal membership, Native people faced similar problems, which could be best confronted through a united, intertribal effort. This view was reinforced at the American Indian Chicago Conference in 1961, where the delegates drafted “The Declaration of Indian Purpose,” a document outlining Indian solutions to Indian problems. Despite the promise of the Chicago Conference, the students were disenchanted with the slow progress of change. The growing frustration of the younger generation, combined with ideas from the workshops and experiences at the Chicago Conference, led to the founding of the NIYC in August 1961.

The first opportunity for the Council to generate support and attract public attention came in the Pacific Northwest. In their nineteenth-century treaties, Washington State tribal nations had reserved the right to fish off reservation without being subject to state regulations. The state challenged this right in the early 1960s; Native fishermen who fished in violation of state laws were arrested and subsequently required to purchase permits for off-reservation fishing. With little justice received from the courts, Washington State tribal nations appealed to the NIYC for assistance. NIYC members decided to hold a series of “fish-ins,” which involved activists casting nets from their boats and waiting for the police to arrest them. In 1974, fishing rights activists and tribal leaders won a legal victory in United States v. Washington, known as the Boldt Decision, which declared that Native Americans were entitled to up to 50 percent of the fish caught in the “usual and accustomed places” named in the 1850s treaties.

The NIYC’s militant rhetoric and use of direct action marked the beginning of the Red Power movement. It paved the way for future intertribal activism and brought national media exposure to Native issues.

While the Pan-Indian movement of the 1960s failed, a sign remains of the Native American occupation of Alcatraz Island in the San Francisco Bay. Photograph, July 18, 2006. Wikimedia, http://commons.wikimedia.org/wiki/File:Alcatraz_Island_01_Prison_sign.jpg.

Native Americans created pan-Indian communities in cities and demanded respect for their rights and culture, actively responding to discrimination and violence against them. To prevent police harassment, Native Americans in Minneapolis formed “Indian patrols” to monitor the behavior of police in Indian neighborhoods. From these patrols grew the American Indian Movement (AIM), founded in Minneapolis in 1968. The actions of AIM, while not bringing any specific or immediate results, brought national and international attention to Native issues, and the organization helped to create a more favorable climate for a policy shift. The NCAI, NIYC, and AIM continued their work, with and within the established American political system, to influence new laws on Native issues and concentrate on local problems.

The Chicano movement in the 1960s emerged out of the broader Mexican American civil rights movement of the post-World War II era. While “Chicano” was initially considered a derogatory term for Mexican immigrants, activists in the 1960s reclaimed the term and used it as a catalyst to campaign for political and social change among Mexican Americans. The Chicano movement confronted discrimination in schools, politics, agriculture, and other formal and informal institutions. Organizations like the Mexican American Political Association (MAPA) and the Mexican American Legal Defense and Educational Fund (MALDEF) buoyed the Chicano movement and patterned themselves after similar influential groups in the African American civil rights movement.

Cesar Chavez became the best-known figure of the Chicano movement, using nonviolent tactics to campaign for workers’ rights in the grape fields of California. Chavez and activist Dolores Huerta founded the National Farm Workers Association, which eventually merged with the largely Filipino Agricultural Workers Organizing Committee to become the United Farm Workers of America (UFWA). The UFWA fused the causes of Chicano and Filipino activists protesting the subpar working conditions of farm laborers in California. In addition to embarking on a hunger strike and a boycott of table grapes, Chavez led a 300-mile march in March and April of 1966 from Delano, California, to the state capital of Sacramento. The pro-labor campaign garnered the national spotlight and the support of prominent political figures such as Robert Kennedy. Today, Chavez’s birthday (March 31) is observed as a state holiday in California, Colorado, and Texas.

The United Farm Workers Union became a strong force for bettering the working conditions of agricultural laborers in California and Florida. Cesar Chavez (center) and UFW supporters attend an outdoor Mass on the capitol steps in Sacramento, Calif., before the start of a labor protest march, date unknown. http://i.huffpost.com/gen/1608804/thumbs/o-CESAR-CHAVEZ-facebook.jpg.

Rodolfo “Corky” Gonzales was another activist whose calls for Chicano self-determination resonated long past the 1960s. A former boxer and Denver native, Gonzales founded the Crusade for Justice in 1966, an organization that would establish the first annual Chicano Liberation Day at the National Chicano Youth Conference by decade’s end. The conference also yielded the Plan Espiritual de Aztlan, a Chicano nationalist manifesto that reflected Gonzales’ vision of Chicanos as a unified, historically grounded, all-encompassing group fighting against discrimination in the United States. By 1970, the Texas-based La Raza Unida political party had built a strong foundation for promoting Chicano nationalism and continuing the campaign for Mexican American civil rights.

The 1966 Rio Grande Valley Farm Workers March (“La Marcha”). August 27, 1966. Via the University of Texas-San Antonio Libraries’ Special Collections (MS 360: E-0012-187-D-16)

The feminist movement also made great strides in the 1960s. Women were active in both the civil rights movement and the labor movement, but their increasing awareness of gender inequality did not find a receptive audience among male leaders in those movements. In the 1960s, then, many of these women began to form a movement of their own. Soon the country experienced a groundswell of feminist consciousness.

An older generation of women who preferred to work within state institutions figured prominently in the early part of the decade. When John F. Kennedy established the President’s Commission on the Status of Women in 1961, former first lady Eleanor Roosevelt headed the effort. The Commission’s Invitation to Action was released in 1963. Finding discriminatory provisions in the law and practices of industrial, labor, and governmental organizations, the Commission advocated for “changes, many of them long overdue, in the conditions of women’s opportunity in the United States.” Change was necessary in areas of employment practices, federal tax and benefit policies affecting women’s income, labor laws, and services for women as wives, mothers, and workers. This call for action, if heeded, would ameliorate the types of discrimination primarily experienced by middle-class and elite white working women, all of whom were used to advocating through institutional structures like government agencies and unions.

Betty Friedan’s The Feminine Mystique hit bookshelves the same year the Commission released its report. Friedan had been active in the union movement and was by this time a mother in the new suburban landscape of post-war America. In her book, Friedan named the “problem that has no name,” and in doing so helped many white middle-class American women come to see their dissatisfaction as housewives not as something “wrong with [their] marriage, or [themselves],” but instead as a social problem experienced by millions of American women. Friedan observed that there was a “discrepancy between the reality of [women’s] lives and the image to which we were trying to conform, the image I call the feminine mystique.” No longer would women allow society to blame the “problem that has no name” on a loss of femininity, too much education, or too much female independence and equality with men.

The 1960s also saw a different group of women pushing for change in government policy. Welfare mothers began to form local advocacy groups in addition to the National Welfare Rights Organization founded in 1966. Mostly African American, these activists fought for greater benefits and more control over welfare policy and implementation. Women like Johnnie Tillmon successfully advocated for larger grants for school clothes and household equipment in addition to gaining due process and fair administrative hearings prior to termination of welfare entitlements.

Yet another mode of feminist activism was the formation of consciousness-raising groups. These groups met in women’s homes and at women’s centers, providing a safe environment for women to discuss everything from experiences of gender discrimination to pregnancy, from relationships with men and women to self-image. The goal of consciousness-raising was to increase self-awareness and validate the experiences of women. Groups framed such individual experiences as examples of society-wide sexism and claimed that “the personal is political.” Consciousness-raising groups created a wealth of personal stories that feminists could use in other forms of activism and crafted networks of women whom activists could mobilize to support protests.

The end of the decade was marked by the Women’s Strike for Equality, which celebrated the fiftieth anniversary of women’s right to vote. Sponsored by NOW (the National Organization for Women), the 1970 protest focused on employment discrimination, political equality, abortion, free childcare, and equality in marriage. All of these issues foreshadowed the backlash against feminist goals in the 1970s. Not only would feminism face opposition from other women who valued the traditional homemaker role to which feminists objected, but the movement would also fracture internally as minority women challenged white feminists’ racism and lesbians vied for more prominence within feminist organizations.

The women’s movement stagnated after gaining the vote in 1920, but by the 1960s it was back in full force. Inspired by the Civil Rights Movement and fed up with gender discrimination, women took to the streets to demand their rights as American citizens. Warren K. Leffler, “Women’s lib[eration] march from Farrugut Sq[uare] to Layfette [i.e., Lafayette] P[ar]k,” August 26, 1970. Library of Congress, http://www.loc.gov/pictures/item/2003673992/.

American environmentalism made significant gains in the 1960s that piggybacked off the post-World War II trend of Americans using their growing resources and leisure time to explore nature. They backpacked, went to the beach, fished, and joined birding organizations in greater numbers than ever before. These experiences, along with increased formal education, made Americans more aware of threats to the environment and, consequently, to themselves. Many of these threats increased in the post-war years as developers bulldozed open space for suburbs and new hazards from industrial and nuclear pollutants loomed over all organisms.

By the time that biologist Rachel Carson published her landmark book, Silent Spring, in 1962, a nascent environmentalism had emerged in America. Silent Spring stood out as an unparalleled argument for the interconnectedness of ecological and human health. Pesticides, Carson argued, also posed a threat to human health, and their over-use threatened the ecosystems that supported food production. Carson’s argument was compelling to many Americans, including President Kennedy, and was virulently opposed by chemical industries that suggested the book was the product of an emotional woman, not a scientist.

After Silent Spring, the social and intellectual currents of environmentalism continued to expand rapidly, culminating in the largest demonstration in history, Earth Day, on April 22, 1970, and in a decade of lawmaking that significantly restructured American government. Even before the massive gathering for Earth Day, lawmakers from the local to federal level had pushed for and achieved regulations to clean up the air and water. President Richard Nixon signed the National Environmental Policy Act into law in 1970, requiring environmental impact statements for any project directed or funded by the federal government. He also created the Environmental Protection Agency, the first agency charged with studying, regulating, and disseminating knowledge about the environment. A raft of laws followed that were designed to offer increased protection for air, water, endangered species, and natural areas.

In keeping with the activist themes of the decade, the Catholic Church reevaluated longstanding traditions in the 1960s. The Second Vatican Council became the defining moment for the modern church. Called by Pope John XXIII to bring the church into closer dialogue with the non-Catholic world, Vatican II functioned as a vehicle for a spirit of aggiornamento, or a bringing up to date, for individual Catholics and their church.

The council met from 1962 to 1965, and its members—the bishops of the worldwide Catholic Church—discussed varied topics, ranging from ecumenism and the role of laypeople to religious freedom and the changing nature of the priesthood. Vatican II went beyond mere discussion, however. Its proclamations brought about the rise of the vernacular Mass, a larger role for laypeople in the liturgy and in the administration of parishes and dioceses, increased contact with non-Catholics, and renewed recognition of the church as “the people of God” rather than primarily as a body of priests and bishops. A number of American Catholics had long called for such reforms, and the post-conciliar period often saw dramatic changes to the form of worship in Catholic parishes, with many adopting more informal, contemporary styles. Vatican II also opened the way for women to claim a larger degree of power in the life of the Catholic Church.

The council, though, was not without controversy. More conservative Catholics often resisted what they perceived as rapid, dangerous changes overtaking their church, which frequently led to tensions between clergy and laity and among laypeople. Priests and male and female religious figures also felt the council’s influence. Some scholars have cited the general opening, liberalizing effect of Vatican II’s message and its implementation as key factors in the decline of the number of American priests that began in the era of the Second Vatican Council. Nuns seized the opportunity provided by the council to revisit the rules governing their communities.

Losing membership and influence throughout the world, leaders of the Catholic Church met between 1962 and 1965 to institute new measures to modernize and open the church. This ecumenical council would become known as the Second Vatican Council, or Vatican II. Photograph of the grand procession of the Council Fathers at St. Peter’s Basilica, October 11, 1962. Wikimedia, http://commons.wikimedia.org/wiki/File:Konzilseroeffnung_1.jpg.

Many nuns decided to leave the cloister and do away with older forms of religious garb—including the habit—reflecting one of Vatican II’s goals of more thorough engagement of the church with the outside world. As with priests, many nuns decided to leave consecrated religious life. Vatican II’s influence and tensions resonated for decades after its conclusion, and it remains the lens through which Catholics and non-Catholics alike must view the modern church.

 

V. Politics and Policy

The decade’s political landscape began with a watershed presidential election. Americans were captivated by the 1960 race between Republican Vice President Richard Nixon and Democratic Senator John F. Kennedy, two candidates who pledged to move the nation forward and invigorate an economy experiencing the worst recession since the Great Depression. Kennedy promised to use federal programs to strengthen the economy and address pockets of longstanding poverty, while Nixon called for a reliance on private enterprise and reduction of government spending. Both candidates faced criticism as well; Nixon had to defend Dwight Eisenhower’s domestic policies, while Kennedy, who was attempting to become the first Catholic president, had to counteract questions about his faith and convince voters that he was experienced enough to lead.

One of the most notable events of the Nixon-Kennedy presidential campaign was their televised debate in September, the first of its kind between major presidential candidates. The debate focused on domestic policy and provided Kennedy with an important moment to present himself as a composed, knowledgeable statesman. In contrast, Nixon, an experienced debater who faced higher expectations, looked sweaty and defensive. Radio listeners famously thought the two men performed equally well, but the TV audience was much more impressed by Kennedy, giving him an advantage in subsequent debates. Ultimately, the election was extraordinarily close; in the largest voter turnout in American history up to that point, Kennedy bested Nixon by less than one percentage point (34,227,096 to 34,107,646 votes). Although Kennedy’s lead in electoral votes was more comfortable at 303 to 219, the Democratic Party’s victory did not translate into gains in Congress, where Democrats lost a few seats in both houses. As a result, Kennedy entered office in 1961 without the mandate necessary to achieve the ambitious agenda he would refer to as the New Frontier.

Kennedy’s assassination in Dallas in November of 1963 left the nation in a malaise. With the youthful, popular president gone, Vice President Lyndon Johnson was sworn in and tasked with fulfilling the liberal promises of the New Frontier. On a May morning in 1964, President Johnson laid out a sweeping vision for a package of domestic reforms known as the Great Society. Speaking before that year’s graduates of the University of Michigan, Johnson called for “an end to poverty and racial injustice” and challenged both the graduates and American people to “enrich and elevate our national life, and to advance the quality of our American civilization.” At its heart, he promised, the Great Society would uplift racially and economically disfranchised Americans, too long denied access to federal guarantees of equal democratic and economic opportunity, while simultaneously raising all Americans’ standards and quality of life.

The Great Society’s legislation was breathtaking in scope, and many of its programs and agencies are still with us today. Most importantly, the Civil Rights Act of 1964 and the Voting Rights Act of 1965 codified federal support for many of the civil rights movement’s goals by prohibiting job discrimination, abolishing the segregation of public accommodations, and providing vigorous federal oversight of southern states’ primary and general election laws in order to guarantee minority access to the ballot. Ninety years after Reconstruction, these measures effectively ended Jim Crow.

In addition to this civil rights orientation, however, the Great Society took on a range of quality of life concerns that seemed solvable at last in a society of such affluence. It established the first federal Food Stamp Program. Medicare and Medicaid would ensure access to quality medical care for the aged and poor. In 1965, the Elementary and Secondary Education Act was the first sustained and significant federal investment in public education, totaling more than $1 billion. Significant funds were poured into colleges and universities as well. To “elevate and enrich our national life,” the Great Society also established the National Endowment for the Arts and the National Endowment for the Humanities, federal investments in arts and letters that fund American cultural expression to this day.

While these programs persisted and even thrived, in the years immediately following this flurry of legislative activity, the national conversation surrounding Johnson’s domestic agenda largely focused on the $3 billion spent on War on Poverty programming within the Great Society’s Economic Opportunity Act of 1964. No EOA program was more controversial than Community Action, considered the cornerstone antipoverty program. Johnson’s antipoverty planners felt the key to uplifting disfranchised and impoverished Americans was involving poor and marginalized citizens in the actual administration of poverty programs, what they called “maximum feasible participation.” Community Action Programs would give disfranchised Americans a seat at the table in planning and executing federally funded programs that were meant to benefit themselves—a significant sea change in the nation’s efforts to confront poverty, which had historically relied upon local political and business elites or charitable organizations for administration.

In fact, Johnson himself had never conceived of poor Americans running their own poverty programs. While the president’s rhetoric offered a stirring vision of the future, he had singularly old-school notions for how his poverty policies would work. In contrast to “maximum feasible participation,” the President imagined a second New Deal: local elite-run public works camps that would instill masculine virtues in unemployed young men. Community Action almost entirely bypassed local administrations and sought to build grassroots civil rights and community advocacy organizations, many of which had originated in the broader civil rights movement. Despite widespread support for most Great Society programs, the War on Poverty increasingly became the focal point of domestic criticisms from the left and right. On the left, frustrated liberals recognized the president’s resistance to empowering minority poor and also assailed the growing war in Vietnam, the cost of which undercut domestic poverty spending. As racial unrest and violence swept across urban centers, critics from the right lambasted federal spending for “unworthy” and even criminal citizens. When Richard Nixon was elected in 1968, he moved swiftly to return control over federal poverty spending to local political elites.

Despite the fact that the Civil Rights and Voting Rights Acts and the War on Poverty were crucial catalysts for the rise of Republicans in the South and West, Nixon and subsequent presidents and Congresses have left largely intact the bulk of the Great Society. Many of its programs such as Medicare and Medicaid, food stamps, federal spending for arts and literature, and Head Start are considered by many to be effective forms of government action. Even Community Action programs, so fraught during their few short years of activity, inspired and empowered a new generation of minority and poverty community activists who had never before felt, as one put it, “this government is with us.”

While much of the rhetoric surrounding the 1960s focused on a younger, more liberal generation’s progressive ideas, conservatism maintained a strong presence on the American political scene. Few political figures in the decade embodied the working-class, conservative views held by millions of Americans quite like George Wallace. Wallace’s vocal stance on segregation was immortalized in his 1963 inaugural address as Alabama governor with the phrase: “Segregation now, segregation tomorrow, segregation forever!” Just as the civil rights movement began to gain unprecedented strength, Wallace became the champion of the many white southerners uninterested in the movement’s goals. Consequently, Wallace was one of the best examples of the very real opposition civil rights activists faced in the late twentieth century.

As governor, Wallace used his position to enforce segregation whenever possible. Just five months after becoming governor, in his "Stand in the Schoolhouse Door," Wallace himself tried to block two African American students from enrolling at the University of Alabama. His efforts were largely symbolic, but they earned him national recognition as a political figure willing to fight for what many southerners saw as their traditional way of life. Wallace made similar efforts to block the federally mandated integration of his state's public elementary and secondary schools in the fall of 1963. In each case, President John F. Kennedy had to supersede Wallace's actions to ensure integration moved forward.

Alabama governor George Wallace stands defiantly at the door of the University of Alabama, blocking the attempted integration of the school. Wallace was perhaps the most notoriously pro-segregation politician of the 1960s, proudly proclaiming in his 1963 inaugural address “segregation now, segregation tomorrow, segregation forever.” Warren K. Leffler, “[Governor George Wallace attempting to block integration at the University of Alabama],” June 11, 1963. Library of Congress, http://www.loc.gov/pictures/item/2003688161/.


In contrast to his traditional stance on southern race relations, Wallace took a decidedly nontraditional approach to maintaining power at the end of his term as governor. Because Alabama at that time allowed governors to serve only one term, Wallace persuaded his wife, Lurleen, to run for governor so that he could use his influence with her to help shape state politics. Not only did Lurleen win, but Wallace supporters also helped remove the term limits on governors, opening up future opportunities for him to serve again. Wallace entered the national political fray in 1968, when he made an unsuccessful presidential bid as a third-party candidate. After 1970, he served three more terms as governor of Alabama, survived an assassination attempt while campaigning for president, and eventually repudiated the segregationist views that made him so famous.

Beleaguered by an unpopular war, inflation, and domestic unrest, President Johnson opted against reelection in March of 1968, an unprecedented move in modern American politics. The forthcoming presidential election was shaped by Vietnam and the aforementioned unrest as much as by the campaigns of Democratic nominee Vice President Hubert Humphrey, Republican Richard Nixon, and third-party challenger George Wallace. The Democratic Party was in disarray in the spring of 1968, when senators Eugene McCarthy and Robert Kennedy challenged Johnson's nomination and the president responded with his shocking announcement. Nixon's candidacy was aided further by riots that broke out across the country after the assassination of Martin Luther King, Jr., and by the shock and dismay that followed the slaying of Robert Kennedy in June. The Republican nominee's campaign was defined by the shrewd management of his public appearances and a pledge to restore peace and prosperity to what he called "the silent center; the millions of people in the middle of the political spectrum." This campaign appeal was carefully calibrated to attract suburban Americans by linking liberals in favor of an overbearing federal government with the Silent Majority's implied inverse: noisy urban minorities. Many embraced Nixon's message; a September 1968 poll found that 80 percent of Americans believed public order had "broken down."

Meanwhile, Humphrey struggled to distance himself from Johnson and to maintain working-class support in northern cities, where voters were drawn to Wallace's appeals for law and order and a rejection of civil rights. The vice president made a final surge in those cities with the aid of union support, but it was not enough to best Nixon. The final tally was close: Nixon won 43.3 percent of the popular vote (31,783,783), narrowly besting Humphrey's 42.7 percent (31,266,006). Wallace, meanwhile, carried five states in the Deep South, and his 13.5 percent (9,906,473) of the popular vote constituted an impressive showing for a third-party candidate. The Electoral College vote was more decisive for Nixon; he earned 302 electoral votes, while Humphrey and Wallace received only 191 and 45 votes, respectively. Although Republicans won a few seats, Democrats retained control of both the House and Senate, making Nixon the first president in 120 years to enter office with the opposition party controlling both houses.

VI. Foreign Affairs

The United States entered the 1960s unaccustomed to stark foreign policy failures, having emerged from World War II as a global superpower before waging a Cold War against the Soviet Union in the 1950s. In the new decade, unsuccessful conflicts in Cuba and Vietnam would yield embarrassment, fear, and tragedy, stunning a nation used to triumph and altering the way many thought of America’s role in international affairs.

On January 8, 1959, Fidel Castro and his forces triumphantly entered Havana and initiated a new era in Cuban history. Castro and compatriots such as Che Guevara and Celia Sánchez had much to celebrate as they made their way through the city’s streets. After losing American support, Cuban President Fulgencio Batista had fled the nation the previous week, ending the long war Castro’s forces and countless other armed revolutionary factions had fought to oust the dictator. The United States initially expressed public sympathy with Castro’s government, which was immediately granted diplomatic recognition. Behind the scenes, however, President Dwight Eisenhower and members of his administration were wary of the new leader. The relationship between the two governments rapidly deteriorated following Castro’s April 1959 visit to Washington, which included a troubled meeting with Vice President Richard Nixon. On October 19, 1960, the United States instituted a trade embargo to economically isolate the Cuban regime, and in January 1961 the two nations broke off formal diplomatic relations.

As the new Cuban government instituted leftist policies that centered on agrarian reform, land redistribution, and the nationalization of private enterprises, Cuba’s wealthy and middle class citizens fled the island in droves and began to settle in Miami and other American cities. The Central Intelligence Agency, acting under the mistaken belief that the Castro government lacked popular support and that Cuban citizens would revolt if given the opportunity, began to recruit members of the exile community to participate in an invasion of the island. On April 16, 1961, an invasion force consisting primarily of Cuban émigrés landed on Girón Beach at the Bay of Pigs. Cuban soldiers and civilians quickly overwhelmed the exiles, many of whom were taken prisoner. The Cuban government’s success at thwarting the Bay of Pigs invasion did much to legitimize the new regime and was a tremendous embarrassment for the Kennedy administration.

As the political relationship between Cuba and the United States disintegrated, the Castro government became more closely aligned with the Soviet Union. This strengthening of ties set the stage for the Cuban Missile Crisis, perhaps the most dramatic foreign policy crisis in the history of the United States. In 1962, in response to the US’s long-time maintenance of a nuclear arsenal in Turkey and at the invitation of the Cuban government, the Soviet Union deployed nuclear missiles in Cuba. On October 14, 1962, American spy planes detected the construction of missile launch sites, and on October 22, President Kennedy addressed the American people to alert them to this threat. Over the course of the next several days, the world watched in horror as the United States and the Soviet Union hovered on the brink of nuclear war. Finally, on October 28, the Soviet Union agreed to remove its missiles from Cuba in exchange for a US agreement to remove its missiles from Turkey and a formal pledge that the United States would not invade Cuba, and the crisis was resolved peacefully.

The Cuban Missile Crisis was a time of great fear throughout America. Women in this photograph urged President Kennedy to be cautious of instigating war. Phil Stanziola, “800 women strikers for peace on 47 St near the UN Bldg,” 1962. Library of Congress, http://www.loc.gov/pictures/item/2001696167/.


Though the Cuban Missile Crisis temporarily halted the flow of Cuban refugees into the United States, emigration resumed in earnest in the mid-1960s. In 1965, the Johnson administration and the Castro government brokered a deal that facilitated the reunion of families that had been separated by earlier waves of migration, opening the door for thousands to leave the island. In 1966, President Lyndon B. Johnson signed the Cuban Adjustment Act, a law granting automatic permanent residency to any Cuban who entered the United States. Over the course of the 1960s, hundreds of thousands of Cubans left their homeland and began to build new lives for themselves in America.

American involvement in the Vietnam War began during the age of decolonization. With the Soviet Union backing nationalist movements across the globe, the United States feared the expansion of communist influence and pledged to confront communist revolutions in the Truman Doctrine. Between 1946 and 1954, France fought a counterinsurgency campaign against the nationalist Vietminh forces led by Ho Chi Minh. America assisted the French war effort with funds, arms, and advisors. On the eve of the Geneva Peace Conference in 1954, Vietminh forces defeated the French army at Dien Bien Phu. The conference temporarily divided Vietnam into two separate states until United Nations-monitored elections occurred. Elections, however, never transpired as the US feared a Communist victory. Consequently the US established the Republic of Vietnam, or South Vietnam, with Ngo Dinh Diem serving as prime minister. America viewed Diem favorably; although he was a nationalist, Diem was anticommunist and had lived in the US. In 1955, the CIA supported Diem in his bid to defeat all opposing political elements in South Vietnam.

A series of events hampered America and South Vietnam's early effort against communist forces. The Battle of Ap Bac in 1963 demonstrated that South Vietnam was not fully prepared for the challenges of an insurgency. Despite a clear numerical advantage, as well as mechanized and airborne infantry, Army of the Republic of Vietnam (ARVN) forces were mauled by Vietcong (VC) units. Modeled after the US Army, ARVN was too technology-dependent to operate without US assistance. In the wake of Diem's assassination and the merry-go-round of subsequent military dictators, the situation in South Vietnam further deteriorated. In 1964, the USS Maddox reported incoming fire from North Vietnamese ships. Although the validity of the Gulf of Tonkin incident remains questionable, the event resulted in the Gulf of Tonkin Resolution. This act of Congress provided Johnson with the power to defend Southeast Asia with any measures he deemed necessary. By 1965, US forces sought to engage the VC and the North Vietnamese Army (NVA) in battle. Under General William Westmoreland, head of Military Assistance Command, Vietnam (MACV), defeating the VC and NVA was the top priority. MACV commenced a war of attrition meant to exact a human toll Hanoi could not bear. The use of helicopters to transport soldiers into battle, the reliance on kill ratios, and the failure to retain hard-won ground came to epitomize the war.

Although American officials like Westmoreland and Secretary of Defense Robert McNamara claimed a communist defeat was on the horizon, by 1968 the realities in Vietnam proved otherwise. On January 30, during the Vietnamese lunar new year of Tet, VC and NVA forces launched a massive, nationwide assault against South Vietnam’s major population centers. The Communist offensive failed to topple the Saigon government and American and South Vietnamese troops decimated the VC ranks.

The 1968 Tet Offensive was indeed a turning point in the Vietnam War. Because the offensive was a major setback for communist forces, American-sponsored nation-building efforts flourished across much of South Vietnam. Yet the fallout from Tet proved a public relations triumph for North Vietnam. In the first truly televised war, scenes of the fighting, particularly those from Tet, fueled antiwar movements in the US. Images from Vietnam presented an out-of-control conflict in which Americans were needlessly dying. The My Lai Massacre, in which US soldiers killed unarmed South Vietnamese civilians in March of 1968, further soured public opinion of the war and contributed to the misconception that all American soldiers were murderers.

When the most trusted news anchor in America, Walter Cronkite, declared that the US could not win the war, the Johnson administration knew it had lost public support. With growing antiwar sentiment after years of endless war, Johnson excused himself from the upcoming 1968 presidential election.

After Richard Nixon was elected, his administration sought to disengage America from the war in Vietnam. American combat forces were withdrawn from South Vietnam in a process called Vietnamization. Dubbed "peace with honor," this process amounted to the American abandonment of South Vietnam as the US entered into peace negotiations with the North Vietnamese. Peace talks, however, stalled as both sides refused to compromise. Hoping the absence of American ground forces would afford a quick victory for Hanoi, the NVA launched a massive assault on South Vietnam known as the 1972 Easter Offensive. Only resilient ARVN units and US airpower stymied the NVA offensive. Consequently, secret negotiations between Hanoi and Washington resulted in the 1973 Paris Peace Accords. Omitted from the negotiations, the South Vietnamese felt betrayed, as the accords permitted NVA units to occupy South Vietnamese territory and severely curtailed US monetary and military support for Saigon. Without such assistance, the Republic of Vietnam succumbed to communist rule after North Vietnam's 1975 invasion of the country.

The rapid growth of Asian American communities after 1965 emerged from the exigencies of Vietnam and the broader Cold War. Aware that the nation's discriminatory immigration laws favoring Western European immigrants were a liability in the Cold War, the US Congress passed the Hart-Celler Immigration Act of 1965; the act supplanted immigration laws based on national quotas with a system that provided a pathway for skilled workers and the reunification of families. This sparked the migration of scientists, engineers, and other researchers from East Asia who participated in the nation's defense research industry. Given the shortage of medical professionals in the nation's urban and rural areas and America's neo-colonial relationship with the Philippines, at least 25,000 Filipina nurses came to the US in the decades after the 1965 Immigration Act was passed.

The end of America’s wars in Southeast Asia triggered the dislocation of thousands of refugees from Vietnam, Laos, and Cambodia.  After an initial exodus of about 100,000 Vietnamese refugees who were characterized by their relative wealth, education, and connections with the US government, over 300,000 additional Vietnamese refugees fled the political and economic instability wrought by the institution of re-education camps and a border war with China.  This next wave of migration also included thousands of Cambodians fleeing the Khmer Rouge genocide and smaller numbers of Cham, Mien, and Lao refugees. Although the US sought to assimilate refugees as quickly as possible by dispersing them into America’s interior, Southeast Asian Americans migrated to be with their co-ethnics or to areas with existing Asian communities once they accrued enough capital. Since the 1970s, Asian migrants have been economically diverse, and recent immigration trends have considerably changed the landscape of Asian America.  From 2000 to 2010 the Asian American population grew by 46 percent, and there are now over 17.3 million Asian Americans in the US.

 

VII. Conclusion

In 1969, Americans hailed the moon landing as a profound victory in the “space race” against the Soviet Union that fulfilled the promise of the late John F. Kennedy, who had declared in 1961 that the U.S. would put a man on the moon by the end of the decade. But while Neil Armstrong said his steps marked “one giant leap for mankind,” and Americans marveled at the achievement, the brief moment of wonder only punctuated years of turmoil. The Vietnam War disillusioned a generation, riots rocked cities, protests hit campuses, and assassinations robbed the nation of many of its leaders. The forward-thinking spirit of a complex decade had waned. Uncertainty loomed.

 

This chapter was edited by Samuel Abramson, with content contributions by Samuel Abramson, Marsha Barrett, Brent Cebul, Michell Chresfield, William Cossen, Jenifer Dodd, Michael Falcone, Leif Fredrickson, Jean-Paul de Guzman, Jordan Hill, William Kelly, Lucie Kyrova, Maria Montalvo, Emily Prifogle, Ansley Quiros, Tanya Roth, and Robert Thompson.


26. The Affluent Society

Migrant Farm Workers, 1959, Michael Rougier—Time & Life Pictures/Getty Images. Via Life.


*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

In 1958, Harvard economist and public intellectual John Kenneth Galbraith published The Affluent Society. Galbraith's celebrated book examined America's new post-World War II consumer economy and political culture. The book, which popularized phrases such as "conventional wisdom," noted the unparalleled riches of American economic growth but criticized the underlying structures of an economy dedicated to increasing production and the consumption of goods. Galbraith argued that the United States' economy, based on an almost hedonistic consumption of luxury products, would inevitably produce economic inequality as private-sector interests enriched themselves at the expense of the American public. Galbraith warned that an economy in which "wants are increasingly created by the process by which they are satisfied" was unsound, unsustainable, and, ultimately, immoral. "The Affluent Society," he said, was anything but.

The contradictions that Galbraith noted marked the decade of the 1950s. While economists and scholars continue to debate the merits of Galbraith's warnings and predictions, his analysis was so insightful that the title of his book has come to serve as a ready label for postwar American society. In the almost two decades after the end of World War II, the American economy witnessed massive and sustained growth that reshaped American culture through the abundance of consumer goods. Standards of living climbed to unparalleled heights. All income levels shared in the prosperity, and inequality plummeted in what some economists have called "The Great Compression."

And yet, as Galbraith noted, the Affluent Society had fundamental flaws. The new consumer economy that lifted millions of Americans into its burgeoning middle class also produced inequality. Women struggled to claim equal rights as full participants in American society. The ranks of America's poor struggled to win access to good schools, good healthcare, and good jobs. The Jim Crow South tenaciously defended segregation, and American blacks and other minorities everywhere suffered discrimination. The suburbs gave middle-class Americans new space but left cities to wither in spirals of poverty and crime.

It is the contradictions of the Affluent Society that define a decade of unrivaled prosperity and crippling poverty, of expanded opportunity and entrenched discrimination, and of new lifestyles and stifling conformity.

 

II. The Rise of the Suburbs

Levittown, early 1950s, via Flickr user markgregory.


While the electric streetcar of the late nineteenth century facilitated the outward movement of the well-to-do, the seeds of a suburban nation were planted in the mid-twentieth century. At the height of the Great Depression, in 1932, some 250,000 households lost their property to foreclosure. A year later, half of all U.S. mortgages were in default. The foreclosure rate stood at more than 1,000 per day. In response, FDR's New Deal created the Home Owners Loan Corporation (HOLC), which began purchasing and refinancing existing mortgages at risk of default. HOLC introduced the amortized mortgage, allowing borrowers to pay back interest and principal over twenty to thirty years instead of the then-standard five-year mortgage that carried a large balloon payment at the end of the contract. Though homeowners paid more for their homes under this new system, homeownership was opened to the multitudes, who could now gain residential stability, lower monthly mortgage payments, and accrue equity and wealth as property values rose over time.

Additionally, the Federal Housing Administration (FHA), another New Deal organization, increased access to homeownership by insuring mortgages and protecting lenders from financial loss in the event of a default. Though only slightly more than a third of homes had an FHA-backed mortgage by 1964, FHA loans had a ripple effect, with private lenders granting more and more home loans even to borrowers without FHA backing. Though these programs started in the midst of the Great Depression, the effects of government programs and subsidies like HOLC and the FHA were fully felt in the postwar economy, where they fueled the growth of homeownership and the rise of the suburbs.

Though domestic spending programs like HOLC and FHA helped create the outlines of the new consumer economy, United States involvement and the Allied victory in World War II pushed the country out of depression and into a sustained economic boom. Wartime spending exploded and, after the war, sustained spending fueled further growth. Government expenditures provided loans to veterans, subsidized corporate research and development, and built the Interstate Highway System. In the decades after World War II, business boomed, unionization peaked, wages rose, and sustained growth buoyed a new consumer economy. The Servicemen’s Readjustment Act (The G.I. Bill), passed in 1944, offered low-interest home loans, a stipend to attend college, loans to start a business, and unemployment benefits.

The rapid growth of homeownership and the rise of suburban communities helped drive the postwar economic boom. Suburban neighborhoods of single-family homes tore their way through the outskirts of cities. William Levitt built the first Levittown, the archetypal suburban community, in 1946 on Long Island, New York. By purchasing mass acreage, "subdividing" lots, and contracting crews to build countless homes at economies of scale, Levitt offered affordable suburban housing to veterans and their families. Levitt became the prophet of the new suburbs, heralding a massive internal migration. The country's suburban share of the population rose from 19.5% in 1940 to 30.7% by 1960. Homeownership rates rose from 44% in 1940 to almost 62% in 1960. Between 1940 and 1950, suburban communities of greater than 10,000 people grew 22.1%, and planned communities grew at an astonishing rate of 126.1%. As historian Lizabeth Cohen notes, these new suburbs "mushroomed in territorial size and the populations they harbored." Between 1950 and 1970, America's suburban population nearly doubled to 74 million, with 83 percent of all population growth occurring in suburban places.

The postwar construction boom fed into countless industries. As manufacturers converted back to consumer goods after the war, and as the suburbs developed, appliance and automobile sales rose dramatically. Flush with rising wages and wartime savings, homeowners also used newly created installment plans to buy new consumer goods at once instead of saving for years to make major purchases. The mass-distribution of credit cards, first issued in 1950, further increased homeowners’ access to credit. Fueled by credit and no longer stymied by the Depression or wartime restrictions, consumers bought countless washers, dryers, refrigerators, freezers, and, suddenly, televisions. The percentage of Americans that owned at least one television increased from 12% in 1950 to more than 87% in 1960. This new suburban economy also led to increased demand for automobiles. The percentage of American families owning cars increased from 54% in 1948 to 74% in 1959. Motor fuel consumption rose from some 22 million gallons in 1945 to around 59 million gallons in 1958.

While the car had been around for decades by the 1950s, car culture really took off as a national fad during the decade. Arthur C. Base, August 1950 issue of Science and Mechanics. Wikimedia, http://commons.wikimedia.org/wiki/File:Car_of_the_Future_1950.jpg.


The rise of the suburbs transformed America's countryside as suburban growth reclaimed millions of acres of rural space, turning agrarian communities into suburban landscapes. As suburban homeowners retreated from the cities into new developments, those same developments wrenched more and more agricultural workers off the land, often pushing them into the very cities that suburbanites were fleeing.

The process of suburbanization drove the movement of Americans and turned the wheels of the new consumer economy. Seen from a macroeconomic level, the postwar economic boom turned America into a land of economic abundance. For advantaged buyers, loans had never been easier to attain, consumer goods had never been more accessible, and well-paying jobs had never been more abundant. And yet, beneath the aggregate numbers, patterns of racial disparity, sexual discrimination, and economic inequality persisted and called into question many of the assumptions of an Affluent Society.

In 1939 real estate appraisers arrived in sunny Pasadena, California. Armed with elaborate questionnaires to evaluate the city’s building conditions, the appraisers were well-versed in the policies of the Home Owners Loan Corporation (HOLC). In one neighborhood, the majority of structures were rated in “fair” repair and it was noted that there was a lack of “construction hazards or flood threats.” However, appraisers concluded that the area “is detrimentally affected by 10 owner occupant Negro families.” While “the Negroes are said to be of the better class,” the appraisers concluded, “it seems inevitable that ownership and property values will drift to lower levels.”

While suburbanization and the new consumer economy produced unprecedented wealth and affluence, the fruits of this economic and spatial abundance did not reach all Americans equally. The new economic structures and suburban spaces of the postwar period produced perhaps as much inequality as affluence. Wealth created by the booming economy filtered through social structures with built-in privileges and prejudices. Just when many middle and lower class white American families began their journey of upward mobility by moving to the suburbs with the help of government spending and government programs such as the FHA and the GI Bill, many African Americans and other racial minorities found themselves systematically shut out.

A look at the relationship between federal organizations such as the HOLC and FHA and private banks, lenders, and real estate agents tells the story of standardized policies that produced a segregated housing market. At the core of HOLC appraisal techniques, which private parties also adopted, was the pernicious insistence that mixed-race and minority dominated neighborhoods were credit risks. In partnership with local lenders and real estate agents, HOLC created Residential Security Maps to identify high and low risk-lending areas. People familiar with the local real estate market filled out uniform surveys on each neighborhood. Relying on this information, HOLC assigned every neighborhood a letter grade from A to D and a corresponding color code. The least secure, highest risk neighborhoods for loans received a D grade and the color red. Banks refused to loan money in these “redlined” areas.


Black communities in cities like Detroit, Chicago, Brooklyn, and Atlanta experienced “redlining,” the process by which banks and other organizations demarcated minority neighborhoods on a map with a red line. Doing so made visible the areas they believed were unfit for their services, denying black residents loans, housing, groceries, and other necessities of modern life. Redlined Map of Greater Atlanta, http://blog.historian4hire.net/wp-content/uploads/2014/01/HOLC-RedlineMap-NARA.jpg.


1938 Brooklyn Redline map. UrbanOasis.org via National Archives (NARA II RG 195 Entry 39 Folder “Brooklyn (Kings Co.)” Box 58). Edited by ASommer, PlaNYourCity. https://planyourcity.files.wordpress.com/2014/04/bk-map.jpg.

Phrases like “subversive racial elements” and “racial hazards” pervade the redlined area description files of surveyors and HOLC officials. Los Angeles’ Echo Park neighborhood, for instance, had concentrations of Japanese and African Americans and a “sprinkling of Russians and Mexicans.” The HOLC security map and survey noted that the neighborhood’s “adverse racial influences which are noticeably increasing inevitably presage lower values, rentals and a rapid decrease in residential desirability.”

While the HOLC was a fairly short-lived New Deal agency, the influence of its security maps lived on in the Federal Housing Administration (FHA) and the GI Bill-dispensing Veterans Administration (VA). Both of these government organizations, which set the standard that private lenders followed, refused to back bank mortgages that did not adhere to HOLC's security maps. On the one hand, FHA and VA backed loans were an enormous boon to those who qualified for them. Millions of Americans received mortgages that they otherwise would not have qualified for. But FHA-backed mortgages were not available to all. Racial minorities could not get loans for property improvements in their own neighborhoods, which were deemed credit risks, and they were denied mortgages to purchase property in other areas for fear that their presence would extend the red line into a new community. Levittown, the poster child of the new suburban America, only allowed whites to purchase homes. Thus HOLC policies and private developers increased homeownership and stability for white Americans while simultaneously creating and enforcing racial segregation.

The exclusionary structures of the postwar economy pushed African Americans and other minorities to protest. Over time the federal government attempted to rectify the racial segregation created, or at least facilitated, in part by its own policies. In 1948, the U.S. Supreme Court ruled in Shelley v. Kraemer that explicitly racial housing covenants could not be enforced by the courts, removing one legal prop of residential segregation. It would be years, however, until housing acts passed in the 1960s could provide some federal muscle to complement grassroots attempts to ensure equal access.

During the 1950s and early 1960s many Americans retreated to the suburbs to enjoy the new consumer economy and search for some normalcy and security after the instability of depression and war. But many could not. It was both the limits and opportunities of housing that shaped the contours of postwar American society.

 

III. Race and Education

School desegregation was a tense experience for all involved, but none more so than the African American students brought into white schools. The “Little Rock Nine” were the first to do this in Arkansas; their escorts, the 101st Airborne Division of the U.S. Army, provided protection to these students who so bravely took that first step. Photograph, 1957. Wikimedia, http://commons.wikimedia.org/wiki/File:101st_Airborne_at_Little_Rock_Central_High.jpg.


Older battles over racial exclusion also confronted postwar American society. One long-simmering struggle targeted segregated schooling. Since the Supreme Court's decision in Plessy v. Ferguson (1896), black Americans, particularly in the American South, had fully felt the deleterious effects of segregated education. Their battle against Plessy for inclusion in American education stretched across half a century before the Supreme Court again took up the merits of "separate but equal."

On May 17, 1954, after two years of argument, re-argument, and deliberation, Chief Justice Earl Warren announced the Supreme Court’s decision on segregated schooling in Oliver Brown, et al v. Board of Education of Topeka, et al. The court found by a unanimous 9-0 vote that racial segregation violated the Equal Protection Clause of the Fourteenth Amendment. The court’s decision declared, “Separate educational facilities are inherently unequal.” “Separate but equal” was made unconstitutional.

Decades of African American-led litigation, local agitation against racial inequality, and liberal Supreme Court justices made Brown v. Board possible. In the early 1930s, the National Association for the Advancement of Colored People (NAACP) began a concerted effort to erode the legal underpinnings of segregation in the American South. Legal, or de jure, segregation subjected racial minorities to discriminatory laws and policies. Law and custom in the South hardened anti-black restrictions. But through a series of carefully chosen and contested court cases concerning education, disfranchisement, and jury selection, NAACP lawyers such as Charles Hamilton Houston, Robert L. Carter, and future Supreme Court Justice Thurgood Marshall undermined Jim Crow's constitutional underpinnings. Initially seeking to demonstrate that states systematically failed to provide African American students "equal" resources and facilities, and thus failed to live up to Plessy, by the late 1940s activists began to more forcefully challenge the assumption that "separate" was constitutional at all.

The NAACP was a central organization in the fight to end segregation, discrimination, and injustice based on race. NAACP leaders, including Thurgood Marshall (who would become the first African American Supreme Court Justice), hold a poster decrying racial bias in Mississippi in 1956. Photograph, 1956. Library of Congress, http://www.loc.gov/pictures/item/99401448/.


Though remembered as just one lawsuit, Brown consolidated five separate cases from across the country: Briggs v. Elliott (South Carolina), Davis v. County School Board of Prince Edward County (Virginia), Gebhart v. Belton (Delaware), Bolling v. Sharpe (Washington, D.C.), and Brown v. Board of Education (Kansas). Working with local activists already involved in desegregation fights, the NAACP purposely chose cases with a diverse set of local backgrounds to show that segregation was not just an issue in the Deep South, and that a sweeping judgment on the fundamental constitutionality of Plessy was needed.

Briggs v. Elliott had illustrated, on the one hand, the extreme deficiencies in segregated black schools. The first case accepted by the NAACP, Briggs originated in rural Clarendon County, South Carolina, where taxpayers in 1950 spent $179 to educate each white student while spending $43 for each black student. The district’s twelve white schools were cumulatively worth $637,850; the value of its sixty-one black schools (mostly dilapidated, over-crowded shacks), was $194,575. While Briggs underscored the South’s failure to follow Plessy, the Brown v. Board suit focused less on material disparities between black and white schools (which were significantly less than in places like Clarendon County) and more on the social and spiritual degradation that accompanied legal segregation. This case cut to the basic question of whether or not “separate” was itself inherently unequal. The NAACP said the two notions were incompatible. As one witness before the U. S. District Court of Kansas said, “the entire colored race is craving light, and the only way to reach the light is to start [black and white] children together in their infancy and they come up together.”

To make its case, the NAACP marshaled historical and social scientific evidence. The Court found the historical evidence inconclusive and drew its ruling more heavily from the NAACP's argument that segregation psychologically damaged black children. To make this argument, association lawyers relied on social scientific evidence, such as the famous doll experiments of Kenneth and Mamie Clark. The Clarks demonstrated that while young white girls would naturally choose to play with white dolls, young black girls would, too. The Clarks argued that black children's aesthetic and moral preference for white dolls demonstrated the pernicious effects and self-loathing produced by segregation.

Identifying and denouncing injustice, though, is different from rectifying it. Though Brown repudiated Plessy, the Court’s orders did not extend to segregation in places other than public schools and, even then, while recognizing the historical importance of the decision, the justices set aside the divisive yet essential question of remediation and enforcement to preserve a unanimous decision. Their infamously ambiguous order in 1955 (what came to be known as Brown II) that school districts desegregate “with all deliberate speed” was so vague and ineffectual that it left the actual business of desegregation in the hands of those who opposed it.

In most of the South, as well as the rest of the country, school integration did not occur on a wide scale until well after Brown. Only in the 1964 Civil Rights Act did the federal government finally implement some enforcement of the Brown decision by threatening to withhold funding from recalcitrant school districts, financially compelling desegregation, but even then southern districts found loopholes. Court decisions such as Green v. New Kent County (1968) and Alexander v. Holmes (1969) finally closed some of those loopholes, such as “freedom of choice” plans, to compel some measure of actual integration.

When Brown finally was enforced in the South, the quantitative impact was staggering. In the early 1950s, virtually no southern black students attended white schools. By 1968, fourteen years after Brown, some eighty percent of black southerners remained in schools that were ninety- to one-hundred-percent nonwhite. By 1972, though, just twenty-five percent were in such schools, and fifty-five percent remained in schools with a simple nonwhite minority. By many measures, the public schools of the South ironically became the most integrated in the nation.

As a landmark moment in American history, Brown’s significance perhaps lies less in what immediate tangible changes it wrought in African American life—which were slow, partial, and inseparable from a much longer chain of events—than in the idealism it expressed and the momentum it created. The nation’s highest court had attacked one of the fundamental supports of Jim Crow segregation and offered constitutional cover for the creation of one of the greatest social movements in American history.

 

IV. Civil Rights in an Affluent Society

Segregation extended beyond private business property; this segregated drinking fountain was located on the grounds of the Halifax County courthouse in North Carolina. Photograph, April 1938. Wikimedia, http://commons.wikimedia.org/wiki/File:Segregation_1938b.jpg.


Education was but one aspect of the nation’s Jim Crow machinery. African Americans had been fighting against a variety of racist policies, cultures and beliefs in all aspects of American life. And while the struggle for black inclusion had few victories before World War II, the war and the “Double V” campaign as well as the postwar economic boom led to rising expectations for many African Americans. When persistent racism and racial segregation undercut the promise of economic and social mobility, African Americans began mobilizing on an unprecedented scale against the various discriminatory social and legal structures.

While many of the civil rights movement's most memorable and important moments, such as the sit-ins, the freedom rides, and especially the March on Washington, occurred in the 1960s, the 1950s were a significant decade in the sometimes-tragic, sometimes-triumphant march of civil rights in the United States. In 1953, years before Rosa Parks' iconic confrontation on a Montgomery city bus, an African American woman named Sarah Keys publicly challenged segregated public transportation. Keys, then serving in the Women's Army Corps, traveled from her army base in New Jersey back to North Carolina to visit her family. When the bus stopped in North Carolina, the driver asked her to give up her seat for a white customer. Her refusal to do so landed her in jail in 1953 and led to a landmark 1955 decision, Sarah Keys v. Carolina Coach Company, in which the Interstate Commerce Commission ruled that segregated seating in interstate bus travel violated the Interstate Commerce Act. Though poorly enforced, the ruling nevertheless gave legal cover to the freedom riders years later. Moreover, it was a morale-building decision. Six days after the decision was announced, Rosa Parks refused to give up her seat in Montgomery.

But if some events encouraged civil rights workers with the promise of progress, others were so savage they convinced activists that they could do nothing but resist. In the summer of 1955, two white men in Mississippi kidnapped and brutally murdered a fourteen-year-old boy, Emmett Till. Till, visiting from Chicago and perhaps unfamiliar with the etiquette of Jim Crow, allegedly whistled at a white woman named Carolyn Bryant. Her husband, Roy Bryant, and another man, J.W. Milam, abducted Till from his relatives' home, beat him, mutilated him, shot him, and threw his body in the Tallahatchie River. But the body was found. Emmett's mother held an open-casket funeral so that Till's disfigured body could make national news. The men were brought to trial. The evidence was damning, but an all-white jury found the two not guilty. Only months after the decision, the two boasted of their crime in Look magazine. For young black men and women soon to propel the civil rights movement, the Till case was an indelible lesson.

Four months after Till's death, Rosa Parks refused to surrender her seat on a Montgomery city bus. Her arrest launched the Montgomery bus boycott, a foundational moment in the civil rights crusade. Montgomery's public transportation system had longstanding rules that required African American passengers to sit in the back of the bus and give up their seats to white passengers when the buses filled. Parks refused to move on December 1, 1955 and was arrested. She was not the first to protest the policy by staying seated on a Montgomery bus, but she was the woman around whom Montgomery activists rallied a boycott.

Soon after Parks' arrest, Montgomery's black population organized behind the recently arrived Baptist minister Martin Luther King Jr. and formed the Montgomery Improvement Association (MIA) to coordinate a widespread boycott. During December 1955 and all of 1956, King's leadership sustained the boycott and thrust him into the national spotlight. The Supreme Court ruled against Montgomery, and on December 20, 1956, King brought the boycott to a successful conclusion, ending segregation on Montgomery's public transportation and establishing his reputation as a national leader in African American efforts for equal rights.

Motivated by the success of the Montgomery boycott, King and other African American leaders looked for ways to continue the fight. In 1957, King helped create the Southern Christian Leadership Conference (SCLC). Unlike the MIA, which targeted one specific policy in one specific city, the SCLC was a coordinating council that helped civil rights groups across the South sustain boycotts, protests, and assaults on southern Jim Crow laws.

As pressure built, Congress passed the Civil Rights Act of 1957, the first such measure passed since Reconstruction. Although the act was nearly compromised away to nothing, it achieved some gains, such as creating the Civil Rights Commission and a new civil rights division within the Department of Justice to investigate claims of racial discrimination, and it signaled that pressure was finally mounting for Americans to confront the racial legacy of slavery and discrimination.

Despite successes at both the local and national level, the civil rights movement faced bitter opposition. Those opposed to the movement often used violent tactics to scare and intimidate African Americans and subvert legal rulings and court orders. For example, a year into the Montgomery bus boycott, angry white southerners bombed four African American churches as well as the homes of King and fellow civil rights leader E. D. Nixon. Though King, Nixon and the MIA persevered in the face of such violence, it was only a taste of things to come. Such unremitting hostility and violence left the outcome of the burgeoning civil rights movement in doubt. Despite its successes, civil rights activists looked back on the 1950s as a decade of at best mixed results and incomplete accomplishments. While the bus boycott, Supreme Court rulings and other civil rights activities signaled progress, church bombings, death threats, and stubborn legislators demonstrated the distance that still needed to be traveled.

 

V. Gender and Culture in the Affluent Society


New inventions to make housework easier and more fun for women proliferated in the post-war era, creating an advertising and marketing frenzy to attract female consumers to certain products. http://envisioningtheamericandream.files.wordpress.com/2013/03/housewives-chores.jpg.

America’s consumer economy reshaped how Americans experienced culture and shaped their identities. The Affluent Society gave Americans new experiences, new outlets, and new ways to understand and interact with one another.

“The American household is on the threshold of a revolution,” the New York Times declared in August 1948. “The reason is television.” A distinct postwar phenomenon, television was actually several years in the making before it transformed American culture. Presented to the American public at the New York World’s Fair in 1939, television’s commercialization in the United States lagged during the war years. In 1947, though, regular full-scale broadcasting became available to the public. Television was instantly popular, so much so that by early 1948 Newsweek reported that it was “catching on like a case of high-toned scarlet fever.” Indeed, between 1948 and 1955 close to two-thirds of the nation’s households purchased a television set. By the end of the 1950s, 90 percent of American families had one, and the average viewer was tuning in for almost five hours a day.

The technological ability to transmit images via radio waves gave birth to television. Television borrowed radio's organizational structure, too. The big radio broadcasting companies, NBC, CBS, and ABC, used their technical expertise and capital reserves to conquer the airwaves. They acquired licenses to local stations and eliminated their few independent competitors. The Federal Communications Commission's (FCC) refusal to issue any new licenses between 1948 and 1955 was a de facto endorsement of the big three's stranglehold on the market.

In addition to replicating radio’s organizational structure, television also looked to radio for content. Many of the early programs were adaptations of popular radio variety and comedy shows, including the Ed Sullivan Show and Milton Berle’s Texaco Star Theater. These were accompanied by live plays, dramas, sports, and situation comedies. Due to the cost and difficulty of recording, most programs were broadcast live, forcing stations across the country to air shows at the same time. And since audiences had a limited number of channels to choose from, viewing experiences were broadly shared. Upwards of two thirds of television-owning households, for instance, watched popular shows such as I Love Lucy.

The limited number of channels and programs meant that networks selected programs that appealed to the widest possible audience to draw viewers and, more importantly, television's greatest financiers: advertisers. By the mid-1950s, an hour of primetime programming cost about $150,000 (about $1.5 million in today's dollars) to produce. This proved too expensive for most commercial sponsors, who began turning to a joint financing model of 30-second spot ads. The commercial need to appeal to as many people as possible promoted the production of shows aimed at the entire family. Programs such as Father Knows Best and Leave it to Beaver featured light topics, humor, and a guaranteed happy ending the whole family could enjoy.

Advertising began creeping up everywhere in the 1950s. No longer confined to commercials or newspapers, advertisements were subtly (or not so subtly in this case) worked into TV shows like the Quiz Show “21”. (Geritol is a dietary supplement.) Orlando Fernandez, “[Quiz show "21" host Jack Barry turns toward contestant Charles Van Doren as fellow contestant Vivienne Nearine looks on],” 1957. Library of Congress, http://www.loc.gov/pictures/item/00652124/.


Television's broad appeal, however, was about more than money and entertainment. Shows of the 1950s, such as Father Knows Best and I Love Lucy, depicted a decade that extolled the nuclear family, adhered to "traditional" gender roles, and embraced white, middle-class domesticity. Leave It to Beaver centered on the breadwinner-father and homemaker-mother guiding their children through life lessons. Cold War American culture idealized the so-called "nuclear family." There was a societal consensus that such a lifestyle was not only beneficial but also the most effective way to safeguard American prosperity against deviancy and communist threats.

The marriage of the suburban consumer culture and Cold War security concerns facilitated, and in turn was supported by, the ongoing postwar baby boom. From 1946 to 1964, American fertility experienced an unprecedented spike. A century of declining birth rates abruptly reversed. Although popular memory credits the baby boom to the return of virile soldiers from battle, the real story is more nuanced. After years of economic depression, families were now wealthy enough to support larger families and had homes large enough to accommodate them, while women married younger and American culture celebrated the ideal of a large, insular family.

Underlying this "reproductive consensus" was the new cult of professionalism that pervaded postwar American culture, including the professionalization of homemaking. Mothers and fathers alike flocked to the experts for their opinions on marriage, sexuality, and, most especially, child-rearing. Psychiatrists held an almost mythic status as people took their opinions and prescriptions, as well as their vocabulary, into their everyday lives. Books like Dr. Spock's Baby and Child Care (1946) were diligently studied by women who took their careers as housewives as just that: a career, complete with all the demands and professional trappings of job development and training. And since most women had multiple children roughly the same age as their neighbors', a cultural obsession with kids flourished throughout the decade. Women bore the brunt of this pressure, chided if they did not give enough of their time to the children (especially if it was at the expense of a career) yet cautioned that spending too much time with them would lead to "Momism," producing "sissy" boys who would be incapable of contributing to society and extremely susceptible to the communist threat.

A new youth culture exploded in American popular culture. On the one hand, the anxieties of the atomic age hit America's youth particularly hard. Keenly aware of the discontent bubbling beneath the surface of the Affluent Society, many youth embraced rebellion. The 1955 film Rebel Without a Cause demonstrated the restlessness and emotional incertitude of the postwar generation, highlighting both the affluence of their lifestyle and the lack of satisfaction they derived from it. At the same time, perhaps yearning for something beyond the "massification" of American culture but having few options beyond popular culture, American youth turned to rock 'n' roll. They listened to Little Richard, Buddy Holly, and especially Elvis Presley (whose hip movements alone were seen as culturally subversive).

While an accepted part of American culture in the twenty-first century, rock and roll was seen by many in the 1950s as devilish, a corrupting influence on the youth of America. Chuck Berry defined the rhythm and style that made rock and roll so distinctive and irresistible. Publicity photo, c. 1971. Wikimedia, http://commons.wikimedia.org/wiki/File:Chuck_Berry_1971.JPG.

The popularity of rock and roll, which emerged in the postwar years, had not yet blossomed into the countercultural musical revolution of the coming decade, but it provided a magnet for teenage restlessness and rebellion. “Television and Elvis,” the musician Bruce Springsteen would recollect, “gave us full access to a new language, a new form of communication, a new way of being, a new way of looking, a new way of thinking; about sex, about race, about identity, about life; a new way of being an American, a human being; and a new way of hearing music.” American youth had seen so little of Elvis’ energy and sensuality elsewhere in their culture. “Once Elvis came across the airwaves,” Springsteen said, “once he was heard and seen in action, you could not put the genie back in the bottle. After that moment, there was yesterday, and there was today, and there was a red hot, rockabilly forging of a new tomorrow, before your very eyes.”

While black musicians like Chuck Berry created rock and roll, it was brought into mainstream (white) American culture through performers like Elvis Presley. His good looks, sensual dancing, and sonorous voice stole the hearts of millions of American teenage girls, who were at that moment becoming a central segment of the consumer population. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/3/35/Elvis_Presley_Jailhouse_Rock.jpg.

But while the pressure to conform in the Affluent Society was intense, many Americans in the 1950s took larger steps to reject conformity and domesticity. The writers of the Beat Generation expressed their disillusionment with capitalism, consumerism, and traditional gender roles by seeking a deeper meaning in life. Beats traveled across the country, studied Eastern religions, and experimented with drugs, sex, and artistic form.

Behind the scenes, Americans were also challenging sexual mores. The gay rights movement, for instance, stretched back into the Affluent Society. While the country proclaimed homosexuality a mental disorder, gay men established the Mattachine Society in Los Angeles and gay women formed the Daughters of Bilitis in San Francisco as support groups. They held meetings, distributed literature, provided legal and counseling services, and formed chapters across the country. Much of their work, however, remained secretive, because homosexuals risked arrest and abuse if discovered.

Society’s “consensus,” on everything from the consumer economy to gender roles, did not go unchallenged. Much discontent was channeled through the machine itself: advertisers sold rebellion no less than they sold baking soda. And yet others were rejecting the old ways, choosing new lifestyles, challenging old hierarchies, and embarking upon new paths.

 

VI. Politics and Ideology in the Affluent Society

Postwar economic prosperity and the creation of new suburban spaces inevitably shaped Americans’ politics. In stark contrast to the Great Depression, the new prosperity renewed belief in the superiority of capitalism, cultural conservatism, and religion.

In the 1930s, the ravages of the international economic catastrophe knocked the legs out from under the intellectual justifications for keeping government out of the economy. And yet, despite the inhospitable intellectual and cultural climate, there were pockets of true believers who kept the gospel of the free market alive. The single most important was the National Association of Manufacturers (NAM). In the midst of the depression, NAM, under the leadership of a group known as the "Brass Hats," reinvented itself and went on the offensive, initiating advertising campaigns supporting "free enterprise" and "The American Way of Life." More importantly, NAM became a node for business leaders, such as J. Howard Pew of Sun Oil and Jasper Crane of DuPont Chemical Co., to network with like-minded individuals and take the message of free enterprise to the American people. The network of business leaders that NAM brought together in the midst of the Great Depression formed the financial, organizational, and ideological underpinnings of the free market advocacy groups that emerged and found ready adherents in America's new suburban spaces in the postwar decades.

One of the most important advocacy groups that sprang up after the war was Leonard Read's Foundation for Economic Education (FEE). Read founded FEE in 1946 on the premise that "The American Way of Life" was essentially individualistic and that the best way to protect and promote that individualism was through libertarian economics. FEE, whose advisory board and supporters came mostly from the NAM network of Pew and Crane, became a key ideological factory, supplying businesses, service clubs, churches, schools, and universities with a steady stream of libertarian literature, much of it authored by the Austrian economist Ludwig von Mises.

Shortly after FEE's formation, the Austrian economist and libertarian intellectual Friedrich Hayek founded the Mont Pelerin Society (MPS) in 1947. Unlike FEE, whose focus was more ideological in nature, the MPS brought together intellectuals from both sides of the Atlantic in the common cause of promoting and improving capitalism. Like FEE, many of the lay supporters of the MPS, such as Pew and Jasper Crane, also came from the NAM network. The MPS successfully challenged liberal, Keynesian economics on its home turf, academia, particularly after the brilliant University of Chicago economist Milton Friedman became its president. Friedman's willingness to advocate for and apply his libertarian economics in the political realm made him, and the MPS, one of the most influential free market advocates in the world. Together with the Chicago School of Economics, the MPS carved out a critical space in academia that legitimized the libertarian ideology so successfully evangelized by FEE, its descendant organizations, and libertarian popularizers such as Ayn Rand.

Libertarian politics and evangelical religion were shaping the origins of a conservative, suburban constituency. Suburban communities’ distance from government and other top-down community-building mechanisms left a social void that evangelical churches eagerly filled. More often than not the theology and ideology of these churches reinforced socially conservative views while simultaneously reinforcing congregants’ belief in economic individualism. These new communities and the suburban ethos of individualism that accompanied them became the building blocks for a new political movement. And yet, while the growing suburbs, and the conservative ideology that found a ready home there, eventually proved immensely important in American political life, their impact was not immediately felt. They did not yet have a champion.

In the post-World War II years the Republican Party faced a fork in the road. Its complete lack of electoral success since the Depression led to a battle within the party about how to revive its electoral prospects. The more conservative faction, represented by Ohio Senator Robert Taft (son of former President William Howard Taft) and backed by many party activists and financiers such as J. Howard Pew, sought to take the party further to the right, particularly in economic matters, by rolling back New Deal programs and policies. The more moderate wing of the party, led by men such as New York Governor Thomas Dewey and Nelson Rockefeller, sought instead to embrace and reform New Deal programs and policies. There were further disagreements among party members about how involved the United States should be in the world. Issues such as foreign aid, collective security, and how best to fight Communism divided the party.

Just like the internet, don't always trust what you read in newspapers. This obviously incorrect banner from the front page of the Chicago Tribune on November 3, 1948, made its own headlines as the newspaper's most embarrassing gaffe. Photograph, 1948. http://media-2.web.britannica.com/eb-media/14/65214-050-D86AAA4E.jpg.

Initially, the moderates, or “liberals,” won control of the party with the nomination of Thomas Dewey in 1948. Dewey’s shocking loss to Truman, however, emboldened conservatives, who rallied around Taft as the 1952 presidential primaries approached. With the conservative banner riding high in the party, General Dwight Eisenhower, most recently NATO supreme commander, felt obliged to join the race in order to beat back the conservatives and “prevent one of our great two Parties from adopting a course which could lead to national suicide.” In addition to his fear that Taft and the conservatives would undermine collective security arrangements such as NATO, he also berated the “neanderthals” in his party for their anti-New Deal stance. Eisenhower felt that the best way to stop Communism was to undercut its appeal by alleviating the conditions under which it was most attractive. That meant supporting New Deal programs. There was also a political calculus to Eisenhower’s position. He observed, “Should any political party attempt to abolish social security, unemployment insurance, and eliminate labor laws and farm programs, you would not hear of that party again in our political history.”

The primary contest between Taft and Eisenhower was close and controversial, with Taft supporters claiming that Eisenhower stole the nomination at the convention. Eisenhower, attempting to placate the conservatives in his party, picked California Senator and virulent anti-Communist Richard Nixon as his running mate. With the Republican nomination sewn up, the immensely popular Eisenhower swept to victory in the 1952 general election, easily besting Truman's hand-picked successor, Adlai Stevenson. Eisenhower's popularity boosted Republicans across the country, leading them to majorities in both houses of Congress.

The Republican sweep in the 1952 election proved less momentous than its supporters had hoped. Eisenhower's popularity helped elect a Congress that was more conservative than he would have liked. Within two years of his election, Eisenhower saw his legislative proposals routinely defeated by an unlikely alliance of conservative Republicans, who thought Eisenhower was going too far, and liberal Democrats, who thought he was not going far enough. For example, in 1954 Eisenhower proposed a national health care plan that would have provided federal support for increasing health care coverage across the nation without getting the government directly involved in regulating the health care industry. The proposal was defeated in the House by a 238-134 vote, with a swing bloc of 75 conservative Republicans joining liberal Democrats to vote against the plan. Eisenhower's proposals in education and agriculture often suffered similar defeats. By the end of his presidency, Ike's domestic legislative achievements were largely limited to expanding Social Security, making Health, Education, and Welfare (HEW) a cabinet-level department, passing the National Defense Education Act, and bolstering federal support for education, particularly in math and science.

Like any president's, Eisenhower's record was about more than his legislative scorecard. Ike's "Middle-of-the-Road" philosophy guided his foreign policy as much as his domestic policy. Indeed, just as he used federal dollars to give state and local governments as well as individuals the power to act at home, his foreign policy sought to keep the United States from intervening abroad by bolstering its allies. Thus Ike funneled money to the French fighting Ho Chi Minh's communist forces in Vietnam, walked a fine line between aiding Chiang Kai-shek's Taiwan and overtly provoking Mao Tse-tung's China, and materially backed local actors who destabilized "unfriendly" governments in Iran and Guatemala. The centerpiece of Ike's foreign policy was "massive retaliation," the threat of nuclear force in the face of communist expansion, which promised more "bang" for the government "buck." While Ike's "middle way" won broad popular support, his own party was slowly moving away from his positions. By 1964 the party had moved far enough to the right to nominate Arizona Senator Barry Goldwater, the most conservative candidate in a generation. The political moderation of the Affluent Society proved little more than a way station on the road to liberal reform and a future conservative ascendancy.

 

VII. Conclusion

The postwar American “consensus” held great promise. Despite the looming threat of nuclear war, millions experienced an unprecedented prosperity and an increasingly proud American identity. Prosperity seemed to promise ever higher standards of living. But things fell apart, and the center could not hold. Wracked by contradiction, dissent, discrimination, and inequality, the Affluent Society stood on the precipice of revolution.

 

This chapter was edited by James McKay, with content contributions by Edwin C. Breeden, Maggie Flamingo, Destin Jenkins, Kyle Livie, Jennifer Mandel, James McKay, Laura Redford, Ronny Regev, and Tanya Roth.


25. The Cold War

French government's test of the Licorne thermonuclear weapon, Mururoa atoll, French Polynesia, 1970, via Flickr user Pierre J.

*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

In a public address on February 9, 1946, Soviet Premier Joseph Stalin blamed the outbreak of the Second World War on "economic and political forces" driven by "monopoly capitalism." Many dismissed the speech as rhetoric aimed at a domestic Soviet audience, but officials in the United States and Britain, long suspicious of Stalin's postwar intentions, viewed it with alarm. On February 22, George Kennan, the Chargé d'Affaires of the US Embassy in Moscow, cabled the State Department his assessment that "world communism" was "a malignant parasite" that "feeds only on diseased tissue," and that "the steady advance of uneasy Russian nationalism" in its "new guise of international Marxism" was "more dangerous and insidious than ever before." The telegram made waves among American officials. On March 5, former British Prime Minister Winston Churchill, joined by President Harry Truman, gave a speech in Truman's home state of Missouri declaring that Europe had been cut in two, divided into spheres separated by an "iron curtain" that had "descended across the Continent."

The Cold War, a global geopolitical and ideological struggle between (western) capitalist and (eastern) communist countries, fueled a generations-long, multifaceted rivalry between the remaining superpowers of the postwar world: the United States and the Union of Soviet Socialist Republics (USSR). Tensions ran highest, perhaps, during the "first Cold War," which lasted from the mid-1940s through the mid-1960s, after which followed a period of relaxed tensions and increased communication and cooperation, known by the French term détente, until the "second Cold War" set in, lasting from roughly 1979 until the fall of the Berlin Wall in 1989 and the dissolution of the Soviet Union in 1991. "Cold" because it never became a "hot" shooting war directly between the superpowers, the Cold War nonetheless reshaped the world, altered American life, and affected generations of Americans.

 

II. Political, Economic, and Military Dimensions

The Cold War grew out of a failure to achieve a durable settlement among leaders from the "Big Three" Allies—the US, Britain, and the Soviet Union—as they met at Yalta in Russian Crimea and at Potsdam in occupied Germany to shape the postwar order. The Germans had pillaged their way across Eastern Europe, and the Soviets had pillaged their way back across it at the cost of millions of lives; Stalin considered the region to lie within the Soviet "sphere of influence." With Germany's defeat imminent, the Allies set terms for unconditional surrender while deliberating over reparations, tribunals, and the nature of an occupation regime that would initially be divided into American, British, French, and Soviet zones. Even as plans were made to end the fighting in the Pacific, and it was determined that the Soviets would declare war on Japan within ninety days of Germany's surrender, suspicion and mistrust were already mounting. The political landscape was altered drastically by Franklin Roosevelt's sudden death in April 1945, just days before the inaugural meeting of the United Nations (UN). Roosevelt had remained skeptical of Stalin but held out hope that the Soviets could be brought into the "Free World." Truman, like Churchill, had no illusions about Stalin's postwar cooperation and was committed to a hardline anti-Soviet approach.

At the Potsdam Conference, held on the outskirts of Berlin from mid-July to early August, the allies debated the fate of Soviet-occupied Poland. Toward the end of the meeting, the American delegation received word that Manhattan Project scientists had successfully tested an atomic bomb. On July 24, when Truman told Stalin about this “new weapon of unusual destructive force,” the Soviet leader simply nodded his acknowledgement and said that he hoped the Americans would make “good use” of it.

The Cold War had long roots. An alliance of convenience during World War II to bring down Hitler's Germany was not enough to erase decades of mutual suspicion. The Bolshevik Revolution had brought communists to power in Russia during World War I. Bolshevik leader Vladimir Lenin urged an immediate worldwide peace that would pave the way for world socialism just as Woodrow Wilson brought the United States into the war with promises of global democracy and free trade. The United States had intervened militarily against the Red Army during the Russian civil war, and when the Soviet Union was founded in 1922 the United States refused to recognize it. The two powers were brought together only by their common enemy, and, without that common enemy, there was little hope for cooperation.

On the eve of American involvement in World War II, on August 14, 1941, Roosevelt and Churchill had issued a joint declaration of goals for postwar peace, known as the Atlantic Charter. An adaptation of Wilson's Fourteen Points, the Atlantic Charter laid the groundwork for the creation of the United Nations. The Soviet Union was among the fifty charter UN member-states and was given one of five permanent seats—alongside the US, Britain, France, and China—on the Security Council. The Atlantic Charter, though, also set in motion the planning for a reorganized global economy. The July 1944 United Nations Monetary and Financial Conference, more popularly known as the Bretton Woods Conference, created the International Monetary Fund (IMF) and the forerunner of the World Bank, the International Bank for Reconstruction and Development (IBRD). The "Bretton Woods system" was bolstered in 1947 with the addition of the General Agreement on Tariffs and Trade (GATT), forerunner of the World Trade Organization (WTO). The Soviets rejected it all.

Many Soviet and American officials knew that the Soviet-American relationship would dissolve into renewed hostility at the close of the war, and events proved them right. In a 1947 article for Foreign Affairs—written under the pseudonym "Mr. X"—George Kennan warned that Americans should "continue to regard the Soviet Union as a rival, not a partner," since Stalin harbored "no real faith in the possibility of a permanent happy coexistence of the Socialist and capitalist worlds." He urged US leaders to pursue "a policy of firm containment, designed to confront the Russians" wherever they threatened the interests of peace and stability.

Truman, on March 12, 1947, announced $400 million in aid to Greece and Turkey, where “terrorist activities…led by Communists” jeopardized “democratic” governance. With Britain “reducing or liquidating its commitments in several parts of the world, including Greece,” it fell on the US, Truman said, “to support free peoples…resisting attempted subjugation by…outside pressures.” The so-called “Truman Doctrine” became a cornerstone of the American policy of “containment.”

In the harsh winter of 1946-47, famine loomed in much of continental Europe. Blizzards and freezing cold halted coal production. Factories closed. Unemployment spiked. Amid these conditions, the Communist parties of France and Italy gained nearly a third of the seats in their respective Parliaments. American officials worried that Europe's impoverished masses were increasingly vulnerable to Soviet propaganda. The situation remained dire through the spring, when Secretary of State General George Marshall gave an address at Harvard University on June 5, 1947, suggesting that "the United States should do whatever it is able to do to assist in the return of normal economic health to the world, without which there can be no political stability and no assured peace." Although Marshall had stipulated to potential critics that his proposal was "not directed against any country, but against hunger, poverty…and chaos," Stalin clearly understood the plan as an assault against communism in Europe; he saw it as a "Trojan horse" designed to lure Germany and other countries into the capitalist web.

The European Recovery Program (ERP), popularly known as the Marshall Plan, pumped enormous sums into Western Europe. From 1948 to 1952 the US invested $13 billion toward reconstruction while simultaneously loosening trade barriers. To avoid a repeat of the chaos that had followed World War I, the Marshall Plan was designed to rebuild Western Europe, open markets, and win European support for capitalist democracies. The Soviets countered with the Molotov Plan, a symbolic pledge of aid to Eastern Europe. Polish leader Józef Cyrankiewicz was rewarded with a five-year, $450 million trade agreement from Russia for boycotting the Marshall Plan. Czechoslovakia received $200 million of American assistance but was summoned to Moscow, where Stalin threatened Czech foreign minister Jan Masaryk. Masaryk later recounted that he "went to Moscow as the foreign minister of an independent sovereign state," but "returned as a lackey of the Soviet Government." Stalin exercised even tighter control over Soviet "satellite" countries in Central and Eastern Europe.

The situation in Germany meanwhile deteriorated. Berlin had been divided into communist and capitalist zones. In June 1948, when US, British, and French officials introduced a new currency in their zones, the Soviet Union initiated a ground blockade, cutting off rail and road access to West Berlin (landlocked within the Soviet occupation zone) in an effort to gain control over the entire city. The United States organized and coordinated a massive airlift that flew essential supplies into the beleaguered city for eleven months, until the Soviets lifted the blockade on May 12, 1949. Germany was officially broken in half. On May 23, the western half of the country was formally renamed the Federal Republic of Germany (FRG), and the eastern Soviet zone became the German Democratic Republic (GDR) later that fall. Berlin, which lay squarely within the GDR, was itself divided into two sections (later famously separated by the Berlin Wall from August 1961 until November 1989).

The Berlin Blockade and resultant Allied airlift was one of the first major crises of the Cold War. Photograph, U.S. Navy Douglas R4D and U.S. Air Force C-47 aircraft unload at Tempelhof Airport during the Berlin Airlift, c. 1948-1949. Wikimedia, http://commons.wikimedia.org/wiki/File:C-47s_at_Tempelhof_Airport_Berlin_1948.jpg.

In the summer of 1949, American officials launched the North Atlantic Treaty Organization (NATO), a mutual defense pact in which the US and Canada were joined by Britain, France, Belgium, Luxembourg, the Netherlands, Italy, Portugal, Norway, Denmark, and Iceland. The Soviet Union would formalize its own collective defense agreement in 1955, the Warsaw Pact, which included Albania, Romania, Bulgaria, Hungary, Czechoslovakia, Poland, and East Germany.

Liberal journalist Walter Lippmann was largely responsible for popularizing the term “the Cold War” in his book, The Cold War: A Study in U.S. Foreign Policy, published in 1947. Lippmann envisioned a prolonged stalemate between the US and the USSR, a war of words and ideas in which direct shots would not necessarily be fired between the two. Lippmann agreed that the Soviet Union would only be “prevented from expanding” if it were “confronted with…American power,” but he felt “that the strategical conception and plan” recommended by Mr. X (George Kennan) was “fundamentally unsound,” as it would require having “the money and the military power always available in sufficient amounts to apply ‘counter-force’ at constantly shifting points all over the world.” Lippmann cautioned against making far-flung, open-ended commitments, favoring instead a more limited engagement that focused on halting the influence of communism in the ‘heart’ of Europe; he believed that if the Soviet system were successfully restrained on the Continent, it could otherwise be left alone to collapse under the weight of its own imperfections.

A new chapter in the Cold War began on October 1, 1949, when the Chinese Communist Party (CCP) led by Mao Tse-tung declared victory against “Kuomintang” Nationalists led by the Western-backed Chiang Kai-shek. The Kuomintang retreated to the island of Taiwan and the CCP took over the mainland under the red flag of the People’s Republic of China (PRC). Coming so soon after the Soviet Union’s successful test of an atomic bomb, on August 29, the “loss of China,” the world’s most populous country, contributed to a sense of panic among American foreign policymakers, whose attention began to shift from Europe to Asia. After Dean Acheson became Secretary of State in 1949, Kennan was replaced in the State Department by former investment banker Paul Nitze, whose first task was to help compose, as Acheson later described in his memoir, a document designed to “bludgeon the mass mind of ‘top government’” into approving a “substantial increase” in military expenditures.

The communist world system rested, in part, on the relationship between the two largest communist nations — the Soviet Union and the People's Republic of China. This 1950 Chinese stamp depicts Joseph Stalin shaking hands with Mao Zedong. Wikimedia, http://commons.wikimedia.org/wiki/File:Chinese_stamp_in_1950.jpg.

"United States Objectives and Programs for National Security," a National Security Council report known as "NSC-68," achieved its goal. Issued in April 1950, the nearly sixty-page classified memo warned of "increasingly terrifying weapons of mass destruction," which served to remind "every individual" of "the ever-present possibility of annihilation." It said that leaders of the USSR and its "international communist movement" sought only "to retain and solidify their absolute power." As the central "bulwark of opposition to Soviet expansion," America had become "the principal enemy" that "must be subverted or destroyed by one means or another." NSC-68 urged a "rapid build-up of political, economic, and military strength" in order to "roll back the Kremlin's drive for world domination." Such a massive commitment of resources, amounting to more than a threefold increase in the annual defense budget, was necessary because the USSR, "unlike previous aspirants to hegemony," was "animated by a new fanatic faith," seeking "to impose its absolute authority over the rest of the world." Both Kennan and Lippmann were among a minority in the "foreign policy establishment" who argued, to no avail, that such a "militarization of containment" was tragically wrongheaded.

On June 25, 1950, as US officials were considering the merits of NSC 68’s proposals, including “the intensification of…operations by covert means in the fields of economic…political and psychological warfare” designed to foment “unrest and revolt in…[Soviet] satellite countries,” fighting erupted in Korea between communists in the north and American-backed anti-communists in the south.

After Japan surrendered in September 1945, a US-Soviet joint occupation had paved the way for the division of Korea. In November 1947, the UN passed a resolution calling for the creation of a united Korean government, but the Soviet Union refused to cooperate. Only the south held elections. The Republic of Korea (ROK), South Korea, was created three months after the election. A month later, communists in the north established the Democratic People's Republic of Korea (DPRK). Both claimed to stand for a unified Korean peninsula. The UN recognized the ROK, but incessant armed conflict broke out between North and South.

In the spring of 1950, Stalin hesitantly endorsed North Korean leader Kim Il Sung's plan to 'liberate' the South by force, a plan heavily influenced by Mao's recent victory in China. While he did not desire a direct military confrontation with the US, Stalin thought, correctly, that he could encourage his Chinese comrades to support North Korea if the war turned against the DPRK. The North Koreans launched a successful surprise attack, and Seoul, the capital of South Korea, fell to the communists on June 28. The UN passed resolutions demanding that North Korea cease hostilities and withdraw its armed forces to the 38th parallel and calling on member states to provide the ROK military assistance to repulse the northern attack.

That July, UN forces mobilized under American General Douglas MacArthur. Troops landed at Inchon, a port city about thirty miles from Seoul, and retook the capital on September 28. They then moved on North Korea. On October 1, ROK/UN forces crossed the 38th parallel, and on October 26 they reached the Yalu River, the traditional Korea-China border. There they were met by 300,000 Chinese troops, who broke the advance and rolled up the offensive. On November 30, ROK/UN forces began a fevered retreat. They fell back across the 38th parallel and abandoned Seoul on January 4, 1951. The United Nations forces regrouped, but the war settled into a stalemate. General MacArthur, growing impatient and wanting to eliminate the communist threat, requested authorization to use nuclear weapons against North Korea and China. Denied, MacArthur publicly denounced Truman. Truman, unwilling to risk World War III and refusing to tolerate MacArthur's public insubordination, dismissed the general in April. On June 23, 1951, the Soviet ambassador to the UN suggested a cease-fire, which the US immediately accepted. Peace talks continued for two years.

With the policy of "containing" communism at home and abroad, the U.S. pressured the United Nations to support the South Koreans, ultimately supplying American troops to fight in the civil war. Though rather forgotten in the annals of American history, the Korean War caused over 30,000 American deaths and 100,000 wounded, leaving an indelible mark on those who served. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/1/1b/KoreanWarFallenSoldier1.jpg.

Dwight Eisenhower, who had won the presidency in the 1952 election, took office in January 1953, and Stalin died that March. The DPRK warmed to peace, and an armistice agreement was signed on July 27, 1953. Upwards of 1.5 million people had died during the conflict.

Coming so soon after World War II and ending without clear victory, Korea became for many Americans a 'forgotten war.' Decades later, though, the nation's other major intervention in Asia would be anything but forgotten. The Vietnam War had deep roots in the Cold War world. Vietnam had been colonized by France and seized by Japan during World War II. The nationalist leader Ho Chi Minh had been backed by the US during his anti-Japanese insurgency, and, following Japan's surrender in 1945, "Viet Minh" nationalists, quoting Thomas Jefferson, declared an independent Democratic Republic of Vietnam (DRV). Yet France moved to reassert authority over its former colony in Indochina, and the United States sacrificed Vietnamese self-determination for France's colonial imperatives. Ho Chi Minh turned to the Soviet Union for assistance in waging a protracted war against the French colonizers.

After French troops were defeated at the ‘Battle of Dien Bien Phu’ in May 1954, US officials helped broker a temporary settlement that partitioned Vietnam in two, with a Soviet/Chinese-backed state in the north and an American-backed state in the south. To stifle communist expansion southward, the United States would send arms, offer military advisors, prop up corrupt politicians, stop elections, and, eventually, send over 500,000 troops, of whom nearly 60,000 would be lost before the communists finally reunified the country.

 

III. The Arms Buildup, the Space Race, and Technological Advancement

Harnessing years of discoveries in nuclear physics, the work of hundreds of world-class scientists, and $2 billion in research funds, the Manhattan Project created atomic weapons during World War II. The first nuclear explosive device, "Trinity," was detonated in the New Mexico desert on July 16, 1945, with destructive power equivalent to roughly 20,000 tons of TNT. Choking back tears, physicist J. Robert Oppenheimer would remember the experience by quoting Hindu scripture: "I have become Death, the destroyer of worlds." The director of the Trinity test was plainer: "Now, we're all sons of bitches."

The world soon saw what nuclear weapons could do. In August, two bombs leveled two cities and killed perhaps 180,000 people. The world was never the same.

The Soviets accelerated their research in the wake of Hiroshima and Nagasaki, aided in no small part by spies such as Klaus Fuchs, who had stolen nuclear secrets from the Manhattan Project. Soviet scientists successfully tested an atomic bomb on August 29, 1949, years before American officials had estimated they would. This unexpectedly quick Soviet success caught the United States off guard, caused tensions across the Western world, and propelled a nuclear "arms race" between the US and the USSR.

The United States detonated the first thermonuclear weapon, or hydrogen bomb (using fusion explosives of theoretically limitless power), on November 1, 1952. The blast measured over 10 megatons and generated an inferno five miles wide with a mushroom cloud 25 miles high and 100 miles across. The irradiated debris—fallout—from the blast circled the Earth, occasioning international alarm about the effects of nuclear testing on human health and the environment. It only hastened the arms race, with each side developing increasingly advanced warheads and delivery systems. The USSR successfully tested a hydrogen bomb in 1953, and soon thereafter Eisenhower announced a policy of "massive retaliation": the US would henceforth respond to threats or acts of aggression with perhaps its entire nuclear might. Both sides, then, would theoretically be deterred from starting a war through the logic of "mutually assured destruction" (MAD). Oppenheimer likened the state of "nuclear deterrence" between the US and the USSR to "two scorpions in a bottle, each capable of killing the other," but only by risking their own lives.

In response to the Soviet Union's test of a pseudo-hydrogen bomb in 1953, the United States began Castle Bravo, the first U.S. test of a dry-fuel hydrogen bomb. Detonated on March 1, 1954, it was the most powerful nuclear device ever tested by the U.S. But the effects were more gruesome than expected, causing nuclear fallout and radiation poisoning in nearby Pacific islands. Photograph, March 1, 1954. Wikimedia, http://commons.wikimedia.org/wiki/File:Castle_Bravo_Blast.jpg.

Fears of nuclear war produced a veritable atomic culture. Films such as Godzilla, On the Beach, Fail-Safe, and Dr. Strangelove or: How I Learned to Stop Worrying and Love the Bomb plumbed the depths of American anxieties with plots featuring radioactive monsters, nuclear accidents, and doomsday scenarios. Anti-nuclear protests in the United States and abroad warned against the perils of nuclear testing and highlighted the likelihood that a thermonuclear war would unleash a global environmental catastrophe. Yet at the same time, peaceful nuclear technologies, such as fission- and fusion-based energy, seemed to herald a utopia of power that would be clean, safe, and "too cheap to meter." In 1953, Eisenhower proclaimed at the UN that the US would share the knowledge and means for other countries to use atomic power. Henceforth, "the miraculous inventiveness of man shall not be dedicated to his death, but consecrated to his life." The "Atoms for Peace" speech brought about the establishment of the International Atomic Energy Agency (IAEA), along with worldwide investment in this new economic sector.

As Germany fell at the close of World War II, the United States and the Soviet Union each sought to acquire elements of the Nazis' V-2 superweapon program. A devastating rocket that had terrorized England, the V-2 was capable of delivering its explosive payload up to a distance of nearly 600 miles, and both nations sought to capture the scientists, designs, and manufacturing equipment that made it work. A former top German rocket scientist, Wernher von Braun, became the leader of the American space program; the Soviet Union's program was secretly managed by former prisoner Sergei Korolev. After the end of the war, American and Soviet rocket engineering teams worked to adapt German technology in order to create an intercontinental ballistic missile (ICBM). The Soviets achieved success first. They even used the same launch vehicle, on October 4, 1957, to send Sputnik 1, the world's first human-made satellite, into orbit. It was a decisive Soviet propaganda victory.

In response, the US government rushed to perfect its own ICBM technology and launch its own satellites and astronauts into space. In 1958, the National Aeronautics and Space Administration (NASA) was created as a successor to the National Advisory Committee for Aeronautics (NACA). Initial American attempts to launch a satellite into orbit using the Vanguard rocket suffered spectacular failures, heightening fears of Soviet domination in space. While the American space program floundered, on September 13, 1959, the Soviet Union's Luna 2 probe became the first human-made object to touch the moon. The "race for survival," as the New York Times called it, reached a new level. The Soviet Union successfully launched a pair of dogs (Belka and Strelka) into orbit and returned them to Earth while the American Mercury program languished behind schedule. Despite countless failures and one massive accident that killed nearly one hundred Soviet military and rocket engineers, Russian cosmonaut Yuri Gagarin was launched into orbit on April 12, 1961. Astronaut Alan Shepard accomplished a suborbital flight in the Freedom 7 capsule on May 5. John Kennedy would use America's losses in the "space race" to bolster funding for a moon landing.

While outer space captivated the world's imagination, the Cold War still captured its anxieties. The ever-escalating arms race continued to foster panic. In the early 1950s, the Federal Civil Defense Administration (FCDA) began preparing citizens for the worst. Schoolchildren were instructed, via a film featuring Bert the Turtle, to "duck and cover" beneath their desks in the event of a thermonuclear war.

Although it took a back seat to space travel and nuclear weapons, the advent of modern computing was yet another major Cold War scientific innovation, the effects of which were only just beginning to be understood. In 1958, following the humiliation of the Sputnik launches, Eisenhower authorized the creation of the Advanced Research Projects Agency (ARPA), housed within the Department of Defense (and later renamed DARPA). As a secretive military research and development operation, ARPA was tasked with funding and otherwise overseeing the production of sensitive new technologies. Soon, in cooperation with university-based computer engineers, ARPA would develop the world's first system of packet-switching networks, and computers would begin connecting to one another.

 

IV. The Cold War Red Scare, McCarthyism, and Liberal Anti-Communism

Joseph McCarthy, Republican Senator from Wisconsin, fueled fears during the early 1950s that communism was rampant and growing. This intensified Cold War tensions felt by every segment of society, from government officials to ordinary American citizens. Photograph of Senator Joseph R. McCarthy, March 14, 1950. National Archives and Records Administration, http://research.archives.gov/description/6802721.

Joseph McCarthy burst onto the national scene during a speech in Wheeling, West Virginia, on February 9, 1950. Waving a sheet of paper in the air, he proclaimed: "I have here in my hand a list of 205…names that were made known to the Secretary of State as being members of the Communist party and who nevertheless are still working and shaping [US] policy." The Wisconsin Republican had no actual list, and when pressed, the number changed to fifty-seven and then, later, eighty-one. Finally he promised to disclose the name of just one communist, the nation's "top Soviet agent." The shifting numbers brought ridicule, but it didn't matter, not really: McCarthy's claims won him fame and fueled the ongoing "red scare."

Within a ten-month span beginning in 1949, the USSR developed a nuclear bomb, China fell to Communism, and over 300,000 American soldiers were deployed to fight a land war in Korea. Newspapers, meanwhile, were filled with headlines alleging Soviet espionage.

The environment of fear and panic instigated by McCarthyism led to the arrest of many innocent people. Still, some Americans accused of supplying top-secret information to the Soviets were in fact spies. The Rosenbergs were convicted of espionage and executed in 1953 for giving information about the atomic bomb to the Soviets. The case has stood the test of time: as recently as 2008, a co-conspirator of the Rosenbergs admitted to spying for the Soviet Union. Roger Higgins, "[Julius and Ethel Rosenberg, separated by heavy wire screen as they leave U.S. Court House after being found guilty by jury]," 1951. Library of Congress, http://www.loc.gov/pictures/item/97503499/.

During the war, Julius Rosenberg had worked briefly at the US Army Signal Corps Laboratory in New Jersey, where he had access to classified information. He and his wife Ethel, who had both been members of the American Communist Party (CPUSA) in the 1930s, were accused of passing secret bomb-related documents to Soviet officials. The Rosenbergs were indicted in August 1950 on charges of giving "nuclear secrets" to the Russians. After a trial in March 1951, they were found guilty and executed on June 19, 1953.

The Rosenbergs offered anti-communists such as McCarthy the evidence they needed to allege a vast Soviet conspiracy to infiltrate and subvert the US government, allegations that justified the smearing of all left-liberals, even those resolutely anti-communist. In the run-up to the 1950 and 1952 elections, progressives saw this not as a legitimate effort to expose actual subversive activity, but rather as a campaign to tarnish the reputations of "New Dealers" in the Democratic Party.

Alger Hiss was another prize for conservatives, who identified him as the highest-ranking government official linked to Soviet espionage. While working for the State Department's Office of Far Eastern Affairs, Hiss had been a prominent member of the US delegation to Yalta before serving as secretary-general of the UN Charter Conference in San Francisco from April to June 1945. He left the State Department in 1946. Public accusations, pressed by a young congressman named Richard Nixon, finally won results. On August 3, 1948, Whittaker Chambers gave testimony to the House Un-American Activities Committee (HUAC) claiming that he and Hiss had worked together as part of the secret "communist underground" in Washington, DC, during the 1930s. Hiss, who always maintained his innocence, stood trial twice. Following a hung-jury decision in July 1949, he was finally convicted on two counts of perjury, the statute of limitations for espionage having expired.

Although later evidence certainly suggested their guilt, the prominent convictions of a few suspected spies fueled a frenzy among those who saw communists everywhere. Not long after his February 1950 speech in Wheeling, Joe McCarthy's sensational charges became a source of growing controversy. Forced to respond, President Truman arranged a partisan congressional investigation designed to discredit McCarthy. The Tydings Committee held hearings from early March through July 1950, then issued a final report admonishing McCarthy for perpetrating a "fraud and a hoax" on the American public.

American progressives saw McCarthy's crusade as nothing less than a political witch hunt. In June 1950, The Nation magazine editor Freda Kirchwey characterized "McCarthyism" as "the means by which a handful of men, disguised as hunters of subversion, cynically subvert the instruments of justice…in order to help their own political fortunes." Truman's liberal supporters and leftists like Kirchwey hoped that McCarthy and the new "ism" that bore his name would blow over quickly. Yet "McCarthyism" was ultimately just a symptom of the widespread anti-communist hysteria that engulfed American society during the first Cold War.

Faced with a growing awareness of Soviet espionage, and a tough election on the horizon, in March 1947 Truman gave in to pressure and issued Executive Order 9835, establishing loyalty reviews for federal employees.
In the case of Foreign Service officers, the Federal Bureau of Investigation (FBI) was empowered to conduct closer examinations of all potential "security risks"; congressional committees, namely HUAC and the Senate Permanent Subcommittee on Investigations (SPSI), were authorized to gather facts and hold hearings. Following Truman's "loyalty order," anti-subversion committees emerged in over a dozen state legislatures, while review procedures proliferated in public schools and universities across the country. At the University of California, for example, thirty-one professors were dismissed in 1950 after refusing to sign a loyalty oath. The Internal Security (McCarran) Act, passed in September 1950, required all "communist organizations" to register with the government and created a Senate investigative subcommittee equivalent to HUAC. The McCarran Act gave the government greater powers to investigate sedition and made it possible to prevent suspected individuals from gaining or keeping their citizenship. Between 1949 and 1954, HUAC, SPSI, and the new McCarran Committee conducted over one hundred distinct investigations of subversive activities.

There had been an American communist presence. The Communist Party of the USA (CPUSA) formed in the aftermath of the 1917 Russian Revolution, when the Bolsheviks created a Communist International (the Comintern) and invited socialists from around the world to join as they raised the red banner of revolution atop the palace in Petrograd (formerly St. Petersburg). During its first two years of existence, the CPUSA functioned in secret, hidden from a surge of anti-radical and anti-immigrant hysteria, investigations, deportations, and raids at the end of World War I. The CPUSA began its public life in 1921, after the panic subsided. Communism remained on the margins of American life until the 1930s, when leftists and liberals began to see the Soviet Union as a symbol of hope amid the Great Depression.

During the 1930s, many communists joined the "Popular Front," an effort to adapt communism to the United States and make it mainstream. During the Popular Front era, communists were integrated into mainstream political institutions through alliances with progressives in the Democratic Party. The CPUSA enjoyed most of its influence and popularity among workers in unions linked to the newly formed Congress of Industrial Organizations (CIO). Communists also became strong opponents of southern "Jim Crow" segregation and developed a presence in both the NAACP and the American Civil Liberties Union (ACLU). The CPUSA, moreover, established "front" groups such as the League of American Writers, in which intellectuals participated without direct knowledge of its ties to the Comintern. But even at the height of the global economic crisis, communism never attracted many Americans. Even at the peak of its membership, in 1944, the CPUSA had just 80,000 national "card-carrying" members. From the mid-1930s through the mid-1940s, "the Party" exercised most of its power indirectly, through coalitions with liberals and reformers. In the late 1930s, however, particularly when news broke of Hitler and Stalin's 1939 non-aggression pact, many fled the Party, a bloc of left-liberal anti-communists purged remaining communists from their ranks, and the Popular Front collapsed.

Lacking the legal grounds to abolish the CPUSA, officials instead sought to expose and contain CPUSA influence. Following a series of predecessor committees, the House Un-American Activities Committee (HUAC) was established in 1938, then reorganized after the war and given the explicit task of investigating communism. By the time the Communist Control Act was passed in August 1954, effectively criminalizing Party membership, the CPUSA had long ceased to have meaningful influence.

Anti-communists were driven to eliminate remaining CPUSA influence from progressive institutions, including the NAACP and the CIO. The Taft-Hartley Act (1947) gave union officials the incentive to purge communists from the labor movement. A kind of "Cold War" liberalism took hold. In January 1947, anti-communist liberals formed Americans for Democratic Action (ADA), whose founding members included labor leader Walter Reuther and NAACP chairman Walter White, as well as historian Arthur Schlesinger Jr., theologian Reinhold Niebuhr, and former First Lady Eleanor Roosevelt. Working to help Truman defeat former vice president Henry Wallace's Popular Front-backed campaign in 1948, the ADA combined social and economic reforms with staunch anti-communism.

The domestic Cold War was bipartisan, fueled by a consensus drawn from a left-liberal and conservative anti-communist alliance that included politicians and policymakers, journalists and scientists, business and civic/religious leaders, and educators and entertainers.

Led by its imperious director, J. Edgar Hoover, the FBI took an active role in the domestic battle against communism. Hoover's FBI helped incite panic by assisting in the creation of blatantly propagandistic films and television shows, including The Red Menace (1949), My Son John (1951), and I Led Three Lives (1953-1956). Such alarmist depictions of espionage and treason in a "free world" imperiled by communism heightened the culture of fear that pervaded the 1950s. In the fall of 1947, HUAC entered the fray with highly publicized hearings on Hollywood. Film mogul Walt Disney and actor Ronald Reagan, among others, testified to aid investigators' attempts to expose communist influence in the entertainment industry. A group of writers, directors, and producers who refused to answer questions were held in contempt of Congress. The case of this "Hollywood Ten" created the precedent for a "blacklist" in which hundreds of film artists were barred from industry work for the next decade.

HUAC made repeated visits to Hollywood during the 1950s, and their interrogation of celebrities often began with the same intimidating refrain: “Are you now, or have you ever been, a member of the Communist Party?” Many witnesses cooperated, and “named names,” naming anyone they knew who had ever been associated with communist-related groups or organizations. In 1956, black entertainer and activist Paul Robeson chided his HUAC inquisitors, claiming that they had put him on trial not for his politics, but because he had spent his life “fighting for the rights” of his people. “You are the un-Americans,” he told them, “and you ought to be ashamed of yourselves.” As Robeson and other victims of McCarthyism learned first-hand, this “second red scare,” in the glow of nuclear annihilation and global “totalitarianism,” fueled an intolerant and skeptical political world, what Cold War liberal Arthur Schlesinger, in his The Vital Center (1949), called an “age of anxiety.”

Many accused of Communist sentiments vehemently denied such allegations, including the one of the most well-known Americans at the time, African American actor and signer Paul Robeson. Unwilling to sign an affidavit confirming he was Communist, his U.S. passport was revoked. During the Cold War, he was condemned by the American press and neither his music nor films could be purchased in the U.S. Photograph. http://i.ytimg.com/vi/zDb9nM_iiXw/maxresdefault.jpg.

Anti-communist ideology valorized overt patriotism, religious conviction, and faith in capitalism. Those who shunned such “American values” were open to attack. If communism was a plague spreading across Europe and Asia, anti-communist hyperbole infected cities, towns, and suburbs throughout the country. The playwright Arthur Miller, whose popular 1953 play The Crucible compared the red scare to the Salem Witch Trials, wrote, “In America any man who is not reactionary in his views is open to the charge of alliance with the Red hell. Political opposition, thereby, is given an inhumane overlay which then justifies the abrogation of all normally applied customs of civilized intercourse. A political policy is equated with moral right, and opposition to it with diabolical malevolence. Once such an equation is effectively made, society becomes a congerie of plots and counterplots, and the main role of government changes from that of the arbiter to that of the scourge of God.”

Rallying against communism, American society urged conformity. “Deviant” behavior became dangerous. Middle-class women who had entered the workforce en masse as part of a collective effort in World War II were told to return to homemaking responsibilities. Black Americans who had fought and died abroad for American democracy were told to return home and acquiesce to the American racial order. Homosexuality, already stigmatized, became more dangerous still. Personal secrets were seen as a liability that exposed one to blackmail. The same paranoid mindset that fueled the second red scare also ignited the Cold War “lavender scare.”

American religion, meanwhile, was fixated on what McCarthy, in his 1950 Wheeling speech, called an “all-out battle between communistic atheism and Christianity.” Cold warriors in the US routinely referred to a fundamental incompatibility between “godless communism” and God-fearing Americanism. Religious conservatives championed the traditional, God-fearing nuclear family as a bulwark against the spread of atheistic totalitarianism. As Baptist minister Billy Graham sermonized in 1950, communism aimed to “destroy the American home and cause … moral deterioration,” leaving the country exposed to communist infiltration.

In an atmosphere in which ideas of national belonging and citizenship were so closely linked to religious commitment, Americans during the early Cold War years attended church, professed a belief in a supreme being, and stressed the importance of religion in their lives at higher rates than at any other time in American history. Americans sought to differentiate themselves from godless communists through public displays of religiosity. Politicians infused government with religious symbols. The Pledge of Allegiance was altered to include the words “one nation, under God” in 1954. “In God We Trust” was adopted as the official national motto in 1956. In popular culture, one of the most popular films of the decade, The Ten Commandments (1956), retold the biblical Exodus story as a Cold War parable, echoing (incidentally) NSC-68’s characterization of the Soviet Union as a “slave state.” Monuments of the Ten Commandments went up at courthouses and city halls across the country.

While the link between American nationalism and religion grew much closer during the Cold War, many Americans came to believe that adherence to almost any religion was better than atheism. Gone was the overt anti-Catholic and anti-Semitic language of Protestants in the past. Now, leaders spoke of a common “Judeo-Christian” heritage. In December 1952, a month before his inauguration, Dwight Eisenhower said that “our form of government makes no sense unless it is founded in a deeply-felt religious faith, and I don’t care what it is.”

Joseph McCarthy, an Irish Catholic, made common cause with prominent religious anti-communists, including southern evangelist Billy James Hargis of Christian Crusade, a popular radio and television ministry that peaked in the 1950s and 1960s. Cold War religion in America also crossed the political divide. During the 1952 campaign, Eisenhower spoke of US foreign policy as “a war of light against darkness, freedom against slavery, Godliness against atheism.” His Democratic opponent, Illinois Governor Adlai Stevenson, said that America was engaged in a battle with the “Anti-Christ.” While Billy Graham became a spiritual adviser to Eisenhower as well as to other Republican and Democratic presidents, the same was true of the liberal Protestant Reinhold Niebuhr, perhaps the nation’s most important theologian, who appeared on the cover of Time in March 1948.

Though publicly rebuked by the Tydings Committee, McCarthy soldiered on. In June 1951, on the floor of the Senate, McCarthy charged that then-Secretary of Defense (and former secretary of state) Gen. George Marshall had fallen prey to “a conspiracy on a scale so immense as to dwarf any previous such venture in the history of man.” He claimed that Marshall, a war hero, had helped to “diminish the United States in world affairs” and had enabled the US to “finally fall victim to Soviet intrigue… and Russian military might.” The speech caused an uproar. During the 1952 campaign, Eisenhower, who was in all things moderate and politically cautious, refused to publicly denounce McCarthy. “I will not…get into the gutter with that guy,” he wrote privately. McCarthy campaigned for Eisenhower, who won a stunning victory.

So did the Republicans, who regained control of Congress. McCarthy became chairman of the Senate Permanent Subcommittee on Investigations (SPSI). He targeted many, turning his newfound power against the government’s overseas broadcast division, the Voice of America (VOA). His investigation of the VOA in February and March 1953 resulted in several resignations and transfers, and his mudslinging grew increasingly unrestrained. Soon he went after the U.S. Army. After forcing the Army to again disprove theories of a Soviet spy ring at Ft. Monmouth in New Jersey, McCarthy publicly berated officers suspected of promoting leftists. His badgering of witnesses created cover for critics to publicly denounce his abrasive fear-mongering.

On March 9, 1954, CBS broadcaster Edward R. Murrow, a Cold War liberal, told his television audience that McCarthy’s actions had “caused alarm and dismay amongst … allies abroad, and given considerable comfort to our enemies.” Yet, Murrow explained, “He didn’t create this situation of fear; he merely exploited it—and rather successfully. Cassius was right. ‘The fault, dear Brutus, is not in our stars, but in ourselves.’”

Twenty million people watched the “Army-McCarthy Hearings” unfold over thirty-six days in 1954. The Army’s head counsel, Joseph Welch, captured much of the mood of the country when he defended a fellow lawyer from McCarthy’s public smears, saying, “Let us not assassinate this lad further, Senator. You’ve done enough. Have you no sense of decency, sir? At long last, have you left no sense of decency?” In September, a Senate committee recommended that McCarthy be censured. On December 2, 1954, his colleagues voted 67-22 to “condemn” his actions. Humiliated, McCarthy faded into irrelevance and alcoholism and died in May 1957, at age 48.

By the late 1950s, the worst of the second red scare was over. Stalin’s death, followed by the Korean War armistice, opened new space—and new hope—for the easing of Cold War tensions. Détente and the upheavals of the late 1960s were on the horizon. But McCarthyism outlasted McCarthy and the 1950s. McCarthy made an almost unparalleled impact on Cold War American society, and the tactics he perfected continued to be practiced long after his death. “Red-baiting,” the act of smearing a political opponent by linking them to communism or some other demonized ideology, persevered. And McCarthy had hardly acted alone.

Congressman Richard Nixon, for instance, used his place on HUAC and his public role in the campaign against Alger Hiss to catapult himself into the White House alongside Eisenhower and later into the presidency. Ronald Reagan bolstered the fame he had won in Hollywood with his testimony before Congress and his anti-communist work for major American corporations such as General Electric. He too would use anti-communism to enter public life and chart a course to the presidency. In 1958, radical anti-communists founded the John Birch Society, attacking liberals and civil rights activists such as Martin Luther King Jr. as communists. Although Cold War liberals joined the anti-communist crusade, the weight of anti-communism was wielded as part of an assault on the New Deal and its defenders. Even liberals who had fought against communism, such as historian Arthur Schlesinger, found themselves smeared by the red scare. Politics and culture had both been reshaped. The leftist American tradition was in tatters, destroyed by anti-communist hysteria, and movements for social justice, from civil rights to gay rights to feminism, were suppressed under Cold War conformity.

 

V. Decolonization and the Global Reach of the ‘American Century’

In an influential 1941 Life magazine editorial titled “The American Century,” publishing magnate Henry Luce, envisioning the US as a “dominant world power,” outlined his “vision of America as the principal guarantor of freedom of the seas” and “the dynamic leader of world trade.” In his embrace of an American-led international system, the conservative Luce was joined by liberals including historian Arthur Schlesinger, who, in his 1949 Cold War tome The Vital Center, proclaimed that a “world destiny” had been “thrust” upon the United States, with perhaps no other nation becoming “a more reluctant great power.” Emerging from the war as the world’s preeminent military and economic force, the US was perhaps destined to compete with the Soviet Union for influence in the Third World, where a power vacuum had been created by the demise of European imperialism. As France and Britain in particular struggled in vain to control colonies in Asia, the Middle East, and North Africa, the United States assumed responsibility for maintaining order and producing a kind of “pax Americana.” Little of the postwar world, however, would be so peaceful.

Based on the logic of militarized containment established by NSC-68 and American Cold War strategy, interventions in Korea and Vietnam were seen as appropriate American responses to the ascent of communism in China. Unless Soviet power in Asia was halted, Chinese influence would ripple across the continent, and one country after another would “fall” to communism. Easily transposed onto any region of the world, the “Domino Theory” became a standard justification for US interventions abroad, such as in Cuba after 1959, which was seen as a communist beachhead that imperiled Latin America, the Caribbean, and perhaps eventually the United States. Like Ho Chi Minh, Cuban leader Fidel Castro was a revolutionary nationalist whose career as a communist began in earnest only after he was rebuffed by the United States, and American interventions also targeted nations that never espoused official communist positions. Many interventions in Asia, Latin America, and elsewhere were driven by factors that were shaped by, but also transcended, anti-communist ideology.

The Cuban Revolution seemed to confirm the fears of many Americans that the spread of communism could not be stopped. It is believed that the American government intervened covertly against Fidel Castro’s new government, and many attribute the La Coubre explosion to the American Central Intelligence Agency. In this photograph, Castro and Cuban revolutionary Che Guevara march in a memorial for those killed in the explosion in March 1960 in Havana, Cuba. Wikimedia, http://commons.wikimedia.org/wiki/File:CheLaCoubreMarch.jpg.

Instead of dismantling its military after World War II, as the United States had done after every previous major conflict, the country built a new, permanent defense establishment for the Cold War. Federal investments in national defense affected the entire country. Different regions housed various sectors of what sociologist C. Wright Mills, in 1956, called the “permanent war economy.” The aerospace industry was concentrated in areas like Southern California and Long Island, New York; Massachusetts was home to several universities that received major defense contracts; the Midwest became home base for intercontinental ballistic missiles pointed at the Soviet Union; and many of the largest defense companies and military installations were concentrated in the South, so much so that in 1956 the Mississippi-born author William Faulkner remarked, “Our economy is the Federal Government.”

A radical critic of US policy, Mills was one of the first thinkers to question the effects of massive defense spending, which, he said, corrupted the ruling class, or “power elite,” who now had the potential to take the country into war for the sake of corporate profits. Yet perhaps the most famous critique of the entrenched war economy came from an unlikely source. During his farewell address to the nation in January 1961, President Eisenhower cautioned Americans against the “unwarranted influence” of a “permanent armaments industry of vast proportions” which could threaten “liberties” and “democratic processes.” While the “conjunction of an immense military establishment and a large arms industry” was a fairly recent development, this “military-industrial complex” had cultivated a “total influence,” which was “economic, political, even spiritual…felt in every city…Statehouse … [and] office of the Federal government.” There was, he said, great danger in failing to “comprehend its grave implications.”

In Eisenhower’s formulation, the “military-industrial complex” referred specifically to domestic connections between arms manufacturers, members of Congress, and the Department of Defense. Yet the new alliance between corporations, politicians, and the military depended on having an actual conflict to wage, without which there could be no ultimate financial gain. To critics, military-industrial partnerships at home were now linked to US interests abroad: American foreign policy had to secure foreign markets and favorable terms for American trade all across the globe. Seen in such a way, the Cold War was just a by-product of America’s new role as the remaining Western superpower. Regardless, the postwar rise of US power correlated with what many historians describe as a “national security consensus” that has dominated American policy since World War II. The United States was now more intimately involved in world affairs than ever before.

Ideological conflicts and independence movements erupted across the postwar world. More than eighty countries achieved independence, primarily from European control. As it took center stage in the realm of global affairs, the United States played a complicated and often contradictory role in this process of “decolonization.” The sweeping scope of post-1945 US military expansion was unique in the country’s history. Critics believed that the advent of a “standing army,” so feared by many Founders, set a disturbing precedent. But in the postwar world, American leaders eagerly set about maintaining a new permanent military juggernaut and creating viable international institutions.

But what of independence movements around the world? Roosevelt had spoken for many in his remark to British Prime Minister Winston Churchill, in 1941, that it was hard to imagine “fight[ing] a war against fascist slavery, and at the same time not work to free people all over the world from a backward colonial policy.” American postwar foreign policy planners therefore struggled to balance support for decolonization against the reality that national independence movements often posed a threat to America’s global interests.

As American strategy became consumed with thwarting Russian power and the concomitant global spread of communism, it made little difference to foreign policy officials whether insurgencies or independence movements had any direct involvement with the Soviet Union, so long as a revolutionary movement or government could in some way be linked to international communism. The Soviet Union, too, was attempting to sway the world. Stalin and his successors pushed an agenda that included not only the creation of Soviet client states in Eastern and Central Europe, but also a tendency to support left-wing liberation movements everywhere, particularly when they espoused anti-American sentiment. As a result, the US and the USSR engaged in numerous proxy wars in the “Third World.”

American planners felt that successful decolonization could demonstrate the superiority of democracy and capitalism against competing Soviet models. Their goal was, in essence, to develop an informal system of world power based as much as possible on consent (hegemony) rather than coercion (empire). But European powers still clung to their colonies, and American officials feared that anti-colonial resistance would breed revolution and push nationalists into the Soviet sphere. When faced with such movements, American policy dictated alliances with colonial regimes, alienating nationalist leaders in Asia and Africa.

The architects of American power needed to sway the citizens of decolonizing nations toward the United States. In 1948, Congress passed the Smith-Mundt Act to “promote a better understanding of the United States in other countries.” The legislation established cultural exchanges with various nations, including even the USSR, in order to showcase American values through American artists and entertainers. The Soviets did the same through what they called an international peace offensive, which by most accounts was more successful than the American campaign. Although the US made strides through various overt and covert programs, officials still perceived that they were lagging behind the Soviet Union in the “war for hearts and minds.” And as unrest festered in much of the Third World, American officials faced difficult choices.

As American blacks fought for justice at home, prominent black radicals, including Malcolm X, Paul Robeson, and the aging W.E.B. Du Bois, joined in solidarity with the global anti-colonial movement, arguing that the United States had inherited the racist European imperial tradition. Supporters of the Soviet Union made their own effort to win over countries in the non-aligned world, claiming that Marxist-Leninist doctrine offered a roadmap for their liberation from colonial bondage. Moreover, Kremlin propaganda pointed to injustices of the American South as an example of American hypocrisy: how could the United States claim to fight for global freedom when it refused to guarantee freedoms for its own citizenry? In such ways the black freedom struggle, the Third World, and the global Cold War became intertwined.

The Soviet Union took advantage of the very real racial tensions in the U.S. to create anti-American propaganda. This 1930 Soviet poster shows a black American being lynched from the Statue of Liberty, while the text below asserts the links between racism and Christianity. 1930 issue of Bezbozhnik. Wikimedia, http://commons.wikimedia.org/wiki/File:Bezbozhnik_u_stanka_US_1930.jpg.

 

VI. Conclusion

In June 1987, American President Ronald Reagan stood at the Berlin Wall and demanded that Soviet leader Mikhail Gorbachev “Tear down this wall!” Less than three years later, amid civil unrest in November 1989, East German authorities announced that their citizens were free to travel to and from West Berlin. The concrete curtain would be lifted and East Berlin would be opened to the world. Within months, the Berlin Wall was reduced to rubble by jubilant crowds anticipating the reunification of their city and their nation, which took place on October 3, 1990. By July 1991 the Warsaw Pact had crumbled, and on December 25 of that year, the Soviet Union was officially dissolved. Hungary, Poland, Czechoslovakia, and the Baltic states of Latvia, Estonia, and Lithuania were freed from Russian domination.

Partisans still fight to claim responsibility for the break-up of the Soviet Union and the ending of the Cold War. Whether it was the triumphalist rhetoric and militaristic pressure of conservatives or the internal fracturing of ossified bureaucracies and the work of Russian reformers that did more to end the conflict remains a matter of debate. But any question about the Cold War’s end must begin with an appreciation of its impact at home and abroad. Whether measured by the tens of millions killed in Cold War-related conflicts, by the reshaping of American politics and culture, or by the transformation of America’s role in the world, the Cold War pushed American history onto a new path, one from which it has yet to turn.

 

This chapter was edited by Ari Cushner, with content contributions by Michael Brenes, Ari Cushner, Michael Franczak, Joseph Haker, Jonathan Hunt, Jun Suk Hyun, Zack Jacobson, Micki Kaufman, Lucie Kyrova, Celeste Day Moore, Joseph Parrott, Colin Reynolds, and Tanya Roth.

24. World War II

Walter Rosenblum, “D Day Rescue, Omaha Beach,” via Library of Congress.

*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

The 1930s and 1940s were trying times. A global economic crisis gave way to a global war that would become the deadliest and most destructive in human history. Perhaps 80 million people lost their lives during World War II. The war unleashed the most fearsome military technology ever used, saw industrialized genocide, and threatened the eradication of entire peoples. And when it ended, the United States found itself alone as the world’s greatest superpower, armed with the world’s strongest economy and looking forward to a new era of consumer prosperity. But the war raised as many questions as it settled, unleashing new social forces at home and abroad that would confront new generations of Americans to come.

 

II. The Origins of the Pacific War

While the United States joined the war in 1941, two years after Europe exploded into conflict in 1939, the path to the Japanese bombing of Pearl Harbor, the surprise attack that threw the United States headlong into war, began much earlier. For the Empire of Japan, the war had begun a decade before Pearl Harbor.

On September 18, 1931, a small explosion tore up railroad tracks controlled by the Japanese-owned South Manchuria Railway near the city of Shenyang (Mukden) in the Chinese province of Manchuria. The railway company condemned the bombing as the work of anti-Japanese Chinese dissidents. Evidence, though, suggests that the initial explosion was neither an act of Chinese anti-Japanese sentiment nor an accident, but an elaborate ruse planned by the Japanese to provide a pretext for invasion. In response, the Japanese Guandong (Kwantung) Army, acting largely on its own initiative, began shelling the Shenyang garrison the next day, and the garrison fell before nightfall. Hungry for Chinese territory and witnessing the weakness and disorganization of Chinese forces, but under the pretense of protecting Japanese citizens and investments, the Japanese Imperial Army ordered a full-scale invasion of Manchuria. The invasion was swift. Without a centralized Chinese army, the Japanese quickly defeated isolated Chinese warlords, and by the end of February 1932 all of Manchuria was firmly under Japanese control. Japan established the puppet state of Manchukuo out of the former province of Manchuria.

This seemingly small skirmish—known by the Chinese as the September 18 Incident and by the Japanese as the Manchurian Incident—sparked a war that would last thirteen years and claim the lives of over 35 million people. Comprehending Japanese motivations for attacking China, and the grueling stalemate of the ensuing war, is crucial for understanding Japan’s seemingly unprovoked attack on Pearl Harbor, Hawaii on December 7, 1941, and, therefore, for understanding the involvement of the United States in World War II as well.

Despite their rapid advance into Manchuria, the Japanese put off the invasion of China for nearly three years. Japan occupied a precarious domestic and international position after the September 18 Incident. At home, Japan was riven by political factionalism due to its stagnating economy. Leaders were torn as to whether to address modernization and the lack of natural resources through unilateral expansion—the conquest of resource-rich areas such as Manchuria to export raw materials to domestic Japanese industrial bases such as Hiroshima and Nagasaki—or through international cooperation—particularly a philosophy of pan-Asianism in which an anti-Western coalition would push the colonial powers out of Asia. Ultimately, after a series of political crises and assassinations inflamed tensions, pro-war elements within the Japanese military triumphed over the more moderate civilian government. Japan committed itself to aggressive military expansion.

Chinese leaders Chiang Kai-shek and Zhang Xueliang appealed to the League of Nations for assistance against Japan. The United States supported the Chinese protest, proclaiming the Stimson Doctrine in January 1932, which refused to recognize any state established as a result of Japanese aggression. Meanwhile, the League of Nations sent Englishman Victor Bulwer-Lytton to investigate the September 18 Incident. After a six-month investigation, Bulwer-Lytton found the Japanese guilty of inciting the incident and demanded the return of Manchuria to China. The Japanese withdrew from the League of Nations in March 1933.

Japan isolated itself from the world. Its diplomatic isolation empowered radical military leaders who could point to Japanese military success in Manchuria and compare it to the diplomatic failures of the civilian government. The military took over Japanese policy. And in the military’s eyes, the conquest of China would not only provide for Japan’s industrial needs, it would secure Japanese supremacy in East Asia.

The Japanese launched a full-scale invasion of China. Japanese forces assaulted the Marco Polo Bridge on July 7, 1937 and routed the forces of the Chinese National Revolutionary Army led by Chiang Kai-shek. The broken Chinese army gave up Beiping (Beijing) to the Japanese on August 8, Shanghai on November 26, and the capital, Nanjing (Nanking), on December 13. Between 250,000 and 300,000 people were killed, and tens of thousands of women were raped, when the Japanese besieged and then sacked Nanjing. The Western press labeled it the Rape of Nanjing. To halt the invading enemy, Chiang Kai-shek adopted a scorched-earth strategy of “trading space for time.” His Nationalist government retreated inland, burning villages and destroying dams, and established a new capital at the Yangtze River port of Chongqing (Chungking). Although the Nationalists’ scorched-earth policy hurt the Japanese military effort, it alienated scores of dislocated Chinese civilians and became a potent propaganda tool of the emerging Chinese Communist Party (CCP).

Americans read about the brutal fighting in China, but the United States lacked both the will and the military power to oppose the Japanese invasion. After the gut-wrenching carnage of World War I, many Americans retreated toward “isolationism,” opposing any involvement in the conflagrations burning in Europe and Asia. And even if Americans had wished to intervene, their military was lacking. The Japanese army was a technologically advanced force of 4,100,000 men, backed by 900,000 Chinese collaborators—and that was in China alone. The Japanese military was armed with modern rifles, artillery, armor, and aircraft. By 1940, the Japanese navy was the third-largest and among the most technologically advanced in the world.

Still, Chinese Nationalists lobbied Washington for aid. Chiang Kai-shek’s wife, Soong May-ling—known to the American public as Madame Chiang—led the effort. Born into a wealthy Chinese merchant family in 1898, Madame Chiang spent much of her childhood in the United States and graduated from Wellesley College in 1917 with a major in English literature. In contrast to her gruff husband, Madame Chiang was charming and able to use her knowledge of American culture and values to garner support for her husband and his government. But while the United States denounced Japanese aggression, it took no action.

As Chinese Nationalists fought for survival, the Communist Party was busy collecting people and supplies in the Northwestern Shaanxi Province. China had been at war with itself when the Japanese came. Nationalists battled a stubborn communist insurgency. In 1935 the Nationalists threw the communists out of the fertile Chinese coast, but an ambitious young commander named Mao Zedong recognized the power of the Chinese peasant population. In Shaanxi, Mao recruited from the local peasantry, building his force from a meager 7,000 survivors at the end of the Long March in 1935 to a robust 1.2 million members by the end of the war.

Although Japan had conquered much of the country, the Nationalists regrouped and the Communists rearmed. An uneasy truce paused the country’s civil war and refocused efforts on the invaders. The Chinese could not dislodge the Japanese, but they could stall their advance. The war settled into a grinding stalemate.

 

III. The Origins of the European War

Across the globe in Europe, the continent’s major powers were still struggling with the after-effects of the First World War when the global economic crisis spiraled much of the continent into chaos. Germany’s Weimar Republic collapsed with the economy, and out of the ashes emerged Adolf Hitler’s National Socialists—the Nazis. Championing German racial supremacy, fascist government, and military expansionism, Hitler, after an earlier aborted attempt to seize power, became chancellor in 1933, and the Nazis conquered German institutions. Democratic traditions were smashed. Leftist groups were purged. Hitler repudiated the punitive damages and strict military limitations of the Treaty of Versailles. He rebuilt the German military and navy. He reoccupied regions lost during the war and re-militarized the Rhineland, along the border with France. When the Spanish Civil War broke out in 1936, Hitler and Mussolini—the fascist Italian leader who had risen to power in the 1920s—intervened on behalf of the Spanish fascists, helping to topple the Spanish Republic. Britain and France stood by warily and began to rebuild their militaries, anxious in the face of a renewed Germany but still unwilling to draw Europe into another bloody war.

In his autobiographical manifesto, Mein Kampf, Hitler had advocated the unification of Europe’s German peoples under one nation and argued that that nation needed lebensraum, or living space, particularly in Eastern Europe, to supply Germans with the land and resources required for future prosperity. The untermenschen (“lesser” humans) would have to go. Once in power, Hitler worked toward the twin goals of unification and expansion.

"Adolf Hitler salutes troops of the Condor Legion who fought alongside Spanish Nationalists in the Spanish Civil War, during a rally upon their return to Germany, 1939." Hugo Jaeger—Time & Life Pictures/Getty Images. http://life.time.com/world-war-ii/nazi-propaganda-and-the-myth-of-aryan-invincibility/#ixzz2Wd38MUY9

“Adolf Hitler salutes troops of the Condor Legion who fought alongside Spanish Nationalists in the Spanish Civil War, during a rally upon their return to Germany, 1939.” Hugo Jaeger—Time & Life Pictures/Getty Images. http://life.time.com/world-war-ii/nazi-propaganda-and-the-myth-of-aryan-invincibility/#ixzz2Wd38MUY9

Huge rallies like this one in Nuremberg displayed the sheer number of armed and ready troops and instilled a fierce loyalty to (or fearful silence about) Hitler and the National Socialist Party in Germany. Photograph, November 9, 1935. Wikimedia, http://commons.wikimedia.org/wiki/File:Reichsparteitag_1935.jpg.

In 1938 Germany annexed Austria and set its sights on the Sudetenland, a large, ethnically German area of Czechoslovakia. Britain and France, alarmed but still anxious to avoid war, agreed—without Czechoslovakia’s input—that Germany could annex the region in return for a promise to stop all future German aggression. It was thought that Hitler could be appeased, but it soon became clear that his ambitions for German expansion would continue. In March 1939, Hitler took the rest of Czechoslovakia and began to make demands on Poland. Britain and France promised war. And war came.

Hitler signed a secret agreement—the Molotov–Ribbentrop Pact—with the Soviet Union that coordinated the splitting of Poland between the two powers and promised non-aggression thereafter. The European war began when the German Wehrmacht invaded Poland on September 1, 1939. Britain and France declared war two days later and mobilized their armies. They hoped that the Poles could hold out for three to four months, enough time for the Allies to intervene. Poland fell in three weeks. The German army, anxious to avoid the rigid, grinding war of attrition that took so many millions in the stalemate of World War I, had built its new modern army for speed and maneuverability. German doctrine emphasized the use of tanks, planes, and motorized infantry (infantry that used trucks for transportation instead of marching) to concentrate forces, smash front lines, and wreak havoc behind the enemy’s defenses. It was called blitzkrieg, or lightning war.

After the fall of Poland, France and its British allies braced for an inevitable German attack. Throughout the winter of 1939-40, however, fighting was mostly confined to smaller fronts in Norway. Belligerents called it the Sitzkrieg (sitting war). But in May 1940, Hitler launched his attack into Western Europe. Mirroring Germany’s Schlieffen Plan of 1914 in the previous war, the Wehrmacht attacked through the Netherlands and Belgium to avoid the prepared French defenses along the French-German border. Poland had fallen in three weeks; France lasted only a few weeks more. By June, Hitler was posing for photographs in front of the Eiffel Tower. Germany split France in half, occupying and governing the north, while the south was ruled by a collaborationist government at Vichy.

With France under heel, Hitler turned to Britain. Operation Sea Lion—the planned German invasion of the British Isles—required air superiority over the English Channel. From June until October the German Luftwaffe fought the Royal Air Force (RAF) for control of the skies. Despite having fewer planes, British pilots won the so-called Battle of Britain, saving the islands from immediate invasion and prompting the new prime minister, Winston Churchill, to declare, “Never in the field of human conflict was so much owed by so many to so few.”

The German aerial bombing of London left thousands homeless, hurt, or dead. This child sits among the rubble with a rather quizzical look on his face, as adults ponder their fate in the background. Toni Frissell, “[Abandoned boy holding a stuffed toy animal amid ruins following German aerial bombing of London],” 1945. Library of Congress, http://www.loc.gov/pictures/item/2008680191/.

If Britain was safe from invasion, it was not immune from additional air attacks. Stymied in the Battle of Britain, Hitler began the Blitz—a bombing campaign against cities and civilians. Hoping to crush the British will to fight, the Luftwaffe bombed the cities of London, Liverpool, and Manchester every night from September to the following May. Children were sent far into the countryside to live with strangers to shield them from the bombings. Remaining residents took refuge in shelters and subway tunnels, emerging each morning to put out fires and bury the dead.

The Blitz ended in June 1941, when Hitler, confident that Britain was temporarily out of the fight, launched Operation Barbarossa—the invasion of the Soviet Union. Hoping to capture agricultural lands, seize oil fields, and break the military threat of Stalin’s Soviet Union, Hitler broke the two powers’ 1939 non-aggression pact and, on June 22, invaded. It was the largest land invasion in history. France and Poland had fallen in weeks, and German officials hoped to break Russia before the winter. And initially, the blitzkrieg worked. The German military quickly conquered enormous swaths of land and netted hundreds of thousands of prisoners. But Russia was too big, and the Soviets were willing to sacrifice millions to stop the fascist advance. After recovering from the initial shock of the German invasion, Stalin moved his factories east of the Urals, out of range of the Luftwaffe. He ordered his retreating army to adopt a “scorched earth” policy, moving east while destroying food, rails, and shelters to stymie the advancing German army.

The German army slogged forward. It split into three parts and stood at the gates of Moscow, Stalingrad, and Leningrad, but its supply lines now stretched thousands of miles, Soviet infrastructure had been destroyed, partisans harried German lines, and the brutal Russian winter arrived. Germany had won massive gains, but the winter found its army exhausted and overextended. In the north, the German army starved Leningrad to death during an interminable siege; in the south, at Stalingrad, the two armies bled each other to death in the destroyed city; and in the center, on the outskirts of Moscow, in sight of the capital, the German advance faltered and fell back. It was the Soviet Union that broke Hitler’s army. Twenty-five million Soviet soldiers and civilians died during the “Great Patriotic War,” and roughly 80% of all German casualties in the war came on the Eastern Front. The German army and its various conscripts suffered 850,000 casualties at the Battle of Stalingrad alone. Stopped outside Moscow in December 1941, Germany began its long retreat.

IV. The United States and the European War

While Hitler marched across Europe, the Japanese continued their war in the Pacific. In 1939 the United States dissolved its trade treaties with Japan, and in 1940 and 1941 American embargoes cut off Japan’s supplies of necessary war materials such as oil, steel, and rubber. It was hoped that the economic pressure would shut down the Japanese war machine. Instead, Japan’s resource-starved military launched invasions across the Pacific to sustain its war effort. The Japanese called their new empire the Greater East Asia Co-Prosperity Sphere and, with the cry of “Asia for the Asians,” made war against European powers and independent nations throughout the region. Diplomatic relations between Japan and the United States collapsed. The United States demanded that Japan withdraw from China; Japan considered the oil embargo a de facto declaration of war.

Japanese military planners, believing that American intervention was inevitable, planned a coordinated Pacific offensive to neutralize the United States and the European colonial powers and provide time for Japan to complete its conquests and fortify its positions. On the morning of December 7, 1941, the Japanese launched a surprise attack on the American naval base at Pearl Harbor, Hawaii. Japanese military planners hoped to destroy enough battleships and aircraft carriers to cripple American naval power for years. Over 2,400 Americans were killed in the attack.

American isolationism fell at Pearl Harbor. Japan had assaulted Hong Kong, the Philippines, and American holdings throughout the Pacific, but it was the attack on Hawaii that threw the United States into a global conflict. Franklin Roosevelt called December 7 “a date which will live in infamy” and called for a declaration of war, which Congress answered within hours. Within a week of Pearl Harbor the United States had declared war on the entire Axis, turning two previously separate conflicts into a true world war.

The American war began slowly. Britain had stood alone militarily in Europe, but American supplies had bolstered its resistance. Hitler unleashed his U-boat “wolf packs” into the Atlantic Ocean with orders to sink anything carrying aid to Britain, but British and American tactics and technology won the Battle of the Atlantic. British code breakers cracked Germany’s radio codes, and the surge of intelligence, dubbed Ultra, coupled with massive naval convoys escorted by destroyers armed with sonar and depth charges, gave the advantage to the Allies. By 1942, Hitler’s Kriegsmarine was losing ships faster than they could be built.

In North Africa in 1942, British victory at El Alamein began pushing the Germans back. In November, the first American combat troops entered the European war, landing in French Morocco and pushing the Germans east while the British pushed west. By 1943, the Allies had pushed Axis forces out of Africa. In January President Roosevelt and Prime Minister Churchill met at Casablanca to discuss the next step of the European war. Churchill convinced Roosevelt to chase the Axis up Italy, into the “soft underbelly” of Europe. Afterward, Roosevelt announced to the press that the Allies would accept nothing less than unconditional surrender.

Meanwhile, the Army Air Force (AAF) sent hundreds (and eventually thousands) of bombers to England in preparation for a massive strategic bombing campaign against Germany. The plan was to bomb Germany around the clock: American bombers hit German ball-bearing factories, rail yards, oil refineries, and manufacturing centers during the day, while the British Royal Air Force (RAF) carpet-bombed German cities at night. The American bombers initially flew in formation without escorts, since many believed that bombers equipped with defensive firepower flew too high and too fast to be attacked. However, advanced German fighters proved able to shoot down the lumbering bombers, and on some disastrous missions the Germans downed almost 50% of the American aircraft. The eventual advent of long-range escort fighters let the bombers hit their targets more accurately while the escorts confronted opposing German aircraft.

In 1943 and 1944, Allied forces bombed the railroads and oil refineries around Ploiești, Romania, part of a wider policy of bombing expeditions meant to incapacitate German transportation and fuel supplies. Ploiești was considered the most important oil target in Europe. Photograph, August 1, 1943. Wikimedia, http://commons.wikimedia.org/wiki/File:B-24D%27s_fly_over_Polesti_during_World_War_II.jpg.

Bombings throughout Europe caused complete devastation in some areas, leveling beautiful ancient cities like Cologne, Germany. Cologne experienced an astonishing 262 separate air raids by Allied forces, leaving the city in ruins, as seen in the photograph above. Amazingly, the Cologne Cathedral stands nearly undamaged even after being hit numerous times, while the area around it crumbles. Photograph, April 24, 1945. Wikimedia, http://commons.wikimedia.org/wiki/File:Koeln_1945.jpg.

In the wake of the Soviets’ victory at Stalingrad, the “Big Three” (Roosevelt, Churchill, and Stalin) met in Tehran in November 1943. Dismissing Africa and Italy as a sideshow, Stalin demanded that Britain and the United States invade France to relieve pressure on the Eastern Front. Churchill was hesitant, but Roosevelt was eager. The invasion was tentatively scheduled for 1944.

Back in Italy, the “soft underbelly” turned out to be much tougher than Churchill had imagined. Italy’s narrow, mountainous terrain gave the defending Axis the advantage. Movement up the peninsula was slow, and in some places conditions returned to the trench-like warfare of World War I. The Americans attempted to land troops behind the German lines at Anzio, on the western coast of Italy, but the landing force became surrounded and suffered heavy casualties. Still, the Allies pushed up the peninsula, Mussolini’s government collapsed, and a new Italian government quickly made peace.

On the day the American army entered Rome, American, British, and Canadian forces launched Operation Overlord, the long-awaited invasion of France. D-Day, as it became popularly known, was the largest amphibious assault in history. American general Dwight Eisenhower was uncertain enough of the attack’s chances that the night before the invasion he wrote two speeches: one for success and one for failure. The Allied landings were successful, and although progress across France was much slower than hoped for, Paris was liberated roughly two months later. Allied bombing expeditions meanwhile continued to level German cities and industrial capacity. Perhaps 400,000 German civilians were killed by Allied bombing.

The Nazis were crumbling on both fronts. Hitler tried but failed to turn the war in his favor in the west. The Battle of the Bulge failed to drive the Allies back to the English Channel, but the delay cost the Allies the winter. The invasion of Germany would have to wait, while the Soviet Union continued its relentless push westward, ravaging German populations in retribution for German war crimes.

German counterattacks in the east failed to dislodge the Soviet advance, destroying any last chance Germany might have had to regain the initiative. 1945 dawned with the end of the European war in sight. The Big Three met again at Yalta in the Soviet Union, where they reaffirmed the demand for Hitler’s unconditional surrender and began to plan for postwar Europe.

The Soviet Union reached Germany in January, and the Americans crossed the Rhine in March. In late April American and Soviet troops met at the Elbe while the Soviets, pushed relentlessly by Stalin to reach Berlin first, took the capital city in May, days after Hitler and much of his inner circle had committed suicide in a Berlin bunker. Germany was conquered. The European war was over. Allied leaders met again, this time at Potsdam, Germany, where it was decided that Germany would be divided into occupation zones, with Berlin likewise divided, pending future elections. Stalin also agreed to join the fight against Japan in approximately three months.

 

V. The United States and the Japanese War

As Americans celebrated “V.E.” (Victory in Europe) Day, they redirected their full attention to the still-raging Pacific War. As in Europe, the war in the Pacific started slowly. After Pearl Harbor, the American-controlled Philippine archipelago fell to Japan. After running out of ammunition and supplies, the garrison of American and Filipino soldiers surrendered. The prisoners were marched 80 miles to their prisoner-of-war camp without food, water, or rest. Some 10,000 died on the Bataan Death March.

But as Americans mobilized their armed forces, the tide turned. In the summer of 1942, American naval victories at the Battle of the Coral Sea and the aircraft carrier duel at the Battle of Midway crippled Japan’s Pacific naval operations. To dislodge Japan’s hold over the Pacific, the US military began island hopping: attacking island after island, bypassing the strongest but seizing those capable of holding airfields to continue pushing Japan out of the region. Combat was vicious. At Guadalcanal American soldiers saw Japanese soldiers launch suicidal charges rather than surrender. Many Japanese soldiers refused to be taken prisoner or to take prisoners themselves. Such tactics, coupled with American racial prejudice, turned the Pacific Theater into a more brutal and barbarous conflict than the European Theater (Dower, War Without Mercy: Race and Power in the Pacific War, 1987).

Japanese defenders fought tenaciously. Few battles were as one-sided as the Battle of the Philippine Sea, the failed Japanese counterattack that American pilots dubbed “The Great Marianas Turkey Shoot,” but Japanese soldiers nevertheless bled the Americans in their advance across the Pacific. At Iwo Jima, an eight-square-mile island of volcanic rock, 17,000 Japanese soldiers held the island against 70,000 marines for over a month. At the cost of nearly their entire force, they inflicted almost 30,000 casualties before the island was lost.

By February 1945, American bombers were in range of the mainland. Bombers hit Japan’s industrial facilities but suffered high casualties. To spare bomber crews from dangerous daylight raids, and to achieve maximum effect against Japan’s wooden cities, many American bombers dropped incendiary weapons that created massive fire storms and wreaked havoc on Japanese cities. Over sixty Japanese cities were fire-bombed. American fire bombs killed 100,000 civilians in Tokyo in March 1945.

In June 1945, after eighty days of fighting and tens of thousands of casualties, the Americans captured the island of Okinawa. The mainland of Japan now lay open before them: Okinawa was a viable base from which to launch a full invasion of the Japanese homeland and end the war.

Estimates varied but, given the tenacity of Japanese soldiers fighting on islands far from their home, some officials estimated that an invasion of the Japanese mainland could cost half-a-million American casualties and perhaps millions of Japanese civilians. Historians debate the many motivations that ultimately drove the Americans to drop atomic weapons over Japan, but these would be the numbers later cited by government leaders and military officials to justify their use.

Early in the war, fearing that the Germans might develop an atomic bomb, the U.S. government launched the Manhattan Project, a hugely expensive, ambitious program to harness atomic energy and create a single weapon capable of leveling entire cities. The Americans successfully exploded the world’s first nuclear device, Trinity, in New Mexico in July 1945. (Physicist J. Robert Oppenheimer, the director of the Los Alamos Laboratory, where the bomb was designed, later recalled that the event reminded him of Hindu scripture: “Now I am become death, the destroyer of worlds.”) Two more bombs—“Little Boy” and “Fat Man”—were built and detonated over two Japanese cities in August. Hiroshima was hit on August 6th. Over 100,000 civilians were killed. Nagasaki followed on August 9th. Perhaps 80,000 civilians were killed.

Emperor Hirohito announced the surrender of Japan on August 14th. On September 2, aboard the battleship USS Missouri in Tokyo Bay, delegates of the Japanese government formally signed the instrument of surrender. World War II was finally over.

 

VI. Soldiers’ Experiences

Almost eighteen million men served in World War II. Volunteers rushed to join the military after Pearl Harbor, but the majority—over ten million—were drafted into service. Volunteers could express their preference for assignment, and many preempted the draft by volunteering. Regardless, those recruits judged I-A, “fit for service,” were moved into basic training, where soldiers were developed physically and trained in the basic use of weapons and military equipment. Soldiers were indoctrinated into the chain of command and introduced to military life. After basic, soldiers moved on to more specialized training. Combat infantrymen, for example, received additional weapons and tactical training, and radio operators learned transmission codes and the operation of field radios. Afterward, an individual’s experience varied depending upon which service he entered and to which theater he was assigned.

Soldiers and marines bore the brunt of on-the-ground combat. After transportation to the front by trains, ships, and trucks, they could expect to march carrying packs weighing from 20 to 50 pounds, filled with rations, ammunition, bandages, tools, clothing, and miscellaneous personal items in addition to their weapons. Sailors, once deployed, spent months at sea operating their assigned vessels. Larger ships, particularly aircraft carriers, were veritable floating cities. In most, sailors lived and worked in cramped conditions, often sleeping in bunks stacked in rooms housing dozens of sailors. Senior officers received small rooms of their own. Sixty thousand American sailors lost their lives in the war.

During World War II the Air Force was still a branch of the U.S. Army, and its soldiers served on ground crews and air crews. World War II saw the institutionalization of massive bombing campaigns against cities and industrial production. Large bombers like the B-17 Flying Fortress required pilots, navigators, bombardiers, radio operators, and four dedicated machine gunners. Crews on bombing raids left from bases in England, Italy, or the Pacific islands and endured hours of flight before approaching enemy territory. At high altitude, and without pressurized cabins, crews used oxygen tanks to breathe while on-board temperatures plummeted. Once in enemy airspace, crews confronted enemy fighters and anti-aircraft “flak” from the ground. Even with fighter pilots flying as escorts, the Air Corps suffered heavy casualties. Tens of thousands of airmen lost their lives.

Conditions on the ground varied. Soldiers in Europe endured freezing winters, impenetrable French hedgerows, Italian mountain ranges, and dense forests. Germans fought with a Western mentality familiar to Americans. Soldiers in the Pacific endured heat and humidity, monsoons, jungles, and tropical diseases. And they confronted an unfamiliar foe. Americans, for instance, could understand surrender as prudent; many Japanese soldiers saw it as cowardice. What Americans saw as a fanatical waste of life, the Japanese saw as brave and honorable. Atrocities flourished in the Pacific at a level unmatched in Europe.

 

VII. The Wartime Economy

Economies win wars no less than militaries. The war converted American factories to wartime production, reawakened Americans’ economic might, armed Allied belligerents and the American armed forces, effectively pulled America out of the Great Depression, and ushered in an era of unparalleled economic prosperity.

Roosevelt’s New Deal had ameliorated the worst of the Depression, but the economy still limped its way forward through the late 1930s. Then Europe fell into war, and, despite the country’s isolationism, Americans were glad to sell the Allies arms and supplies. And then Pearl Harbor changed everything. The United States drafted the economy into war service. The “sleeping giant” mobilized its unrivaled economic capacity to wage worldwide war. Governmental entities such as the War Production Board and the Office of War Mobilization and Reconversion managed economic production for the war effort, and economic output exploded. An economy that had been unable to provide work for a quarter of the work force less than a decade earlier now struggled to fill vacant positions.

Government spending during the four years of war doubled all federal spending in all of American history up to that point. The budget deficit soared, but, just as Depression-era economists had counseled, the government’s massive intervention annihilated unemployment and propelled growth. The economy that came out of the war looked nothing like the one that had entered it.

Military production came at the expense of the civilian consumer economy. Appliance and automobile manufacturers converted their plants to produce weapons and vehicles. Consumer choice was foreclosed. Every American received rationing cards and, legally, goods such as gasoline, coffee, meat, cheese, butter, processed food, firewood, and sugar could not be purchased without them. The housing industry was shut down, and the cities became overcrowded.

But the wartime economy boomed. The Roosevelt administration urged citizens to save their earnings or buy war bonds to prevent inflation. Bond drives were held nationally and headlined by Hollywood celebrities. Such drives were hugely successful. They not only funded much of the war effort, they helped to tame inflation as well. So too did tax rates. The federal government raised income taxes and boosted the top marginal tax rate to 94%.

Much like during World War I, citizens during World War II were urged to buy war bonds to support the effort overseas. Rallies like this one appealed to Americans’ sense of patriotism. Wikimedia, http://upload.wikimedia.org/wikipedia/commons/5/5b/A_war_bond_rally_during_World_War_II_-_NARA_-_197250.jpg.

As in World War I, citizens were urged to buy war bonds to support the effort overseas. Rallies, such as this one, appealed to Americans’ sense of patriotism. Wikimedia.

With the economy booming and twenty million American workers placed into military service, unemployment virtually disappeared. And yet limits remained. Many defense contractors still refused to hire black workers. In 1941, A. Philip Randolph threatened to lead a march on Washington in protest, compelling Roosevelt to issue Executive Order 8802, which established the Fair Employment Practices Committee to combat racial discrimination in the federal government and the defense industry.

During the war, more and more African Americans left the agrarian South for the industrial North. And as more and more men joined the military, and more and more positions went unfilled, women joined the workforce en masse. American producers also looked outside the United States, southward to Mexico, to fill their labor needs. Between 1942 and 1964, the United States contracted thousands of Mexican nationals to work in American agriculture and railroads through the Bracero Program. Jointly administered by the State Department, the Department of Labor, and the Department of Justice, the binational agreement secured five million contracts across twenty-four states.

With factory work proliferating across the country and agricultural labor experiencing severe labor shortages, the presidents of Mexico and the U.S. signed an agreement in July 1942 to bring the first group of legally contracted workers to California. Discriminatory policies towards people of Mexican descent prevented bracero contracts in Texas until 1947. The Bracero Program survived the war, enshrined in law until the 1960s, when the United States liberalized its immigration laws. Though braceros suffered exploitative labor conditions, for the men who participated the program was a mixed blessing. Interviews with ex-braceros captured the complexity. “They would call us pigs, I know we were a lot, but they didn’t have to treat us that way,” one said of his employers, while another said, “For me it was a blessing, the United States was a blessing…, it is a nation I fell in love with because of the excess work and good pay.” After the exodus of Mexican migrants during the Depression, the program helped to reestablish Mexican migration, institutionalized migrant farm work across much of the country, and further planted a Mexican presence in the southern and western United States.

 

VIII. Women and World War II

President Franklin D. Roosevelt and his administration had encouraged all able-bodied American women to help the war effort. He considered the role of women in the war critical for American victory and the public expected women to assume various functions to free men for active military service. While the majority of women opted to remain at home or volunteer with charitable organizations, many went to work or donned a military uniform.

World War II brought unprecedented labor opportunities for American women. Industrial labor, an occupational sphere dominated by men, shifted in part to women for the duration of wartime mobilization. Women applied for jobs in converted munitions factories. The iconic illustrated image of “Rosie the Riveter,” a muscular woman dressed in coveralls with her hair in a kerchief, inscribed with the phrase “We Can Do It!,” would come to stand for female factory labor during the war. But women also worked in various auxiliary positions for the government. Although clerical work was traditionally gendered as female, over a million administrative jobs at the local, state, and national levels were transferred from men to women for the duration of the war.

Women came into the workforce in greater numbers than ever before during WWII. With vacancies left by deployed men and new positions created by war production, posters like this iconic “We Can Do It!” urged women to support the war effort by going to work in America’s factories. Poster for Westinghouse, 1942. Wikimedia, http://commons.wikimedia.org/wiki/File:We_Can_Do_It!.jpg.

With so many American workers deployed overseas and so many new positions created by war production, posters like the iconic “We Can Do It!” urged women to support the war effort by entering the work force. Poster for Westinghouse, 1942. Wikimedia Commons.

For women who elected not to work, many volunteer opportunities presented themselves. The American Red Cross, the largest charitable organization in the nation, encouraged women to volunteer with local city chapters. Millions of women organized community social events for families, packed and shipped almost half a million tons of medical supplies overseas, and prepared twenty-seven million care packages of nonperishable items for American and other Allied prisoners of war. The American Red Cross further required all women volunteers to certify as nurse’s aides, providing an extra benefit and work opportunity for hospital staffs that suffered severe manpower losses. Other charity organizations, such as church and synagogue affiliates, benevolent associations, and social club auxiliaries, gave women further outlets for volunteer work.

Military service was another option for women who wanted to join the war effort. Over 350,000 women served in several all-female units of the military branches. The Army and Navy Nurse Corps Reserves, the Women’s Army Auxiliary Corps, the Navy’s Women Accepted for Volunteer Emergency Service, the Coast Guard’s SPARs (named for the Coast Guard motto, Semper Paratus, “Always Ready”), and Marine Corps units gave women the opportunity to serve as either commissioned officers or enlisted members at military bases at home and abroad. The Nurse Corps Reserves alone commissioned 105,000 Army and Navy nurses recruited by the American Red Cross. Military nurses worked at base hospitals, mobile medical units, and onboard hospital “mercy” ships.

Jim Crow segregation in both the civilian and military sectors remained a problem for black women who wanted to join the war effort. Even after President Roosevelt signed Executive Order 8802 in 1941, supervisors who hired black women still often relegated them to the most menial tasks on factory floors. Segregation was further upheld in factory lunchrooms, and many black women were forced to work at night to keep them separate from whites. In the military, only the Women’s Army Auxiliary Corps and the Nurse Corps Reserves accepted black women for active service; the Army capped black female officers and enlisted women at ten percent of total strength and kept black units segregated on active duty. The American Red Cross, meanwhile, recruited only four hundred black nurses for the Army and Navy Nurse Corps Reserves, and black Army and Navy nurses worked in segregated military hospitals on bases stateside and overseas.

And for all of the postwar celebration of Rosie the Riveter, after the war ended the men returned and most women voluntarily left the work force or lost their jobs. Meanwhile, former military women faced a litany of obstacles in obtaining veterans’ benefits during their transition to civilian life. The nation that had called on millions of women for assistance during the four-year crisis hardly stood ready to accommodate their postwar needs and demands.

 

IX. Race and World War II

World War II affected nearly every aspect of life in the United States, and America’s racial relationships were not immune. African Americans, Mexicans and Mexican Americans, Jews, and Japanese Americans were profoundly impacted.

In early 1941, months before the Japanese attack on Pearl Harbor, A. Philip Randolph, president of the Brotherhood of Sleeping Car Porters, the largest black trade union in the nation, made headlines by threatening President Roosevelt with a march on Washington, D.C. In this “crisis of democracy,” Randolph said, defense industries refused to hire African Americans and the armed forces remained segregated. In exchange for Randolph calling off the march, Roosevelt issued Executive Order 8802, banning racial and religious discrimination in defense industries and establishing the Fair Employment Practices Committee (FEPC) to monitor defense industry hiring practices. While the armed forces would remain segregated throughout the war, and the FEPC had limited influence, the order showed that the federal government could stand against discrimination. The black workforce in defense industries rose from 3 percent in 1942 to 9 percent in 1945.

More than one million African Americans fought in the war. Most blacks served in segregated, non-combat units led by white officers. Some gains were made, however. The number of black officers increased from 5 in 1940 to over 7,000 in 1945. The all-black pilot squadrons, known as the Tuskegee Airmen, completed more than 1,500 missions, escorted heavy bombers into Germany, and earned several hundred merits and medals. Many bomber crews specifically requested the “Red Tail Angels” as escorts. And near the end of the war, the army and navy began integrating some of their platoons and facilities, before, in 1948, the U.S. government finally ordered the full integration of its armed forces.

The Tuskegee Airmen stand at attention as Major James A. Ellison returns the salute of Mac Ross, one of the first graduates of the Tuskegee cadets. The photograph shows the pride and poise of the Tuskegee Airmen, who continued a tradition of African Americans honorably serving a country that still considered them second-class citizens. Photograph, 1941. Wikimedia, http://commons.wikimedia.org/wiki/File:First_Tuskeegee_Class.jpg.


While black Americans served in the armed forces (though they were segregated), on the home front they became riveters and welders, rationed food and gasoline, and bought victory bonds. But many black Americans saw the war as an opportunity not only to serve their country but to improve it. The Pittsburgh Courier, a leading black newspaper, spearheaded the “Double V” campaign. It called on African Americans to fight two wars: the war against Nazism and Fascism abroad and the war against racial inequality at home. To achieve victory, to achieve “real democracy,” the Courier encouraged its readers to enlist in the armed forces, volunteer on the home front, and fight against racial segregation and discrimination.

During the war, membership in the NAACP jumped tenfold, from 50,000 to 500,000. The Congress of Racial Equality (CORE) was formed in 1942 and spearheaded the method of nonviolent direct action to achieve desegregation. Between 1940 and 1950, some 1.5 million southern blacks, the largest number of any decade since the beginning of the Great Migration, also indirectly demonstrated their opposition to racism and violence by migrating out of the Jim Crow South to the North. But transitions were not easy. Racial tensions erupted in 1943 in a series of riots in cities such as Mobile, Beaumont, and Harlem. The bloodiest race riot occurred in Detroit and resulted in the deaths of 25 blacks and 9 whites. Still, the war ignited in African Americans an urgency for equality that they would carry with them into the subsequent years.

Many Americans had to navigate American prejudice, and America’s entry into the war left foreign nationals from the belligerent nations in a precarious position. The Federal Bureau of Investigation targeted many of them on suspicion of disloyalty for detainment, hearings, and possible internment under the Alien Enemy Act. Those who received an order for internment were sent to government camps secured by barbed wire and armed guards. Such internments were supposed to be for cause. Then, on February 19, 1942, President Roosevelt signed Executive Order 9066, authorizing the removal of any persons from designated “exclusion zones”—which ultimately covered nearly a third of the country—at the discretion of military commanders. Some 30,000 Japanese Americans fought for the United States in World War II, but wartime anti-Japanese sentiment reinforced historical prejudices and, under the order, persons of Japanese descent, both immigrants and American citizens, were detained and placed under the custody of the War Relocation Authority, the civil agency that supervised their relocation to internment camps. They lost their homes and jobs. The policy indiscriminately targeted all people of Japanese descent; individuals received no review prior to their internment. This policy of mass exclusion and detention affected over 110,000 individuals, some 70,000 of them American citizens.

In its 1982 report, Personal Justice Denied, the congressionally appointed Commission on Wartime Relocation and Internment of Civilians concluded that “the broad historical causes” shaping the relocation program were “race prejudice, war hysteria, and a failure of political leadership.” Although the exclusion orders were found to have been constitutionally permissible under the vagaries of national security, they were later judged, even by the military and judicial leaders of the time, to have been a grave injustice against persons of Japanese descent. In 1988, President Reagan signed a law that formally apologized for internment and provided reparations to surviving internees.

But if actions taken during war would later prove repugnant, so too could inaction. As the Allies pushed into Germany and Poland, they uncovered the full extent of Hitler’s genocidal atrocities. The Allies liberated massive camp systems set up for the imprisonment, forced labor, and extermination of all those deemed racially, ideologically, or biologically “unfit” by Nazi Germany. But the Holocaust—the systematic murder of 11 million civilians, including 6 million Jews—had been underway for years. How did America respond?

This photograph became one of the most well-known images from WWII. Originally from Jürgen Stroop's May 1943 report to Heinrich Himmler, it circulated throughout Europe and America as an image of the Nazi Party’s brutality. The original German caption read: "Forcibly pulled out of dug-outs". Wikimedia, http://commons.wikimedia.org/wiki/File:Stroop_Report_-_Warsaw_Ghetto_Uprising_06b.jpg.

This photograph, originally from Jürgen Stroop’s May 1943 report to Heinrich Himmler, circulated throughout Europe and America as an image of the Nazi Party’s brutality. The original German caption read: “Forcibly pulled out of dug-outs”. Wikimedia Commons.

Initially, American officials expressed little official concern for Nazi persecutions. At the first signs of trouble in the 1930s, the State Department and most U.S. embassies did relatively little to aid European Jews. Roosevelt publicly spoke out against the persecution and even withdrew the U.S. ambassador to Germany after Kristallnacht. He pushed for the 1938 Evian Conference in France, at which international leaders discussed the Jewish refugee problem and worked to expand Jewish immigration quotas by tens of thousands of people per year. But the conference came to nothing, and the United States turned away countless Jewish refugees who requested asylum.

In 1939, the German ship St. Louis carried over 900 Jewish refugees who could not find a country that would take them. The passengers could not receive visas under the United States’ quota system. A State Department wire to one passenger read that all must “await their turns on the waiting list and qualify for and obtain immigration visas before they may be admissible into the United States.” The ship cabled the president for special privilege, but the president said nothing. The ship was forced to return to Europe. Hundreds of the St. Louis’s passengers would perish in the Holocaust.

Anti-Semitism still permeated the United States. Even if Roosevelt wanted to do more—it’s difficult to trace his own thoughts and personal views—he judged the political price for increasing immigration quotas as too high. In 1938 and 1939 the U.S. Congress debated the Wagner-Rogers Bill, an act to allow 20,000 German-Jewish children into the United States. First Lady Eleanor Roosevelt endorsed the measure but the president remained publicly silent. The bill was opposed by roughly two-thirds of the American public and was defeated. Historians speculate that Roosevelt, anxious to protect the New Deal and his rearmament programs, was unwilling to expend political capital to protect foreign groups that the American public had little interest in protecting.

Knowledge of the full extent of the Holocaust was slow in coming. When the war began, American officials, including Roosevelt, doubted initial reports of industrial death camps. But even when they conceded their existence, officials pointed to their genuinely limited options. The most plausible response was for the U.S. military to bomb either the camps or the railroads leading to them, but those options were rejected by military and civilian officials who argued that bombing would do little to stop the deportations, would distract from the war effort, and could cause casualties among concentration camp prisoners. Whether bombing would have saved lives remains a hotly debated question.

Late in the war, Secretary of the Treasury Henry Morgenthau, himself born into a wealthy New York Jewish family, pushed through major changes in American policy. In 1944, he formed the War Refugee Board (WRB) and became a passionate advocate for Jewish refugees. The efforts of the WRB saved perhaps 200,000 Jews and 20,000 others. Morgenthau also convinced Roosevelt to issue a public statement condemning the Nazis’ persecution. But it was already 1944, and such policies were far too little, far too late.

 

X. Toward a Postwar World

Americans celebrated the end of the war. At home and abroad, the United States looked to create a postwar order that would guarantee global peace and domestic prosperity. Although the alliance-of-convenience with Stalin’s Soviet Union would collapse, Americans nevertheless looked for the means to ensure postwar stability and economic security for returning veterans.

The inability of the League of Nations to stop German, Italian, and Japanese aggression caused many to question whether any global organization or agreements could ever ensure world peace. This included Franklin Roosevelt who, as Woodrow Wilson’s Assistant Secretary of the Navy, had witnessed the rejection of the League by both the American people and the Senate. In 1941, Roosevelt believed that postwar security could be maintained by an informal agreement between what he termed “the Four Policemen”—the U.S., Britain, the Soviet Union, and China—instead of a rejuvenated League of Nations. But others, including Secretary of State Cordell Hull and British Prime Minister Winston Churchill, disagreed and convinced Roosevelt to push for a new global organization. As the war ran its course, Roosevelt came around to the idea. And so did the American public. Pollster George Gallup noted a “profound change” in American attitudes. The United States had rejected membership in the League of Nations after World War I, and in 1937 only a third of Americans polled supported such an idea. But as war broke out in Europe, half of Americans did. America’s entry into the war bolstered support, and, by 1945, with the war closing, 81% of Americans favored the idea.

Whatever his initial doubts, Roosevelt had long shown enthusiasm for the ideas later enshrined in the United Nations charter. In January 1941, he announced his Four Freedoms—freedom of speech, of worship, from want, and from fear—that all of the world’s citizens should enjoy. That same year he signed the Atlantic Charter with Churchill, which reinforced those ideas, added the right of self-determination, and promised some sort of postwar economic and political cooperation. Roosevelt first used the phrase “united nations” to describe the Allied powers, not the subsequent postwar organization, but the name stuck. At Tehran in 1943, Roosevelt and Churchill convinced Stalin to send a Soviet delegation to a conference at Dumbarton Oaks, outside Washington, D.C., in August 1944, where they agreed on the basic structure of the new organization. It would have a Security Council—the original “four policemen,” plus France—which would consult on how best to keep the peace and when to deploy the military power of the assembled nations. According to one historian, the organization demonstrated an understanding that “only the Great Powers, working together, could provide real security.” But the plan was a kind of hybrid between Roosevelt’s policemen idea and a global organization of equal representation. There would also be a General Assembly, made up of all nations, an International Court of Justice, and a council for economic and social matters. Dumbarton Oaks was a mixed success—the Soviets especially expressed concern over how the Security Council would work—but the powers agreed to meet again in San Francisco between April and June 1945 for further negotiations. There, on June 26, 1945, fifty nations signed the UN charter.

Anticipating victory in World War II, leaders looked not only to the postwar global order but also to the fate of returning American servicemen. American politicians and interest groups sought to avoid another economic depression—the economy had tanked after World War I—by gradually easing returning veterans back into the civilian economy. The brainchild of Warren Atherton, the head of the American Legion, the G.I. Bill won support from progressives and conservatives alike. Passed in 1944, the G.I. Bill was a multifaceted, multi-billion-dollar entitlement program that rewarded honorably discharged veterans with numerous benefits.

Faced with the prospect of over 15 million members of the armed services (including approximately 350,000 women) suddenly returning to civilian life, the G.I. Bill offered a bevy of inducements to slow their influx into the civilian workforce and to reward their service with public benefits. The legislation offered a year’s worth of unemployment benefits for veterans unable to secure work. About half of American veterans (8 million) received $4 billion in unemployment benefits over the life of the bill. The G.I. Bill also made post-secondary education a reality for many. The Veterans Administration (VA) paid the lion’s share of educational expenses, including tuition, fees, supplies, and even stipends for living expenses. The G.I. Bill sparked a boom in higher education. Enrollments at accredited colleges, universities, and technical and professional schools spiked, rising from 1.5 million in 1940 to 3.6 million in 1960. The VA disbursed over $14 billion in educational aid in just over a decade. Furthermore, the bill encouraged home ownership. Roughly 40 percent of Americans owned homes in 1945, but that figure climbed to 60 percent a decade after the close of the war. With down-payment requirements waived, veterans could obtain home loans for as little as $1 down. Close to 4 million veterans purchased homes through the G.I. Bill, sparking a construction bonanza that fueled postwar growth. In addition, the VA helped nearly 200,000 veterans secure farms and offered thousands more guaranteed financing for small businesses.

Not all Americans, however, benefitted equally from the G.I. Bill. Indirectly, since the military limited the number of female personnel, men qualified for the bill’s benefits in far higher numbers. Colleges also limited the number of female applicants to guarantee space for male veterans. African Americans, too, faced discrimination. Segregation forced black veterans into overcrowded “historically black colleges” that had to turn away close to 20,000 applicants. Meanwhile, residential segregation limited black home ownership in various neighborhoods, denying black homeowners the equity and investment that would come with home ownership. There were other limits and other disadvantaged groups. Veterans accused of homosexuality, for instance, were similarly unable to claim G.I. benefits.

The effects of the G.I. Bill were significant and long-lasting. It helped to sustain the great postwar economic boom and, if many could not attain it, it nevertheless established the hallmarks of American middle class life.

 

XI. Conclusion

The United States entered the war in a crippling economic depression and exited at the beginning of an unparalleled economic boom. The war had been won, the United States was stronger than ever, and Americans looked forward to a prosperous future. And yet new problems loomed. Stalin’s Soviet Union and the proliferation of nuclear weapons would disrupt postwar dreams of global harmony. Meanwhile, Americans who had fought a war for global democracy would find that very democracy eradicated around the world in reestablished colonial regimes and at home in segregation and injustice. The war had unleashed powerful forces, forces that would reshape the United States at home and abroad.

 

This chapter was edited by Joseph Locke, with content contributions by Mary Beth Chopas, Andrew David, Ashton Ellett, Paula Fortier, Joseph Locke, Jennifer Mandel, Valerie Martinez, Ryan Menath, Chris Thomas.

23. The Great Depression

"Destitute pea pickers in California. Mother of seven children. Age thirty-two. Nipomo, California," Library of Congress.

“Destitute pea pickers in California. Mother of seven children. Age thirty-two. Nipomo, California,” Library of Congress.

*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

The wonder of the stock market had permeated popular culture throughout the 1920s. Although it was released during the first year of the Great Depression, the 1930 film High Society Blues captured the speculative hope and prosperity of the previous decade. “I’m in the Market for You,” a popular musical number from the film, even used the stock market as a metaphor for love: You’re going up, up, up in my estimation, / I want a thousand shares of your caresses, too. / We’ll count the hugs and kisses, / When dividends are due, / Cause I’m in the market for you.

But, just as the song was being recorded in 1929, the stock market reached the apex of its swift climb, crashed, and brought an abrupt end to the seeming prosperity of the “Roaring ‘20s.” The Great Depression had arrived.

 

II. The Origins of the Great Depression

“Crowd of people gather outside the New York Stock Exchange following the Crash of 1929,” 1929. Library of Congress, http://www.loc.gov/pictures/item/99471695/.


On Thursday, October 24, 1929, stock market prices suddenly plummeted. Ten billion dollars in investments (roughly $100 billion today) disappeared in a matter of hours. Panicked selling set in, stocks sank to record lows, and stunned investors crowded the New York Stock Exchange demanding answers. Leading bankers met privately at the offices of J.P. Morgan and raised millions in personal and institutional contributions to halt the slide. They marched across the street and ceremoniously bought stocks at inflated prices. The market temporarily stabilized, but fears spread over the weekend and the following week frightened investors dumped their portfolios to avoid further losses. On October 29, “Black Tuesday,” the stock market began its long precipitous fall. Stock values evaporated. Shares of U.S. Steel dropped from $262 to $22. General Motors’ stock fell from $73 a share to $8. Four-fifths of John D. Rockefeller’s fortune—the greatest in American history—vanished.

Although the Crash stunned the nation, it exposed deeper, underlying problems with the American economy in the 1920s. The stock market’s popularity grew throughout the 1920s, but only 2.5% of Americans had brokerage accounts; the overwhelming majority of Americans had no direct personal stake in Wall Street. The stock market’s collapse, no matter how dramatic, did not by itself depress the American economy. Instead, the Crash exposed a great number of factors which, when combined with the financial panic, sank the American economy into the greatest of all economic crises. Rising inequality, declining demand, rural collapse, overextended investors, and the bursting of speculative bubbles all conspired to plunge the nation into the Great Depression.

Despite progressive resistance, the vast gap between rich and poor accelerated throughout the early twentieth century. In the aggregate, Americans in 1929 were better off than they had been a decade earlier. Per capita income rose 10% for all Americans, but 75% for the nation’s wealthiest citizens. The return of conservative politics in the 1920s reinforced federal fiscal policies that exacerbated the divide: low corporate and personal taxes, easy credit, and depressed interest rates overwhelmingly favored wealthy investors who, flush with cash, spent their money on luxury goods and speculative investments in the rapidly rising stock market.

The pro-business policies of the 1920s were designed for an American economy built upon the production and consumption of durable goods. Yet, by the late 1920s, much of the market was saturated. The boom of automobile manufacturing, the great driver of the American economy in the 1920s, slowed as most Americans with the means to purchase a car had already done so. Increasingly, the well-to-do had no need for the new automobiles, radios, and other consumer goods that fueled GDP growth in the 1920s. When products failed to sell, inventories piled up, manufacturers scaled back production, and companies fired workers, stripping potential consumers of cash, blunting demand for consumer goods, and replicating the downward economic cycle. The situation was only compounded by increased automation and rising efficiency in American factories. Despite impressive overall growth throughout the 1920s, unemployment hovered around 7% throughout the decade, suppressing purchasing power for a great swath of potential consumers.

While a manufacturing innovation, Henry Ford’s assembly line produced so many cars as to flood the automobile market in the 1920s. Interview with Henry Ford, Literary Digest, January 7, 1928. Wikimedia, http://commons.wikimedia.org/wiki/File:Ford_Motor_Company_assembly_line.jpg.


For American farmers, meanwhile, “hard times” began long before the markets crashed. In 1920 and 1921, after several years of larger-than-average profits, farm prices in the South and West began their long decline, plummeting as production climbed and domestic and international demand for cotton, foodstuffs, and other agricultural products stalled. Widespread soil exhaustion on western farms only compounded the problem. Farmers found themselves unable to make payments on loans taken out during the good years, and banks in agricultural areas tightened credit in response. By 1929, farm families were overextended, in no shape to make up for declining consumption, and in a precarious economic position even before the Depression wrecked the global economy.

Despite serious foundational problems in the industrial and agricultural economy, most Americans in 1929 and 1930 still believed the economy would bounce back. In 1930, amid one of the Depression’s many false hopes, President Herbert Hoover reassured an audience that “the depression is over.” But the president was not simply guilty of false optimism. Hoover made many mistakes. During his 1928 election campaign, Hoover promoted higher tariffs as a means of encouraging domestic consumption and protecting American farmers from foreign competition. Spurred by the ongoing agricultural depression, Hoover signed into law the highest tariff in American history, the Smoot-Hawley Tariff of 1930, just as global markets began to crumble. Other countries responded in kind, tariff walls rose across the globe, and international trade ground to a halt. Between 1929 and 1932, international trade dropped from $36 billion to only $12 billion. American exports fell by 78%. Combined with overproduction and declining domestic consumption, the tariff exacerbated the world’s economic collapse.

But beyond structural flaws, speculative bubbles, and destructive protectionism, the final contributing element of the Great Depression was a quintessentially human one: panic. The frantic reaction to the market’s fall aggravated the economy’s many other failings. More economic policies backfired. The Federal Reserve overcorrected in its response to speculation by raising interest rates and tightening credit. Across the country, banks denied loans and called in debts. Their patrons, afraid that reactionary policies meant further financial trouble, rushed to withdraw money before institutions could close their doors, ensuring the very failures they feared. Such bank runs were not uncommon in the 1920s, but, in 1930, with the economy worsening and panic from the crash accelerating, 1,352 banks failed. In 1932, nearly 2,300 banks collapsed, taking personal deposits, savings, and credit with them.

The Great Depression was the confluence of many problems, most of which had begun during a time of unprecedented economic growth. Fiscal policies of the Republican “business presidents” undoubtedly widened the gap between rich and poor and fostered a “stand-off” over international trade, but such policies were widely popular and, for much of the decade, widely seen as a source of the decade’s explosive growth. With fortunes to be won and standards of living to maintain, few Americans had the foresight or wherewithal to repudiate an age of easy credit, rampant consumerism, and wild speculation. Instead, as the Depression worked its way across the United States, Americans hoped to weather the economic storm as best they could, waiting for some form of relief, any answer to the ever-mounting economic collapse that strangled so many Americans’ lives.

 

III. Herbert Hoover and the Politics of the Depression

“Unemployed men queued outside a depression soup kitchen opened in Chicago by Al Capone,” February 1931. Wikimedia, http://commons.wikimedia.org/wiki/File:Unemployed_men_queued_outside_a_depression_soup_kitchen_opened_in_Chicago_by_Al_Capone,_02-1931_-_NARA_-_541927.jpg.

“Unemployed men queued outside a depression soup kitchen opened in Chicago by Al Capone,” February 1931. Wikimedia.

As the Depression spread, public blame settled on President Herbert Hoover and the conservative politics of the Republican Party. But Hoover was as much victim as perpetrator, a man who had the misfortune of becoming a visible symbol for large invisible forces. In 1928 Hoover had no reason to believe that his presidency would be any different than that of his predecessor, Calvin Coolidge, whose time in office was marked by relative government inaction, seemingly rampant prosperity, and high approval ratings.

Coolidge had decided not to seek a second term in 1928. A man of few words, “Silent Cal” publicized this decision by handing reporters a scrap of paper that simply read: “I do not choose to run for president in 1928.” The race therefore became a contest between the Democratic governor of New York, Al Smith, whose Catholic faith and immigrant background aroused nativist suspicions and whose connections to Tammany Hall and anti-Prohibition politics offended reformers, and the Republican candidate, Herbert Hoover, whose All-American, Midwestern, Protestant background and managerial prowess during the First World War endeared him to American voters.

Hoover epitomized the “self-made man.” Orphaned at age nine, he was raised by a strict Quaker uncle on the West Coast. He graduated from Stanford University in 1895 and worked as an engineer for several multinational mining companies. He became a household name during World War I when he oversaw voluntary rationing as the head of the U.S. Food Administration and, after the armistice, served as the Director General of the American Relief Administration in Europe. Hoover’s reputation for humanitarian service and problem-solving translated into popular support, even as the public soured on Wilson’s Progressive activism. Hoover was one of the few politicians whose career benefitted from wartime public service. After the war, both the Democratic and Republican parties tried to draft him to run for president in 1920.

Hoover declined to run in 1920 and 1924. He served instead as Secretary of Commerce under both Harding and Coolidge, taking an active role in all aspects of government. In 1928, he seemed the natural successor to Coolidge. Politically, aside from the issue of Prohibition (he was a “dry,” Smith a “wet”), Hoover’s platform differed very little from Smith’s, leaving little to discuss during the campaign except personality and religion. Both benefitted Hoover. Smith’s background engendered opposition from otherwise solid Democratic states, especially in the South, where his Catholic, ethnic, urban, and anti-Prohibition background was anathema. His popularity among urban ethnic voters counted for little. Several southern states, owing in part to itinerant evangelical politicking, voted Republican for the first time since Reconstruction. Hoover won in a landslide, taking nearly 60% of the popular vote.

Although Hoover is sometimes categorized as a “business president” in line with his Republican predecessors, he also embraced a strain of business progressivism, a system of voluntary action called “Associationalism” that assumed Americans could maintain a web of voluntary cooperative organizations dedicated to providing economic assistance and services to those in need. Businesses, the thinking went, would willingly limit harmful practices for the greater economic good. To Hoover, direct government aid would discourage a healthy work ethic, while Associationalism would encourage the very self-control and self-initiative that fueled economic growth. But when the Depression exposed the incapacity of such strategies to produce an economic recovery, Hoover proved insufficiently flexible to recognize the limits of his ideology. And when the ideology failed, so too did his presidency.

Hoover entered office upon a wave of popular support, but by October 1929 the economic collapse had overwhelmed his presidency. Like all too many Americans, Hoover and his advisers assumed—or perhaps simply hoped—that the sharp financial and economic decline was a temporary downturn, another “bust” in the inevitable boom-bust cycles that stretched back through America’s commercial history. Many economists argued that periodic busts culled weak firms and paved the way for future growth. And so when suffering Americans looked to Hoover for help, Hoover could only answer with volunteerism. He asked business leaders to promise to maintain investments and employment and encouraged state and local charities to provide assistance to those in need. Hoover established the President’s Organization for Unemployment Relief, or POUR, to help organize the efforts of private agencies. While POUR urged charitable giving, charitable relief organizations were overwhelmed by the growing needs of the multiplying ranks of unemployed, underfed, and unhoused Americans. By mid-1932, for instance, a quarter of all of New York’s private charities had closed: they had simply run out of money. In Atlanta, solvent relief charities could only provide $1.30 per week to needy families. The size and scope of the Depression overpowered the radically insufficient capacity of private volunteer organizations to mediate the crisis.

By 1932, with the economy long-since stagnant and a reelection campaign looming, Hoover, hoping to stimulate American industry, created the Reconstruction Finance Corporation to provide emergency loans to banks, building-and-loan societies, railroads, and other private industries. It was radical in its use of direct government aid and out of character for the normally laissez-faire Hoover, but it also bypassed needy Americans to bolster industrial and financial interests. New York Congressman Fiorello LaGuardia, who later served as mayor of New York City, captured public sentiment when he denounced the RFC as a “millionaire’s dole.”

IV. The Bonus Army

“Shacks, put up by the Bonus Army on the Anacostia flats, Washington, D.C., burning after the battle with the military. The Capitol in the background. 1932.” Wikimedia, http://commons.wikimedia.org/wiki/File:Evictbonusarmy.jpg.


Hoover’s reaction to a major public protest sealed his legacy. In the summer of 1932, Congress debated a bill authorizing immediate payment of long-promised cash bonuses to veterans of World War I, originally scheduled to be paid out in 1945. Given the economic hardships facing the country, the bonus came to symbolize government relief for the most deserving recipients, and from across the country more than 15,000 unemployed veterans and their families converged on Washington, D.C. They erected a tent city across the Anacostia River on the Anacostia Flats, a “Hooverville” in the spirit of the camps of homeless and unemployed Americans then appearing in American cities.

Concerned with what immediate payment would do to the federal budget, Hoover opposed the bill, which was eventually voted down by the Senate. While most of the “Bonus Army” left Washington in defeat, many stayed to press their case. Hoover called the remaining veterans “insurrectionists” and ordered them to leave. When thousands failed to heed the evacuation order, General Douglas MacArthur, accompanied by local police, infantry, cavalry, tanks, and a machine gun squadron, stormed the tent city and routed the Bonus Army. National media covered the disaster as troops chased down men and women, tear-gassed children, and torched the shantytown.

Hoover’s insensitivity toward suffering Americans, his unwillingness to address widespread economic problems, and his repeated platitudes about returning prosperity condemned his presidency. Hoover of course was not responsible for the Depression, not personally. But neither he nor his advisers conceived of the enormity of the crisis, a crisis his conservative ideology could neither accommodate nor address. As a result, Americans found little relief from Washington. They were on their own.

 

V. The Lived Experience of the Great Depression

"Hooverville, Seattle." 1932-1937. Washington State Archives. http://www.digitalarchives.wa.gov/Record/View/B7A94A0DC95F7B3E0F1081FDB3A72C1E

“Hooverville, Seattle.” 1932-1937. Washington State Archives. http://www.digitalarchives.wa.gov/Record/View/B7A94A0DC95F7B3E0F1081FDB3A72C1E

In 1934 a woman from Humboldt County, California, wrote to First Lady Eleanor Roosevelt seeking a job for her husband, a surveyor, who had been out of work for nearly two years. The pair had survived on the meager income she received from working at the county courthouse. “My salary could keep us going,” she explained, “but—I am to have a baby.” The family needed temporary help, and, she explained, “after that I can go back to work and we can work out our own salvation. But to have this baby come to a home full of worry and despair, with no money for the things it needs, is not fair. It needs and deserves a happy start in life.”

As the United States slid ever deeper into the Great Depression, such tragic scenes played out time and time again. Individuals, families, and communities faced the painful, frightening, and often bewildering collapse of the economic institutions upon which they depended. The more fortunate were spared the worst effects, and a few even profited from the crisis, but by the end of 1932 it had become so deep and so widespread that most Americans had suffered directly. The markets had crashed through no fault of theirs. Workers were plunged into poverty because of impersonal forces for which they shared no responsibility. With no safety net, they were thrown into economic chaos.

With rampant unemployment and declining wages, Americans slashed expenses. The fortunate could survive by simply deferring vacations and regular consumer purchases. Middle- and working-class Americans might rely upon disappearing credit at neighborhood stores, default on utility bills, or skip meals. Those who could borrowed from relatives, took in boarders, or “doubled up” in tenements. The most desperate, the chronically unemployed, encamped on public or marginal lands in “Hoovervilles,” spontaneous shantytowns that dotted America’s cities, depending upon breadlines and street-corner peddling. Poor women and young children entered the labor force, as they always had: the ideal of the “male breadwinner” had always been a fiction for poor Americans, and the Depression pushed millions of new workers into the labor market. The emotional and psychological shocks of unemployment and underemployment only added to the shocking material deprivations of the Depression. Social workers and charity officials, for instance, often found the unemployed suffering from feelings of futility, anger, bitterness, confusion, and loss of pride. Such feelings affected the rural poor no less than the urban.

 

VI. Migration and Immigration during the Great Depression

On the Great Plains, environmental catastrophe deepened America’s longstanding agricultural crisis and magnified the tragedy of the Depression. Beginning in 1932, severe droughts hit from Texas to the Dakotas and lasted until at least 1936. The droughts compounded years of agricultural mismanagement. To grow their crops, Plains farmers had plowed up natural ground cover that had taken ages to form over the surface of the dry Plains states. Relatively wet decades had protected them, but, during the early 1930s, without rain, the exposed fertile topsoil turned to dust, and without sod or windbreaks such as trees, rolling winds churned the dust into massive storms that blotted out the sky, choked settlers and livestock, and rained dirt not only across the region but as far east as Washington, D.C., New England, and ships on the Atlantic Ocean. The “Dust Bowl,” as the region became known, exposed, all too late, the need for conservation. The region’s farmers, already hit by years of foreclosures and declining commodity prices, were decimated. For many in Texas, Oklahoma, Kansas, and Arkansas who were “baked out, blown out, and broke,” their only hope was to travel west to California, whose rains still brought bountiful harvests and, potentially, jobs for farmworkers. It was an exodus. Oklahoma lost 440,000 people, or a full 18.4 percent of its 1930 population, to out-migration.

This iconic photograph made real the suffering of millions during the Great Depression. Dorothea Lange, “Destitute pea pickers in California. Mother of seven children. Age thirty-two. Nipomo, California” or “Migrant Mother,” February/March 1936. Library of Congress, http://www.loc.gov/pictures/item/fsa1998021539/PP/.


Dorothea Lange’s Migrant Mother became one of the most enduring images of the “Dust Bowl” and the ensuing westward exodus. Lange, a photographer for the Farm Security Administration, captured the image at a migrant farmworker camp in Nipomo, California, in 1936. In the photograph a young mother stares out with a worried, weary expression. She was a migrant, having left her home in Oklahoma to follow the crops in the Golden State. She took part in what many in the mid-1930s were beginning to recognize as a vast migration of families out of the southwestern Plains states. In the image she cradles an infant and supports two older children, who cling to her. Lange’s photo encapsulated the nation’s struggle. The subject of the photograph seemed used to hard work but down on her luck, and uncertain about what the future might hold.

The “Okies,” as such westward migrants were disparagingly called by their new neighbors, were the most visible of the many who were on the move during the Depression, lured by news and rumors of jobs in far-flung regions of the country. By 1932 sociologists were estimating that millions of men were on the roads and rails traveling the country. Economists sought to quantify the movement of families from the Plains. Popular magazines and newspapers were filled with stories of homeless boys and the veterans-turned-migrants of the Bonus Army commandeering boxcars. Popular culture, such as William Wellman’s 1933 film, Wild Boys of the Road, and, most famously, John Steinbeck’s Grapes of Wrath, published in 1939 and turned into a hit movie a year later, captured the Depression’s dislocated populations.

These years witnessed the first significant reversal in the flow of people between rural and urban areas. Thousands of city-dwellers fled the jobless cities and moved to the country looking for work. As relief efforts floundered, many state and local officials threw up barriers to migration, making it difficult for newcomers to receive relief or find work. Some state legislatures made it a crime to bring poor migrants into the state and allowed local officials to deport migrants to neighboring states. In the winter of 1935-1936, California, Florida, and Colorado established “border blockades” to block poor migrants from their states and reduce competition with local residents for jobs. A billboard outside Tulsa, Oklahoma, informed potential migrants that there were “NO JOBS in California” and warned them to “KEEP Out.”

Sympathy for migrants, however, accelerated late in the Depression with the publication of John Steinbeck’s Grapes of Wrath. The Joad family’s struggles drew attention to the plight of Depression-era migrants and, just a month after the nationwide release of the film version, Congress created the Select Committee to Investigate the Interstate Migration of Destitute Citizens. Starting in 1940, the Committee held widely publicized hearings. But it was too late. Within a year of its founding, defense industries were already gearing up in the wake of the outbreak of World War II, and the “problem” of migration suddenly became a lack of migrants needed to fill war industries. Such relief was nowhere to be found in the 1930s.

Americans meanwhile feared foreign workers willing to work for even lower wages. The Saturday Evening Post warned that foreign immigrants, who were “compelled to accept employment on any terms and conditions offered,” would exacerbate the economic crisis. On September 8, 1930, the Hoover administration issued a press release on the administration of immigration laws “under existing conditions of unemployment.” Hoover instructed consular officers to scrutinize carefully the visa applications of those “likely to become public charges” and suggested that this might include denying visas to most, if not all, alien laborers and artisans. The crisis itself had served to stifle foreign immigration, but such restrictive and exclusionary actions in the first years of the Depression intensified its effects. The number of European visas issued fell roughly 60 percent while deportations dramatically increased. Between 1930 and 1932, 54,000 people were deported. An additional 44,000 deportable aliens left “voluntarily.”

Exclusionary measures hit Mexican immigrants particularly hard. The State Department made a concerted effort to reduce immigration from Mexico as early as 1929, and Hoover’s executive actions arrived the following year. Officials in the Southwest led a coordinated effort to push out Mexican immigrants. In Los Angeles, the Citizens Committee on Coordination of Unemployment Relief began working closely with federal officials in early 1931 to conduct deportation raids, while the Los Angeles County Department of Charities began a simultaneous drive to repatriate Mexicans and Mexican Americans on relief, negotiating a charity rate with the railroads to return Mexicans “voluntarily” to their mother country. According to the federal census, from 1930 to 1940 the Mexican-born population living in Arizona, California, New Mexico, and Texas fell from 616,998 to 377,433. Franklin Roosevelt did not indulge anti-immigrant sentiment as willingly as Hoover had. Under the New Deal, the Immigration and Naturalization Service halted some of the Hoover administration’s most divisive practices, but, with jobs suddenly scarce, hostile attitudes intensifying, and official policies less than welcoming, immigration plummeted and deportations rose. Over the course of the Depression, more people left the United States than entered it.

 

VII. Franklin Delano Roosevelt and the “First” New Deal

Posters like this showing the extent of the Federal Art Project were used to prove the worth of the WPA’s various endeavors and, by extension, the value of the New Deal to the American people. “Employment and Activities poster for the WPA's Federal Art Project,” January 1, 1936. Wikimedia, http://commons.wikimedia.org/wiki/File:Archives_of_American_Art_-_Employment_and_Activities_poster_for_the_WPA%27s_Federal_Art_Project_-_11772.jpg.


The early years of the Depression were catastrophic. The crisis, far from relenting, deepened each year. Unemployment peaked at 25% in 1932. With no end in sight, and with private firms crippled and charities overwhelmed by the crisis, Americans looked to their government as the last barrier against starvation, hopelessness, and perpetual poverty.

Few presidential elections in modern American history have been more consequential than that of 1932. The United States was struggling through the third year of the Depression, and exasperated voters overthrew Hoover in a landslide to elect the Democratic governor of New York, Franklin Delano Roosevelt. Roosevelt came from a privileged background in New York’s Hudson River Valley (his distant cousin, Theodore Roosevelt, became president while Franklin was at Harvard). Franklin Roosevelt embarked upon a slow but steady ascent through state and national politics. In 1913, he was appointed Assistant Secretary of the Navy, a position he held during the defense emergency of World War I. In the course of his rise, in the summer of 1921, Roosevelt suffered a sudden bout of lower-body pain and paralysis. He was diagnosed with polio. The disease left him a paraplegic, but, encouraged and assisted by his wife, Eleanor, Roosevelt sought therapeutic treatment and maintained sufficient political connections to reenter politics. In 1928, Roosevelt won election as governor of New York. He took office on the eve of the Depression and drew from progressivism to address the economic crisis. During his gubernatorial tenure, Roosevelt introduced the first comprehensive unemployment relief program and helped to pioneer efforts to expand public utilities. He also relied on like-minded advisors. For example, Frances Perkins, then commissioner of the state’s Labor Department, successfully advocated pioneering legislation that enhanced workplace safety and reduced the use of child labor in factories. Perkins later accompanied Roosevelt to Washington and served as the nation’s first female Secretary of Labor.

On July 1, 1932, Roosevelt, the newly-designated presidential nominee of the Democratic Party, delivered the first and one of the most famous on-site acceptance speeches in American presidential history. Building to a conclusion, he promised, “I pledge you, I pledge myself, to a new deal for the American people.” Newspaper editors seized upon the phrase “new deal,” and it entered the American political lexicon as shorthand for Roosevelt’s program to address the Great Depression. There were, however, few hints in his political campaign that suggested the size and scope of the “New Deal.” Regardless, Roosevelt crushed Hoover. He won more counties than any previous candidate in American history. He spent the months between his election and inauguration traveling, planning, and assembling a team of advisors, the famous “Brain Trust” of academics and experts, to help him formulate a plan of attack. On March 4th, 1933, in his first Inaugural Address, Roosevelt famously declared, “This great Nation will endure as it has endured, will revive and will prosper. So, first of all, let me assert my firm belief that the only thing we have to fear is fear itself—nameless, unreasoning, unjustified terror which paralyzes needed efforts to convert retreat into advance.”

Roosevelt’s reassuring words would have rung hollow if he had not taken swift action against the economic crisis. In his first days in office, Roosevelt and his advisers prepared, submitted, and secured Congressional enactment of numerous laws designed to arrest the worst of the Great Depression. His administration threw the federal government headlong into the fight against the Depression.

Roosevelt immediately looked to stabilize the collapsing banking system. He declared a national “bank holiday” closing American banks and set to work pushing the Emergency Banking Act swiftly through Congress. On March 12th, the night before select banks reopened under stricter federal guidelines, Roosevelt appeared on the radio in the first of his “Fireside Chats.” The addresses, which the president continued delivering through four terms, were informal, even personal. Roosevelt used his airtime to explain New Deal legislation, to encourage confidence in government action, and to mobilize the American people’s support. In the first “chat,” Roosevelt described the new banking safeguards and asked the public to place their trust and their savings in banks. Americans responded and across the country, deposits outpaced withdrawals. The act was a major success. In June, Congress passed the Glass-Steagall Banking Act, which instituted federal deposit insurance and barred the mixing of commercial and investment banking.

Stabilizing the banks was only a first step. In the remainder of his “First Hundred Days,” Roosevelt and his congressional allies focused especially on relief for suffering Americans. Congress debated, amended, and passed what Roosevelt proposed. As one historian noted, the president “directed the entire operation like a seasoned field general.” And despite some questions over the constitutionality of many of his actions, Americans and their congressional representatives conceded that the crisis demanded swift and immediate action. The Civilian Conservation Corps (CCC) employed young men on conservation and reforestation projects; the Federal Emergency Relief Administration (FERA) provided direct cash assistance to state relief agencies struggling to care for the unemployed; the Tennessee Valley Authority (TVA) built a series of hydroelectric dams along the Tennessee River as part of a comprehensive program to economically develop a chronically depressed region; several agencies helped home and farm owners refinance their mortgages. And Roosevelt wasn’t done.

The heart of Roosevelt’s early recovery program consisted of two massive efforts to stabilize and coordinate the American economy: the Agricultural Adjustment Administration (AAA) and the National Recovery Administration (NRA). The AAA, created in May 1933, aimed to raise the prices of agricultural commodities (and hence farmers’ income) by offering cash incentives to voluntarily limit farm production (decreasing supply, thereby raising prices). The National Industrial Recovery Act, which created the National Recovery Administration (NRA) in June 1933, suspended antitrust laws to allow businesses to establish “codes” that would coordinate prices, regulate production levels, and establish conditions of employment to curtail “cutthroat competition.” In exchange for these exemptions, businesses agreed to provide reasonable wages and hours, end child labor, and allow workers the right to unionize. Participating businesses earned the right to display a placard with the NRA’s “Blue Eagle,” showing their cooperation in the effort to combat the Great Depression.

The programs of the First Hundred Days stabilized the American economy and ushered in a robust though imperfect recovery. GDP climbed once more, but even as output increased, unemployment remained stubbornly high. Though the unemployment rate dipped from its high in 1933, when Roosevelt was inaugurated, vast numbers remained out of work. If the economy could not put people back to work, the New Deal would try. The Civil Works Administration (CWA) and, later, the Works Progress Administration (WPA) put unemployed men and women to work on projects designed and proposed by local governments. The Public Works Administration (PWA) provided grants-in-aid to local governments for large infrastructure projects, such as bridges, tunnels, schoolhouses, libraries, and America’s first federal public housing projects. Together, they provided not only tangible projects of immense public good, but employment for millions. The New Deal was reshaping much of the nation.

 

VIII. The New Deal in the South


The accusation of rape brought against the so-called Scottsboro Boys, pictured with their attorney in 1932, generated controversy across the country. “The Scottsboro Boys, with attorney Samuel Leibowitz, under guard by the state militia, 1932.” Wikipedia.

The impact of initial New Deal legislation was readily apparent in the South, a region of perpetual poverty especially plagued by the Depression. In 1929 the average per capita income in the American Southeast was $365, the lowest in the nation. Southern farmers averaged $183 per year at a time when farmers on the West Coast made more than four times that. Moreover, they were trapped in the production of cotton and corn, crops that depleted the soil and returned ever-diminishing profits. Despite the ceaseless efforts of civic boosters, what little industry the South had remained low-wage, low-skilled, and primarily extractive. Southern workers made significantly less than their counterparts elsewhere in the country: southern textile workers earned just 75% of the non-southern wage, iron and steel workers 60%, and lumber workers a paltry 45%. At the time of the crash, southerners were already underpaid, underfed, and undereducated.

Major New Deal programs were designed with the South in mind. FDR hoped that by drastically decreasing the amount of land devoted to cotton, the AAA would arrest the crop’s long-plummeting price. Farmers plowed up existing crops and left fields fallow, and the market price did rise. But in an agricultural world of landowners and landless farmworkers (such as tenants and sharecroppers), the benefits of the AAA bypassed the southerners who needed them most. The government relied on landowners and local organizations to distribute money fairly to those most affected by production limits, but many owners simply kicked tenants and croppers off their land, kept the subsidy checks for keeping those acres fallow, and reinvested the profits in mechanical farming equipment that further suppressed the demand for labor. Instead of making farming profitable again, the AAA pushed landless southern farmworkers off the land.

But Roosevelt’s assault on southern poverty took many forms. Southern industrial practices attracted much attention. The NRA encouraged higher wages and better conditions. It began to suppress the rampant use of child labor in southern mills and, for the first time, provided federal protection for unionized workers all across the country. Those gains were eventually solidified in the 1938 Fair Labor Standards Act, which set a national minimum wage of $0.25 an hour (eventually rising to $0.40). The minimum wage disproportionately affected low-paid southern workers and brought southern wages closer to northern wages.

The president’s support for unionization further impacted the South. Southern industrialists had proven themselves ardent foes of unionization, particularly in the infamous southern textile mills. In 1934, when workers at textile mills across the southern Piedmont struck over low wages and long hours, owners turned to local and state authorities to quash workers’ groups, even as they recruited thousands of strikebreakers from the many displaced farmers crowding into industrial centers in search of work. But in 1935 the National Labor Relations Act, also known as the Wagner Act, guaranteed the rights of most workers to unionize and bargain collectively. And so unionized workers, backed by the support of the federal government and determined to enforce the reforms of the New Deal, pushed for higher wages, shorter hours, and better conditions. With growing success, union members came to see Roosevelt as a protector of workers’ rights. Or, as one union leader put it, an “agent of God.”

Perhaps the most successful New Deal program in the South was the Tennessee Valley Authority (TVA), an ambitious program that used hydroelectric power, agricultural and industrial reform, flood control, economic development, education, and healthcare to radically remake the impoverished watershed region of the Tennessee River. Though its area of focus was limited, Roosevelt’s TVA sought to “make a different type of citizen” out of the area’s penniless residents. The TVA built a series of hydroelectric dams to control flooding and distribute electricity to otherwise non-electrified areas at government-subsidized rates. Agents of the TVA met with residents and offered training and general education classes to improve agricultural practices and exploit new job opportunities. The TVA encapsulated Roosevelt’s vision for uplifting the South and integrating it into the larger national economy.

Roosevelt initially courted conservative southern Democrats to ensure the legislative success of the New Deal, all but guaranteeing that the racial and economic inequalities of the region remained intact, but by the end of his second term he had won the support of enough non-southern voters that he felt confident in confronting some of the region’s most glaring inequalities. Nowhere was this more apparent than in his endorsement of a report, formulated by a group of progressive southern New Dealers, entitled “A Report on Economic Conditions in the South.” The pamphlet denounced the hardships wrought by the southern economy (in his introductory letter to the report, Roosevelt called the region “the Nation’s No. 1 economic problem”) and blasted reactionary southern anti-New Dealers. He suggested that the New Deal could save the South and thereby spur a nationwide recovery. The report was among the first broadsides in Roosevelt’s coming reelection campaign that addressed the inequalities that continued to mark southern and national life.

 

IX. The New Deal in Appalachia

The New Deal also addressed another poverty-stricken region, Appalachia, the mountain-and-valley communities that roughly follow the Appalachian Mountain Range from southern New York to the foothills of northern Georgia, Alabama, and Mississippi. Appalachia’s abundant natural resources, including timber and coal, were in high demand during the country’s post-Civil War industrial expansion, but the region’s extractive industries shipped both resources and profits to far-off industrial centers, depressing the coal-producing areas even earlier than the rest of the country. By the mid-1930s, with the Depression suppressing demand, many residents were stranded in small, isolated communities whose few employers stood on the verge of collapse. Relief workers from the Federal Emergency Relief Administration (FERA) reported serious shortages of medical care, adequate shelter, clothing, and food. Rampant illnesses, including typhus, tuberculosis, pneumonia, and venereal disease, as well as childhood malnutrition, further crippled Appalachia.

Several New Deal programs targeted the region. Under the auspices of the NIRA, Roosevelt established the Division of Subsistence Homesteads (DSH) within the Department of the Interior to give impoverished families an opportunity to relocate “back to the land”: the DSH established 34 homestead communities nationwide, including in the Appalachian regions of Alabama, Pennsylvania, Tennessee, and West Virginia. The CCC contributed to projects throughout Appalachia, including the Blue Ridge Parkway in North Carolina and Virginia, reforestation of the Chattahoochee National Forest in Georgia, and state parks such as Pine Mountain Resort State Park in Kentucky. The TVA’s efforts aided communities in Tennessee and North Carolina, and the Rural Electrification Administration (REA) brought electricity to 288,000 rural households.

 

X. Voices of Protest

Huey Long was a dynamic, indomitable force (with a wild speech-giving style, seen in the photograph) who campaigned tirelessly for the common man, demanding that Americans “Share Our Wealth.” Photograph of Huey P. Long, c. 1933-35. Wikimedia, http://commons.wikimedia.org/wiki/File:HueyPLongGesture.jpg. “Share Our Wealth” button, c. 1930s. Authentic History, http://www.authentichistory.com/1930-1939/2-fdr/2-reception/Huey_Long-Share_Our_Wealth_Button.jpg.


Despite the unprecedented actions taken in his first year in office, Roosevelt’s initial relief programs could often be quite conservative. He had usually been careful to work within the bounds of presidential authority and congressional cooperation. And, unlike leaders in Europe, where several nations had turned toward state-run economies and even fascism and socialism, Roosevelt demonstrated a clear reluctance to radically tinker with the nation’s foundational economic and social structures. Many high-profile critics attacked Roosevelt for not going far enough, and, beginning in 1934, Roosevelt and his advisors were forced to respond.

Senator Huey Long, a flamboyant Democrat from Louisiana, was perhaps the most important “voice of protest.” Long’s populist rhetoric appealed to those who saw deeply rooted but easily addressed injustice in the nation’s economic system. Long proposed a “Share Our Wealth” program in which the federal government would confiscate the assets of the extremely wealthy and redistribute them to the less well-off through guaranteed minimum incomes. “How many men ever went to a barbecue and would let one man take off the table what’s intended for nine-tenths of the people to eat?” he asked. Over 27,000 “Share Our Wealth” clubs sprang up across the nation as Long traveled the country explaining his program to crowds of impoverished and unemployed Americans. Long envisioned the movement as a stepping stone to the presidency, but his crusade ended in September 1935 when he was assassinated in the Louisiana state capitol. Even in death, however, Long convinced Roosevelt to more stridently attack the Depression and American inequality.

But Huey Long was not alone in his critique of Roosevelt. Francis Townsend, a former doctor and public health official from California, promoted a plan for old-age pensions which, he argued, would provide economic security for the elderly (who disproportionately suffered poverty) and encourage recovery by allowing older workers to retire from the work force. Reverend Charles Coughlin, a priest and radio personality from the suburbs of Detroit, Michigan, meanwhile gained a following by making vitriolic, anti-Semitic attacks on Roosevelt for cooperating with banks and financiers, and by proposing instead a new system of “social justice” built on a more state-driven economy. Like Long, both Townsend and Coughlin built substantial public followings.

If many Americans urged Roosevelt to go further in addressing the economic crisis, the president faced even greater opposition from conservative politicians and business leaders. By late 1934, business-friendly Republicans were complaining ever more loudly of Roosevelt’s willingness to regulate industry and use federal spending for public works and employment programs. In the South, Democrats who had originally supported the president grew increasingly hostile towards programs that challenged the region’s political, economic, and social status quo. Yet the greatest opposition came from the Supreme Court, a conservative body filled with the appointees of long years of Republican presidencies.

By early 1935 the Court was reviewing programs of the New Deal. On May 27, a day Roosevelt’s supporters called “Black Monday,” the justices struck down one of the president’s signature reforms: in a case revolving around poultry processing, the Court unanimously declared the NRA unconstitutional. In early 1936, the AAA fell.

 

XI. The “Second” New Deal (1935-1936)

Facing reelection and rising opposition from both the left and the right, Roosevelt decided to act. The New Deal adopted a more radical, aggressive approach to poverty, the “Second” New Deal. In 1935, hoping to reconstitute some of the protections afforded workers in the now-defunct NRA, Roosevelt worked with Congress to pass the National Labor Relations Act (known as the Wagner Act for its chief sponsor, New York Senator Robert Wagner), offering federal legal protection, for the first time, for workers to organize unions. Three years later, Congress passed the Fair Labor Standards Act, creating the modern minimum wage. The Second New Deal also oversaw the restoration of a highly progressive federal income tax, mandated new reporting requirements for publicly traded companies, refinanced long-term home mortgages for struggling homeowners, and attempted rural reconstruction projects to bring farm incomes in line with urban ones.

The labor protections extended by Roosevelt’s New Deal were revolutionary. In northern industrial cities, workers responded to worsening conditions by banding together and demanding support for workers’ rights. In 1935, the head of the United Mine Workers, John L. Lewis, took the lead in forming a new national workers’ organization, the Congress of Industrial Organizations, breaking with the more conservative, craft-oriented AFL. The CIO won a major victory in 1937 when affiliated members in the United Auto Workers struck for recognition and better pay and hours at a General Motors plant in Flint, Michigan. In the most famous instance of a “sit-down” strike, the workers remained in the building until management agreed to negotiate. GM recognized the UAW, and the “sit-down” strike became a new weapon in the fight for workers’ rights. Across the country, unions and workers took advantage of the New Deal’s protections to organize and win major concessions from employers.

Unionization met with fierce opposition from owners and managers, particularly in the “Manufacturing Belt” of the Mid-West. Sheldon Dick, photographer, “Strikers guarding window entrance to Fisher body plant number three. Flint, Michigan,” January/February 1937. Library of Congress, http://www.loc.gov/pictures/item/fsa2000021503/PP/.


The signature piece of Roosevelt’s Second New Deal came the same year, in 1935. The Social Security Act provided for old-age pensions, unemployment insurance, and economic aid, based on means, to assist both the elderly and dependent children. The president was careful to mitigate some of the criticism of what was, at the time and in the American context, a revolutionary concept. He specifically insisted that Social Security be financed by payroll taxes rather than by general government revenues; “No dole,” Roosevelt said repeatedly, “mustn’t have a dole.” He thereby helped separate Social Security from the stigma of being an undeserved “welfare” entitlement. The strategy spared the program from such suspicions, and Social Security became the centerpiece of the modern American social welfare state. It was the culmination of a long progressive push for government-sponsored social welfare, an answer to the calls of Roosevelt’s opponents on the Left for reform, a response to the intractable poverty among America’s neediest groups, and a recognition that the government would now assume some responsibility for the economic well-being of its citizens. But for all of its groundbreaking provisions, the Act, and the larger New Deal as well, excluded large swaths of the American population.

 

XII. Equal Rights and the New Deal

The Great Depression was particularly tough for nonwhite Americans. As an African American pensioner told interviewer Studs Terkel, “The Negro was born in depression. It didn’t mean too much to him. The Great American Depression … only became official when it hit the white man.” Black workers were generally the last hired when businesses expanded production and the first fired when businesses experienced downturns. In 1932, with the national unemployment average hovering around 25%, black unemployment reached as high as 50%, while even those black workers who kept their jobs saw their already low wages cut dramatically.

Blacks faced discrimination everywhere, but suffered especially severe legal inequality in the Jim Crow South. In 1931, for instance, a group of nine young men riding the rails between Chattanooga and Memphis, Tennessee, were pulled from the train near Scottsboro, Alabama, and charged with assaulting two white women. Despite clear evidence that the assault had not occurred, and despite one of the women later recanting, the young men endured a series of sham trials in which all but one were sentenced to death. Only the communist-oriented International Labor Defense (ILD) came to the aid of the “Scottsboro Boys,” who soon became a national symbol of continuing racial prejudice in America and a rallying point for civil rights-minded Americans. In appeals, the ILD successfully challenged the convictions, and the death sentences were either commuted or reversed, although the last of the accused did not receive parole until 1946.

Despite a concerted effort to appoint black advisors to some New Deal programs, Franklin Roosevelt did little to directly address the difficulties black communities faced. To do so openly would provoke southern Democrats and put his New Deal coalition at risk. Roosevelt not only rejected such proposals as abolishing the poll tax and declaring lynching a federal crime, he refused to specifically target African American needs in any of his larger relief and reform packages. As he explained to the national secretary of the NAACP, “I just can’t take that risk.”

In fact, many New Deal programs made hard times more difficult. When the codes of the NRA set new pay scales, they usually took into account regional differentiation and historical data. In the South, where African Americans had long suffered unequal pay, the new codes simply perpetuated that inequality. The codes also exempted those involved in farm work and domestic labor, the occupations of a majority of southern black men and women. The AAA was equally problematic, as owners displaced black tenants and sharecroppers, many of whom were forced to return to their farms as low-paid day laborers or to migrate to cities looking for wage work.

Perhaps the most notorious failure of the New Deal to aid African Americans came with the passage of the Social Security Act. Southern politicians chafed at the prospect of African Americans benefiting from federally sponsored social welfare, afraid that economic security would allow black southerners to escape the cycle of poverty that kept them tied to the land as cheap, exploitable farm laborers. The Jackson (Mississippi) Daily News callously warned that “The average Mississippian can’t imagine himself chipping in to pay pensions for able-bodied Negroes to sit around in idleness … while cotton and corn crops are crying for workers.” Roosevelt agreed to remove domestic workers and farm laborers from the provisions of the bill, excluding many African Americans, already laboring under the strictures of legal racial discrimination, from the benefits of an expanding economic safety net.

Women, too, failed to receive the full benefits of New Deal programs. On one hand, Roosevelt included women in key positions within his administration, including the first female Cabinet secretary, Frances Perkins, and a prominently placed African American advisor in the National Youth Administration, Mary McLeod Bethune. First Lady Eleanor Roosevelt was a key advisor to the president and became a major voice for economic and racial justice. But many New Deal programs were built upon the assumption that men would serve as “breadwinners” and women as mothers, homemakers, and consumers. New Deal programs aimed to help both, but usually by reinforcing such gendered assumptions, making it difficult for women to attain economic autonomy. New Deal social welfare programs tended to funnel women into means-tested, state-administered relief programs while reserving “entitlement” benefits for male workers, creating a kind of two-tiered social welfare state. And so, despite great advances, the New Deal failed to challenge core inequalities that continued to mark life in the United States.

 

XIII. The End of the New Deal (1937-1939)

By 1936 Roosevelt and his New Deal had won record popularity. In November Roosevelt annihilated his Republican challenger, Governor Alf Landon of Kansas, who lost in every state save Maine and Vermont. The Great Depression had certainly not ended, but it appeared to many to be beating a slow yet steady retreat, and Roosevelt, now safely re-elected, appeared ready to take advantage of both his popularity and the improving economic climate to press for even more dramatic changes. But conservative barriers continued to limit the power of his popular support. The Supreme Court, for instance, continued to gut many of his programs.

In 1937, concerned that the Court might overturn Social Security in an upcoming case, Roosevelt called for legislation allowing him to expand the Court by appointing a new, younger justice for every sitting member over the age of 70. Roosevelt argued that the measure would speed up the Court’s ability to handle a growing backlog of cases; however, his “court-packing scheme,” as opponents termed it, was clearly designed to allow the president to appoint up to six friendly, pro-New Deal justices to drown out the influence of old-time conservatives on the Court. Roosevelt’s “scheme” riled opposition and did not become law, but the chastened Court upheld Social Security and other pieces of New Deal legislation thereafter. Moreover, Roosevelt was slowly able to appoint more amenable justices as conservatives died or retired. Still, the “court-packing scheme” damaged the Roosevelt administration, and opposition to the New Deal began to emerge and coalesce.

Compounding his problems, Roosevelt and his advisors made a costly economic misstep. Believing the United States had turned a corner, Roosevelt cut spending in 1937. The American economy plunged nearly to the depths of 1932–1933. Roosevelt reversed course and, adopting the approach popularized by the English economist John Maynard Keynes, hoped that countercyclical, “compensatory” spending would pull the country out of the recession, even at the expense of a growing budget deficit. It was perhaps too late. The “Roosevelt Recession” of 1937 became fodder for critics. Combined with the “court-packing scheme,” the recession allowed for significant gains by a “conservative coalition” of southern Democrats and Midwestern Republicans. By 1939, Roosevelt struggled to build congressional support for new reforms, let alone maintain existing agencies. Moreover, the growing threat of war in Europe stole the public’s attention and increasingly dominated Roosevelt’s interests. The New Deal slowly receded into the background, outshone by war.

 

XIV. The Legacy of the New Deal

By the end of the 1930s, Roosevelt and his Democratic congresses had presided over a transformation of the American government and a realignment in American party politics. Before World War I, the American national state, though powerful, had been a “government out of sight.” After the New Deal, Americans came to see the federal government as a potential ally in their daily struggles, whether finding work, securing a decent wage, getting a fair price for agricultural products, or organizing a union. Voter turnout in presidential elections jumped in 1932 and again in 1936, with most of these newly-mobilized voters forming a durable piece of the Democratic Party that would remain loyal well into the 1960s. Even as affluence returned with the American intervention in World War II, memories of the Depression continued to shape the outlook of two generations of Americans. Survivors of the Great Depression, one man would recall in the late 1960s, “are still riding with the ghost—the ghost of those days when things came hard.”

Historians debate when the New Deal “ended.” Some identify the Fair Labor Standards Act of 1938 as the last major New Deal measure. Others see wartime measures such as price and rent control and the G.I. Bill (which afforded New Deal-style social benefits to veterans) as species of New Deal legislation. Still others conceive of a “New Deal order,” a constellation of “ideas, public policies, and political alliances,” which, though changing, guided American politics from Roosevelt’s Hundred Days forward to Lyndon Johnson’s Great Society—and perhaps even beyond. Indeed, the New Deal’s legacy still remains, and its battle lines still shape American politics.

 

This chapter was edited by Matthew Downs, with content contributed by Dana Cochran, Matthew Downs, Benjamin Helwege, Elisa Minoff, Caitlin Verboon, and Mason Williams.

22. The Twenties

Al Jolson in The Jazz Singer, 1927.


*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

On a sunny day in early March of 1921, Warren G. Harding took the oath to become the twenty-ninth President of the United States. He had won a landslide election by promising a “return to normalcy.” “America’s present need is not heroics, but healing; not nostrums, but normalcy; not revolution, but restoration,” he had told campaign audiences, and in his inaugural address he declared, “Our supreme task is the resumption of our onward, normal way.” The nation still reeled from the shock of World War I, from the explosion of racial violence and political repression in 1919, and from a lingering “Red Scare” stoked by the Bolshevik Revolution in Russia.

More than 115,000 American soldiers had lost their lives in barely a year of fighting in Europe. Between 1918 and 1920, nearly 700,000 Americans died in a flu epidemic that hit nearly twenty percent of the American population. Waves of strikes hit soon after the war. Radicals bellowed. Anarchists and others sent more than thirty bombs through the mail on May 1, 1919. After war controls fell, the economy tanked and national unemployment hit twenty percent. Farmers’ bankruptcy rates, already egregious, now skyrocketed. Harding could hardly deliver the peace that he promised, but his message nevertheless resonated among a populace wracked by instability.

The 1920s would be anything but “normal.” The decade so reshaped American life that it came to be called by many names: the New Era, the Jazz Age, the Age of the Flapper, the Prosperity Decade, and, most commonly, the Roaring Twenties. The mass production and consumption of automobiles, household appliances, film, and radio fueled a new economy and new standards of living; new mass entertainment introduced talking films and jazz, while sexual and social mores loosened. Meanwhile, many Americans turned their backs on reform, denounced America’s shifting demographics, stifled immigration, retreated toward an “old time religion,” and revived the Ku Klux Klan, which gained millions of new members. Others, though, fought harder than ever for equal rights. Americans noted “the New Woman” and “the New Negro,” and the old immigrant communities that had escaped the quotas clung to their cultures and their native faiths. The 1920s were a decade of conflict and tension. And whatever the decade was, it was not “normalcy.”

 

II. Republican White House, 1921-1933

To deliver on his promises of stability and prosperity, Harding signed legislation to restore a high protective tariff and dismantled the last wartime controls over industry. Meanwhile, the vestiges of America’s involvement in the First World War, with its propaganda and its suspicion of anything less than “100 percent American,” pushed Congress to address fears of immigration and foreign populations. A sour postwar economy led elites to raise the specter of the Russian Revolution and sideline not just the various American socialist and anarchist organizations, but nearly all union activism. During the 1920s, the labor movement suffered a sharp decline in membership. Workers not only lost bargaining power, but also the support of courts, politicians, and, in large measure, the American public.

Harding’s presidency, though, would go down in history as among the most corrupt. To be sure, many of Harding’s cabinet appointees were individuals of true stature who answered to various American constituencies. Henry C. Wallace, the very vocal editor of Wallace’s Farmer and a well-known proponent of “scientific farming,” was made Secretary of Agriculture. Herbert Hoover, the popular head of the wartime Food Administration and a self-made millionaire, was made Secretary of Commerce. To satisfy business interests, the conservative businessman Andrew Mellon became Secretary of the Treasury. Mostly, however, it was the appointment of friends and close supporters, dubbed “the Ohio gang,” that led to trouble.

Harding’s administration suffered a tremendous setback when several officials conspired to lease government oil lands in Wyoming to private companies in exchange for cash. In the resulting Teapot Dome scandal (named after the nearby rock formation that resembled a teapot), Interior Secretary Albert Fall was eventually convicted of bribery and sent to jail, while Navy Secretary Edwin Denby was forced to resign. Harding took a vacation in the summer of 1923 so that he could think deeply about how to deal “with my God-damned friends.” It was his friends, and not his enemies, who kept him up walking the halls at night. But in August 1923, Harding died suddenly of a heart attack, and Vice President Calvin Coolidge ascended to the highest office in the land.

The son of a shopkeeper, Coolidge climbed the Republican ranks from city councilman to the Governor of Massachusetts. As president, Coolidge sought to remove the stain of scandal but he otherwise continued Harding’s economic approach, refusing to take actions in defense of workers or consumers against American business. “The chief business of the American people,” the new President stated, “is business.” One observer called Coolidge’s policy “active inactivity,” but Coolidge was not afraid of supporting business interests and wealthy Americans by lowering taxes or maintaining high tariff rates. Congress, for instance, had already begun to reduce taxes on the wealthy from wartime levels of sixty-six percent to twenty percent, which Coolidge championed.

While Coolidge supported business, other Americans continued their activism. The 1920s, for instance, represented a time of great activism among American women, who had won the vote with the passage of the 19th Amendment in 1920. Female voters, like their male counterparts, pursued many interests. Concerned about squalor, poverty, and domestic violence, women had already lent their efforts to prohibition, which went into effect under the Eighteenth Amendment in January 1920. After that point, alcohol could no longer be manufactured or sold. Other reformers urged government action to ameliorate high mortality rates among infants and children, to provide federal aid for education, and to ensure peace and disarmament. Some activists advocated protective legislation for women and children, while Alice Paul and the National Woman’s Party called for the elimination of all legal distinctions “on account of sex” through the proposed Equal Rights Amendment (ERA), which was introduced in Congress but failed to pass.

During the 1920s, the National Woman’s Party fought for the rights of women beyond suffrage, which they had secured through the 19th Amendment in 1920. They organized private events, like the tea party pictured here, and public campaigns, such as the introduction of the Equal Rights Amendment in Congress, as they continued the struggle for equality. “Reception tea at the National Womens [i.e., Woman’s] Party to Alice Brady, famous film star and one of the organizers of the party,” April 5, 1923. Library of Congress, http://www.loc.gov/pictures/item/91705244/.


National politics in the 1920s were dominated by the Republican Party, which held not only the presidency but both houses of Congress as well. In a note passed to American reporters, Coolidge announced his decision not to run in the presidential election of 1928. Republicans nominated Herbert Hoover, an orphan from Iowa who had graduated from Stanford, become wealthy as a mining engineer, and won a deserved reputation as a humanitarian for his relief efforts in famine-struck, war-torn Europe. Running against Hoover was Democrat Alfred E. Smith, the four-time governor of New York and the son of Irish immigrants. Smith was a part of the New York machine and favored workers’ protections while also opposing prohibition and immigration restrictions. Hoover focused on economic growth and prosperity. He had served as Secretary of Commerce under Harding and Coolidge and claimed credit for the sustained growth of the 1920s; in 1928 he declared that America had never been closer to eliminating poverty. Much of the election, however, centered on Smith’s religion: he was a Catholic. And not only was he a Catholic, he opposed Protestant America’s greatest political triumph, prohibition. Many Protestant ministers preached against Smith and warned that he would be enthralled to the Pope. Hoover won in a landslide. While Smith won handily in the nation’s largest cities, portending future political trends, he lost most of the rest of the country. Even several solidly Democratic southern states pulled the lever for a Republican for the first time since Reconstruction.

III. Culture of Consumption

“Change is in the very air Americans breathe, and consumer changes are the very bricks out of which we are building our new kind of civilization,” announced marketing expert and home economist Christine Frederick in her influential 1929 monograph, Selling Mrs. Consumer. The book, which was based on one of the earliest surveys of American buying habits, advised manufacturers and advertisers how to capture the purchasing power of women, who, according to Frederick, accounted for 90% of household expenditures. Aside from granting advertisers insight into the psychology of the “average” consumer, Frederick’s text captured the tremendous social and economic transformations that had been wrought over the course of her lifetime.

Indeed, the America of Frederick’s birth looked very different from the one she confronted in 1929. The consumer change she studied had resulted from the industrial expansion of the late-nineteenth and early-twentieth centuries. With the discovery of new energy sources and manufacturing technologies, industrial output flooded the market with a range of consumer products, from ready-to-wear clothing to convenience foods to home appliances. By the end of the nineteenth century, output had risen so dramatically that many contemporaries feared supply had outpaced demand and that the nation would soon face the devastating financial consequences of overproduction. American businessmen attempted to avoid this catastrophe by developing new merchandising and marketing strategies that transformed distribution and stimulated a new culture of consumer desire.

The department store stood at the center of this early consumer revolution. By the 1880s, several large dry goods houses had blossomed into modern retail department stores. These emporiums concentrated a broad array of goods under a single roof, allowing customers to purchase shirtwaists and gloves alongside toy trains and washbasins. To attract customers, department stores relied on more than variety. They also employed innovations in service (such as access to restaurants, writing rooms, and babysitting) and in spectacle (such as elaborately decorated store windows, fashion shows, and interior merchandise displays). Marshall Field & Co. was among the most successful of these ventures. Located on State Street in Chicago, the company pioneered many of these strategies, including establishing a tearoom that provided refreshment to the well-heeled women shoppers who comprised the store’s clientele. Reflecting on the success of Field’s marketing techniques, Thomas W. Goodspeed, an early trustee of the University of Chicago, wrote, “Perhaps the most notable of Mr. Field’s innovations was that he made a store in which it was a joy to buy.”

The joy of buying infected a growing number of Americans in the early twentieth century as the rise of mail-order catalogs, mass-circulation magazines, and national branding further stoked consumer desire. The automobile industry also fostered the new culture of consumption by promoting the use of credit. By 1927, more than sixty percent of American automobiles were sold on credit, and installment purchasing was made available for nearly every other large consumer purchase. Spurred by access to easy credit, consumer expenditures for household appliances, for example, grew by more than 120 percent between 1919 and 1929. Henry Ford’s assembly line, which advanced production strategies practiced within countless industries, brought automobiles within the reach of middle-income Americans and further drove the spirit of consumerism. By 1925, Ford’s factories were turning out a Model-T every ten seconds. The number of registered cars ballooned from just over nine million in 1920 to nearly twenty-seven million by the decade’s end. Americans owned more cars than Great Britain, Germany, France, and Italy combined. In the late 1920s, eighty percent of the world’s cars drove on American roads.

 

IV. Culture of Escape

As transformative as steam and iron had been in the previous century, gasoline and electricity—embodied most dramatically for many Americans in automobiles, film, and radio—propelled not only consumption but also the famed popular culture of the 1920s. “We wish to escape,” wrote Edgar Rice Burroughs, author of the Tarzan series, “the restrictions of manmade laws, and the inhibitions that society has placed upon us.” Burroughs authored a new Tarzan story nearly every year from 1914 until 1939. “We would each like to be Tarzan,” he said. “At least I would; I admit it.” Like many Americans in the 1920s, Burroughs sought to challenge and escape the constraints of a society that seemed more industrialized with each passing day.

Just like Burroughs, Americans escaped with great speed. Whether through the automobile, Hollywood’s latest films, jazz records produced on Tin Pan Alley, or the hours spent listening to radio broadcasts of Jack Dempsey’s prizefights, the public wrapped itself in popular culture. One observer estimated that Americans belted out the silly musical hit “Yes, We Have No Bananas” more than “The Star Spangled Banner” and all the hymns in all the hymnals combined.

As the automobile became more popular and more reliable, more people traveled more frequently and attempted greater distances. Women increasingly drove themselves to their own activities as well as those of their children. Vacationing Americans sped to Florida to escape northern winters. Young men and women fled the supervision of courtship, exchanging the staid parlor couch for sexual exploration in the backseat of a sedan. In order to serve and capture the growing number of drivers, Americans erected gas stations, diners, motels, and billboards along the roadside. Automobiles themselves became objects of entertainment: nearly one hundred thousand people gathered to watch drivers compete for the $50,000 prize of the Indianapolis 500.

The automobile changed American life forever. Rampant consumerism, the desire to travel, and the affordability of cars allowed greater numbers of Americans to purchase automobiles. This was possible only through innovations in automobile design and manufacturing led by Henry Ford in Detroit, Michigan. Ford was a lifelong inventor, creating his very first automobile – the quadricycle – in his home garage. From The Truth About Henry Ford by Sarah T. Bushnell, 1922. Wikimedia, http://commons.wikimedia.org/wiki/File:Mr_and_Mrs_Henry_Ford_in_his_first_car.jpg.


Meanwhile, the United States dominated the global film industry. By 1930, as movie-making became more expensive, a handful of film companies took control of the industry. Immigrants, mostly of Jewish heritage from Central and Eastern Europe, originally “invented Hollywood” because most turn-of-the-century middle and upper class Americans viewed cinema as lower-class entertainment. After their parents emigrated from Poland in 1876, Harry, Albert, Sam, and Jack Warner (who were given the name when an Ellis Island official could not understand their surname) founded Warner Bros. in 1918. Universal, Paramount, Columbia, and MGM were all founded by or led by Jewish executives. Aware of their social status as outsiders, these immigrants (or sons of immigrants) purposefully produced films that portrayed American values of opportunity, democracy, and freedom.

Not content with distributing thirty-minute films in nickelodeons, film moguls produced longer, higher-quality films and showed them in palatial theaters that attracted those who had previously shunned the film industry. But as filmmakers captured the middle and upper classes, they maintained working-class moviegoers by blending traditional and modern values. Cecil B. DeMille’s 1923 epic The Ten Commandments depicted orgiastic revelry, for instance, while still managing to celebrate a biblical story. But what good was a silver screen in a dingy theater? Moguls and entrepreneurs soon constructed picture palaces. Samuel Rothafel’s Roxy Theater in New York held more than six thousand patrons who could be escorted by a uniformed usher past gardens and statues to their cushioned seat. In order to show The Jazz Singer (1927), the first movie with synchronized words and pictures, the Warners spent half a million to equip two theaters. “Sound is a passing fancy,” one MGM producer told his wife, but Warner Bros.’ assets, which increased from just $5,000,000 in 1925 to $230,000,000 in 1930, tell a different story.

Americans fell in love with the movies. Whether it was the surroundings, the sound, or the production budgets, weekly movie attendance skyrocketed from sixteen million in 1912 to forty million in the early 1920s. Hungarian immigrant William Fox, founder of the Fox Film Corporation, declared that “the motion picture is a distinctly American institution” because “the rich rub elbows with the poor” in movie theaters. With no seating restrictions, the one-price admission was accessible for nearly all Americans (African Americans, however, were either excluded or segregated). Women represented more than sixty percent of moviegoers, packing theaters to see Mary Pickford, nicknamed “America’s Sweetheart,” who was earning one million dollars a year by 1920 through a combination of film and endorsement contracts. Pickford and other female stars popularized the “flapper,” a woman who favored short skirts, makeup, and cigarettes.

Mary Pickford’s film personas led the glamorous and lavish lifestyle that female movie-goers of the 1920s desired so much. Mary Pickford, 1920. Library of Congress, http://www.loc.gov/pictures/item/2003666664.


As Americans went to the movies more and more, at home they had the radio. Italian scientist Guglielmo Marconi transmitted the first transatlantic wireless (radio) message in 1901, but radios in the home did not become available until around 1920, when they boomed across the country. Around half of American homes contained a radio by 1930. Radio stations brought entertainment directly into the living room through the sale of advertisements and sponsorships, from The Maxwell House Hour to the Lucky Strike Orchestra. Soap companies sponsored daytime dramas so frequently that an entire genre—the “soap opera”—was born, providing housewives with audio adventures that stood in stark contrast to common chores. Though radio stations were often under the control of corporations like the National Broadcasting Company (NBC) or the Columbia Broadcasting System (CBS), radio programs were less constrained by traditional boundaries in order to capture as wide an audience as possible, spreading popular culture on a national level.

Radio exposed Americans to a broad array of music. Jazz, a uniquely American musical style popularized by the African-American community in New Orleans, spread primarily through radio stations and records. The New York Times had ridiculed jazz as “savage” because of its racial heritage, but the music represented cultural independence to others. As Harlem-based musician William Dixon put it, “It did seem, to a little boy, that . . . white people really owned everything. But that wasn’t entirely true. They didn’t own the music that I played.” The fast-paced and spontaneity-laced tunes invited the listener to dance along. “When a good orchestra plays a ‘rag,’” dance instructor Vernon Castle recalled, “One has simply got to move.” Jazz became a national sensation, played and heard by whites and blacks both. Jewish Lithuanian-born singer Al Jolson—whose biography inspired The Jazz Singer and who played the film’s titular character—became the most popular singer in America.

Babe Ruth’s incredible talent attracted widespread attention to the sport of baseball, helping it become America’s favorite pastime. Ruth’s propensity to shatter records with the swing of his bat made him a national hero during a period when defying conventions was the popular thing to do. “[Babe Ruth, full-length portrait, standing, facing slightly left, in baseball uniform, holding baseball bat],” c. 1920. Library of Congress, http://www.loc.gov/pictures/item/92507380/.


The 1920s also witnessed the maturation of professional sports. Play-by-play radio broadcasts of major collegiate and professional sporting events marked a new era for sports, despite the institutionalization of racial segregation in most of them. Suddenly, Jack Dempsey’s left crosses and right uppercuts could almost be felt in homes across the United States. Dempsey, who held the heavyweight championship for most of the decade, drew million-dollar gates and inaugurated “Dempseymania” in newspapers across the country. Red Grange, who carried the football with a similar recklessness, helped to popularize professional football, which was then in the shadow of the college game. Grange left the University of Illinois before graduating to join the Chicago Bears in 1925. “There had never been such evidence of public interest since our professional league began,” recalled Bears owner George Halas of Grange’s arrival.

Perhaps no sports figure left a bigger mark than did Babe Ruth. Born George Herman Ruth, the “Sultan of Swat” grew up in an orphanage in Baltimore’s slums. Ruth’s emergence onto the national scene was much needed, as the baseball world had been rocked by the so-called Black Sox scandal, in which eight players allegedly agreed to throw the 1919 World Series. Ruth hit fifty-four home runs in 1920, more than any other American League team hit that season. Baseball writers called Ruth a superman, and more Americans could recognize Ruth than could then-president Warren G. Harding.

After an era of destruction and doubt brought about by the First World War, Americans craved heroes who seemed to defy convention and break boundaries. Dempsey, Grange, and Ruth dominated their respective sports, but only Charles Lindbergh conquered the sky. On May 21, 1927, Lindbergh concluded the first-ever nonstop solo flight from New York to Paris. Armed with only a few sandwiches, some bottles of water, paper maps, and a flashlight, Lindbergh successfully navigated the Atlantic Ocean in thirty-three hours. Some historians have dubbed Lindbergh the “hero of the decade,” not only for his transatlantic journey but because he helped to restore the faith of many Americans in individual effort and technological advancement. To a world devastated in war by machine guns, submarines, and chemical weapons, Lindbergh’s flight demonstrated that technology could inspire and accomplish great things. Outlook Magazine called Lindbergh “the heir of all that we like to think is the best in America.”

The decade’s popular culture seemed to revolve around escape. Coney Island in New York offered new amusements for young and old. Americans drove their sedans to massive theaters to enjoy major motion pictures. Radio towers broadcast the bold new sound of jazz, the adventures of soap operas, and the feats of amazing athletes. Dempsey and Grange seemed bigger, stronger, and faster than any who dared to challenge them. Babe Ruth smashed home runs out of ballparks across the country. And Lindbergh escaped earth’s gravity and crossed an entire ocean. Neither Dempsey nor Ruth nor Lindbergh made Americans forget the horrors of the First World War and the chaos that followed, but they made it seem as if the future would be that much brighter.

V. “The New Woman”

This “new breed” of women – known as the flapper – went against the gender proscriptions of the era, bobbing their hair, wearing short dresses, listening to jazz, and flouting social and sexual norms. While liberating in many ways, these behaviors also reinforced stereotypes of female carelessness and obsessive consumerism that would continue throughout the twentieth century. Bain News Service, “Louise Brooks,” undated. Library of Congress, http://www.loc.gov/pictures/item/ggb2006007866/.

The rising emphasis on spending and accumulation nurtured a national ethos of materialism and individual pleasure. These impulses were embodied in the figure of the flapper, whose bobbed hair, short skirts, makeup, cigarettes, and carefree spirit captured the attention of American novelists such as F. Scott Fitzgerald and Sinclair Lewis. Rejecting the old Victorian values of desexualized modesty and self-restraint, young “flappers” seized opportunities for the public coed pleasures offered by new commercial leisure institutions, such as dance halls, cabarets, and nickelodeons, not to mention the illicit blind tigers and speakeasies spawned by Prohibition. In so doing, young American women helped to usher in a new morality that permitted women greater independence, freedom of movement, and access to the delights of urban living. In the words of psychologist G. Stanley Hall, “She was out to see the world and, incidentally, be seen of it.”

Such sentiments were repeated in an oft-cited advertisement in a 1930 edition of the Chicago Tribune: “Today’s woman gets what she wants. The vote. Slim sheaths of silk to replace voluminous petticoats. Glassware in sapphire blue or glowing amber. The right to a career. Soap to match her bathroom’s color scheme.” As with so much else in the 1920s, however, sex and gender were in many ways a study in contradictions. It was the decade of the “New Woman,” and one in which only 10% of married women worked outside the home. It was a decade in which new technologies decreased time requirements for household chores, and one in which standards of cleanliness and order in the home rose to often impossible levels. It was a decade in which women would, finally, have the opportunity to fully exercise their right to vote, and one in which the often thinly bound women’s coalitions that had won that victory splintered into various causes. Finally, it was a decade in which images such as the “flapper” gave women new modes of representing femininity, and one in which such representations were often inaccessible to women of certain races, ages, and socio-economic classes.

Women undoubtedly gained much in the 1920s. There was a profound and keenly felt cultural shift which, for many women, meant increased opportunity to work outside the home. The number of professional women, for example, significantly rose in the decade. But limits still existed, even for professional women. Occupations such as law and medicine remained overwhelmingly “male”: the majority of women professionals were in “feminized” professions such as teaching and nursing. And even within these fields, it was difficult for women to rise to leadership positions.

Further, it is crucial not to over-generalize the experience of all women based on the experiences of a much-commented-upon subset of the population. A woman’s race, class, ethnicity, and marital status all had an impact on both the likelihood that she worked outside the home and the types of opportunities available to her. While there were exceptions, for many minority women work outside the home was less a cultural statement than a financial necessity (or both), and physically demanding, low-paying domestic service work continued to be the most common job type. Young, working-class white women were joining the workforce more frequently, too, but often in order to help support their struggling parents.

For young, middle-class, white women—those most likely to fit the image of the carefree flapper—the most common workplace was the office. These predominantly single women increasingly became clerks, jobs that had been primarily “male” earlier in the century. But here, too, there was a clear ceiling. While entry-level clerk jobs became increasingly feminized, jobs at a higher, more lucrative level remained dominated by men. Further, rather than changing the culture of the workplace, the entrance of women into the lower-level jobs primarily changed the coding of the jobs themselves. Such positions simply became “women’s work.”

The frivolity, decadence, and obliviousness of the 1920s was embodied in the image of the flapper, the stereotyped carefree and indulgent woman of the Roaring Twenties depicted by Russell Patterson’s drawing. Russell Patterson, artist, “Where there’s smoke there’s fire,” c. 1920s. Library of Congress, http://www.loc.gov/pictures/item/2009616115/.

Finally, as these same women grew older and married, social changes became even subtler. Married women were, for the most part, expected to remain in the domestic sphere. And while new patterns of consumption gave them more power and, arguably, more autonomy, new household technologies and philosophies of marriage and child-rearing increased expectations, further tying these women to the home—a paradox that becomes clear in advertisements such as the one in the Chicago Tribune. Of course, the number of women in the workplace cannot exclusively measure changes in sex and gender norms. Attitudes towards sex, for example, continued to change in the 1920s, as well, a process that had begun decades before. This, too, had significantly different impacts on different social groups. But for many women—particularly young, college-educated white women—an attempt to rebel against what they saw as a repressive “Victorian” notion of sexuality led to an increase in premarital sexual activity strong enough that it became, in the words of one historian, “almost a matter of conformity.”

In the homosexual community, meanwhile, a vibrant gay culture grew, especially in urban centers such as New York. While gay males had to contend with increased policing of the gay lifestyle (especially later in the decade), in general they lived more openly in New York in the 1920s than they would be able to for many decades following World War II. At the same time, for many lesbians in the decade, the increased sexualization of women brought new scrutiny to same-sex female relationships previously dismissed as harmless.

Ultimately, the most enduring symbol of the changing notions of gender in the 1920s remains the flapper. And indeed, that image was a “new” available representation of womanhood in the 1920s. But it is just that: a representation of womanhood of the 1920s. There were many women in the decade of differing races, classes, ethnicities, and experiences, just as there were many men with different experiences. For some women, the 1920s were a time of reorganization, new representations and new opportunities. For others, it was a decade of confusion, contradiction, new pressures and struggles new and old.

 

VI. “The New Negro”

Just as cultural limits loosened across the nation, the 1920s represented a period of serious self-reflection among African Americans, most especially those in northern ghettos. New York City was a popular destination of American blacks during the Great Migration. The city’s black population grew 257%, from 91,709 in 1910 to 327,706 by 1930 (the white population grew only 20%). Moreover, by 1930, some 98,620 foreign-born blacks had migrated to the U.S. Nearly half made their home in Manhattan’s Harlem district.

Harlem originally lay between Fifth Avenue and Eighth Avenue and 130th Street and 145th Street. By 1930, the district had expanded to 155th Street and was home to some 164,000 people, most of them African Americans. Continuous relocation to “the greatest Negro City in the world” exacerbated problems with crime, health, housing, and unemployment. Nevertheless, it importantly brought together a mass of black people energized by the population’s World War I military service, the urban environment, and, for many, ideas of Pan-Africanism or Garveyism. Out of the area’s cultural ferment emerged the Harlem Renaissance, or what was then termed the “New Negro Movement.” While this stirring of self-consciousness and racial pride was not confined to Harlem, the district was truly, as James Weldon Johnson described it, “The Culture Capital.” The Harlem Renaissance became a key component in African Americans’ long history of cultural and intellectual achievements.

Alain Locke did not coin the term “New Negro,” but he did much to popularize it. In his 1925 anthology The New Negro, Locke proclaimed that the generation of subservience was no more—“we are achieving something like a spiritual emancipation.” Bringing together writings by men and women, young and old, black and white, Locke produced an anthology that was of African Americans rather than simply about them. The book joined many others. Popular Harlem Renaissance writers published some twenty-six novels, ten volumes of poetry, and countless short stories between 1922 and 1935. Alongside the well-known Langston Hughes and Claude McKay, women writers like Jessie Redmon Fauset and Zora Neale Hurston published nearly one-third of these novels. While themes varied, the literature frequently explored and countered pervading stereotypes and forms of American racial prejudice.

The Harlem Renaissance was also manifested in theatre, art, and music. For the first time, Broadway presented black actors in serious roles. The 1924 production Dixie to Broadway was the first all-black show with mainstream showings. In art, Meta Vaux Warrick Fuller, Aaron Douglas, and Palmer Hayden showcased black cultural heritage and captured the population’s current experience. In music, jazz rocketed in popularity. Eager to hear “real jazz,” whites journeyed to Harlem’s Cotton Club and Smalls. Alongside Greenwich Village, Harlem’s nightclubs and speakeasies (illicit venues where alcohol was publicly consumed during Prohibition) offered spaces where sexual freedom and gay life thrived. Unfortunately, while headliners like Duke Ellington were hired to entertain at Harlem’s venues, the surrounding black community was usually excluded. Furthermore, black performers were often barred from the restrooms and relegated to entering through the service door. As the Renaissance faded to a close, several Harlem Renaissance artists went on to produce important works, indicating that this movement was but one component in African Americans’ long history of cultural and intellectual achievements.

Marcus Garvey inspired black American activists disappointed with the lack of progress since emancipation to create a world-wide community to fight injustice. One of the many forms of social activism in the 1920s, Garveyism was seen by some as too radical to engender any real change. Yet Garveyism formed a substantial following, and was a major stimulus for later black nationalistic movements like the Black Panthers. Photograph of Marcus Garvey, August 5, 1924. Library of Congress, http://www.loc.gov/pictures/item/2003653533/.

The explosion of African American self-expression found multiple outlets in politics. In the 1910s and 20s, perhaps no one so attracted disaffected black activists as Marcus Garvey. Garvey was a Jamaican publisher and labor organizer who arrived in New York City in 1916. Within just a few years of his arrival, he built the largest black nationalist organization in the world, the Universal Negro Improvement Association (UNIA). Inspired by Pan-Africanism and Booker T. Washington’s model of industrial education, and critical of what he saw as DuBois’s elitist strategies in service of black elites, Garvey sought to promote racial pride, encourage black economic independence, and root out racial oppression in Africa and the Diaspora.

Headquartered in Harlem, the UNIA published a newspaper, Negro World, and organized elaborate parades in which members, “Garveyites,” dressed in ornate, militaristic regalia and marched down city streets. The organization criticized the slow pace of the judicial focus of the National Association for the Advancement of Colored People (NAACP), as well as this organization’s acceptance of memberships and funds from whites. “For the Negro to depend on the ballot and his industrial progress alone,” Garvey opined, “will be hopeless as it does not help him when he is lynched, burned, jim-crowed, and segregated.” In 1919, the UNIA announced plans to develop a shipping company called the Black Star Line as part of a plan that pushed for blacks to reject the political system and to “return to Africa” instead. Most of the investments came in the form of shares purchased by UNIA members, many of whom heard Garvey give rousing speeches across the country about the importance of establishing commercial ventures between African Americans, Afro-Caribbeans, and Africans.

Garvey’s detractors disparaged these public displays and poorly managed business ventures, and they criticized Garvey for peddling empty gestures in place of measures that addressed the material concerns of African Americans. NAACP leaders depicted Garvey’s plan as one that simply said, “Give up! Surrender! The struggle is useless.” Enflamed by his aggressive attacks on other black activists and his radical ideas of racial independence, many African American and Afro-Caribbean leaders worked with government officials and launched the “Garvey Must Go” campaign, which culminated in his 1922 indictment and 1925 imprisonment and subsequent deportation for “using the mails for fraudulent purposes.” The UNIA never recovered its popularity or financial support, even after Garvey’s pardon in 1927, but his movement made a lasting impact on black consciousness in the United States and abroad. He inspired the likes of Malcolm X, whose parents were Garveyites, and Kwame Nkrumah, the first president of Ghana. Garvey’s message, perhaps best captured by his rallying cry, “Up, you mighty race,” resonated with African Americans who found in Garveyism a dignity not granted them in their everyday lives. In that sense, it was all too typical of the Harlem Renaissance.

 

VII. Culture War

For all of its cultural ferment, however, the 1920s were also a difficult time for radicals and immigrants and anything “modern.” Fear of foreign radicals led to the executions of Nicola Sacco and Bartolomeo Vanzetti, two Italian anarchists, in 1927. In May 1920, the two had been arrested for a robbery and murder connected with an incident at a Massachusetts factory. Their guilty verdicts were appealed for years, as the evidence behind their convictions was slim. For instance, one eyewitness claimed that Vanzetti drove the getaway car, but accounts of others described a different person altogether. Nevertheless, despite worldwide lobbying by radicals and a respectable movement among middle-class Italian organizations in the United States, the two men were executed on August 23, 1927. Vanzetti perhaps provided the most succinct explanation for his death, saying, “This is what I say . . . . I am suffering because I am a radical and indeed I am a radical; I have suffered because I was an Italian, and indeed I am an Italian.”

Many Americans expressed anxieties about the changes that had remade the United States and, seeking scapegoats, many middle-class white Americans pointed to Eastern European and Latin American immigrants (Asian immigration had already been almost completely prohibited) or to African Americans, who now pushed harder for civil rights after migrating out of the American South to northern cities as part of the Great Migration, the mass exodus that carried nearly half a million blacks out of the South between 1910 and 1920 alone. Protestants, meanwhile, continued to denounce the Roman Catholic Church and charged that American Catholics gave their allegiance to the Pope and not to their country.

In 1921, Congress passed the Emergency Immigration Act as a stopgap measure and then, three years later, permanently established country-of-origin quotas through the National Origins Act. The number of immigrants annually admitted to the United States from each nation was restricted to two percent of the number of people from that country who had been residing in the United States in 1890. (By reaching back three decades, past the recent waves of “new” immigrants from Southern and Eastern Europe, Latin America, and Asia, the law made it extremely difficult for immigrants from outside Northern Europe to legally enter the United States.) The act also explicitly excluded all Asians, although, to satisfy southern and western growers, it temporarily omitted restrictions on Mexican immigrants. The Sacco and Vanzetti trial and the sweeping immigration restrictions pointed to a rampant nativism. A great number of Americans worried about a burgeoning America that did not resemble the one of times past. Many wrote of an America riven by a culture war.

 

VIII. Fundamentalist Christianity

In addition to alarms over immigration and the growing presence of Catholicism and Judaism, a new core of Christian fundamentalists was deeply concerned about relaxed sexual mores and increased social freedoms, especially as found in city centers. Although never a centralized group, most fundamentalists lashed out against what they saw as a sagging public morality, a world in which Protestantism seemed challenged by Catholicism, women exercised ever greater sexual freedoms, public amusements encouraged selfish and empty pleasures, and drinkers mocked Prohibition through bootlegging and speakeasies.

Christian Fundamentalism arose most directly from a doctrinal dispute among Protestant leaders. Liberal theologians sought to intertwine religion with science and secular culture. These “Modernists,” influenced by the Biblical scholarship of nineteenth century German academics, argued that Christian doctrines about the miraculous might be best understood metaphorically. The church, they said, needed to adapt itself to the world. According to the Baptist pastor Harry Emerson Fosdick, the “coming of Christ” might occur “slowly…but surely, [as] His will and principles [are] worked out by God’s grace in human life and institutions.” The social gospel, which encouraged Christians to build the Kingdom of God on earth by working against social and economic inequality, was very much tied to liberal theology.

During the 1910s, funding from oil barons Lyman and Milton Stewart enabled the evangelist A. C. Dixon to commission some ninety essays to combat religious liberalism. The collection, known as The Fundamentals, became the foundational documents of Christian fundamentalism, from which the movement’s name is drawn. Contributors agreed that Christian faith rested upon literal truths, that Jesus, for instance, would physically return to earth at the end of time to redeem the righteous and damn the wicked. Some of the essays put forth that human endeavor would not build the Kingdom of God, while others covered such subjects as the virgin birth and biblical inerrancy. American Fundamentalists spanned Protestant denominations and borrowed from diverse philosophies and theologies, most notably the holiness movement, the larger revivalism of the nineteenth century and new dispensationalist theology (in which history proceeded, and would end, through “dispensations” by God). They did, however, all agree that modernism was the enemy and the Bible was the inerrant word of God. It was a fluid movement often without clear boundaries, but it featured many prominent clergymen, including the well-established and extremely vocal John Roach Straton (New York), J. Frank Norris (Texas), and William Bell Riley (Minnesota).

In July 1925, in a tiny courtroom in Dayton, Tennessee, Fundamentalists gathered to tackle the issues of creation and evolution. A young biology teacher, John T. Scopes, was being tried for teaching his students evolutionary theory in violation of the Butler Act, a state law preventing evolutionary theory, or any theory that denied “the Divine Creation of man as taught in the Bible,” from being taught in publicly funded Tennessee classrooms. Seeing the act as a threat to personal liberty, the American Civil Liberties Union (ACLU) immediately sought a volunteer for a “test” case, hoping that the conviction and subsequent appeals would lead to a day in the Supreme Court, testing the constitutionality of the law. It was then that Scopes, a part-time teacher and coach, stepped up and voluntarily admitted to teaching evolution (Scopes’ violation of the law was never in question). Thus the stage was set for the pivotal courtroom showdown—“the trial of the century”—between the champions and opponents of evolution that marked a key moment in an enduring American “culture war.”

The case became a public spectacle. Clarence Darrow, an agnostic attorney and a keen liberal mind from Chicago, volunteered to aid the defense and came up against William Jennings Bryan. Bryan, the “Great Commoner,” was the three-time presidential candidate who in his younger days had led the political crusade against corporate greed. He had done so then with a firm belief in the righteousness of his cause, and now he defended biblical literalism in similar terms. The theory of evolution, Bryan said, with its emphasis on the survival of the fittest, “would eliminate love and carry man back to a struggle of tooth and claw.”

The Scopes trial signified a pivotal moment when science and religion became diametrically opposed. The Scopes defense team, three of whom are seen in this photograph, argued against a literal interpretation of the Bible that is still popular in many fundamentalist Christian circles today. “Dudley Field Malone, Dr. John R. Neal, and Clarence Darrow in Chicago, Illinois.” The Clarence Darrow Digital Collection, http://darrow.law.umn.edu/photo.php?pid=874.

Newspapermen and spectators flooded the small town of Dayton. Across the nation, Americans tuned their radios to the national broadcasts of a trial that dealt with questions of religious liberty, academic freedom, parental rights, and the moral responsibility of education. For six days in July, the men and women of America were captivated as Bryan presented his argument on the morally corrupting influence of evolutionary theory (and pointed out that Darrow made a similar argument about the corruptive potential of education during his defense of the famed killers Nathan Leopold and Richard Loeb a year before). Darrow eloquently fought for academic freedom.

At the request of the defense, Bryan took the stand as an “expert witness” on the Bible. At his age, he was no match for Darrow’s famous skills as a trial lawyer, and his answers came across as blundering and incoherent, particularly as he was not in fact a literal believer in all of the Genesis account (believing—as many anti-evolutionists did—that the meaning of the word “day” in the book of Genesis could be taken as allegory) and only hesitantly admitted as much, not wishing to alienate his fundamentalist followers. Additionally, Darrow posed a series of unanswerable questions: Was the “great fish” that swallowed the prophet Jonah created for that specific purpose? What precisely happened astronomically when God made the sun stand still? Bryan, of course, could cite only his faith in miracles. Tied into logical contradictions, Bryan’s testimony was a public relations disaster, although his statements were expunged from the record the next day and no further expert witnesses were allowed—with Scopes’ guilt already established, the jury delivered a guilty verdict in minutes. The case was later thrown out on a technicality. But few cared about the verdict. Darrow had, in many ways, at least to his defenders, already won: the fundamentalists seemed to have taken a beating in the national limelight. Journalist and satirist H. L. Mencken characterized the “circus in Tennessee” as an embarrassment for fundamentalism, and modernists remembered the “Monkey Trial” as a smashing victory. If fundamentalists retreated from the public sphere, they did not disappear entirely. Instead, they went local, built a vibrant subculture, and emerged many decades later stronger than ever.

 

IX. Rebirth of the Ku Klux Klan (KKK)

Suspicions of immigrants, Catholics, and modernists contributed to the rise of a string of reactionary organizations. None so captured the imagination of the country as the reborn Ku Klux Klan (KKK), a white supremacist organization that expanded beyond its Reconstruction Era anti-black politics, now claiming to protect American values and the American way of life from blacks, feminists (and other radicals), immigrants, Catholics, Jews, atheists, bootleggers, and a host of other imagined moral enemies.

Two events in 1915 are widely credited with inspiring the rebirth of the Klan: the lynching of Leo Frank and the release of The Birth of a Nation, a popular and groundbreaking film that valorized the Reconstruction Era Klan as a protector of feminine virtue and white racial purity. Taking advantage of this sudden surge of popularity, Colonel William Joseph Simmons organized what is often called the “second” Ku Klux Klan in Georgia in late 1915. This new Klan, modeled after other fraternal organizations with elaborate rituals and a hierarchy, remained largely confined to Georgia and Alabama until 1920, when Simmons began a professional recruiting effort that resulted in individual chapters being formed across the country and membership rising to an estimated five million.

Partly in response to the migration of Southern blacks to Northern cities during World War I, the KKK expanded above the Mason-Dixon line. Membership soared in Philadelphia, Detroit, Chicago, and Portland, while Klan-endorsed mayoral candidates won in Indianapolis, Denver, and Atlanta. The Klan often recruited through fraternal organizations such as the Freemasons and through various Protestant churches. In many areas, local Klansmen would visit churches of which they approved and bestow a gift of money upon the presiding minister, often during services. The Klan also enticed people to join through large picnics, parades, rallies, and ceremonies. The Klan established a women’s auxiliary in 1923, headquartered in Little Rock, Arkansas. The Women of the Ku Klux Klan mirrored the KKK in practice and ideology and soon had chapters in all forty-eight states, often attracting women who were already part of the prohibition movement, the defense of which was a centerpiece of Klan activism.

Contrary to its perception as a primarily Southern and lower-class phenomenon, the second Klan had a national reach and was composed largely of middle-class people. Sociologist Rory McVeigh surveyed the KKK newspaper Imperial Night-Hawk for the years 1923 and 1924, at the organization’s peak, and found the largest number of Klan-related activities to have occurred in Texas, Pennsylvania, Indiana, Illinois, and Georgia. The Klan was even present in Canada, where it was a powerful force within Saskatchewan’s Conservative Party. In many states and localities, the Klan dominated politics to such a level that one could not be elected without the support of the KKK. For example, in 1924, the Klan supported William Lee Cazort for governor of Arkansas, leading his opponent in the Democratic Party primary, Thomas Terral, to seek honorary membership through a Louisiana klavern so as not to be tagged as the anti-Klan candidate. In 1922, Texans elected Earle B. Mayfield, an avowed Klansman who ran openly as that year’s “klandidate,” to the United States Senate. At its peak the Klan claimed between four and five million members.

Despite the breadth of its political activism, the Klan is today remembered largely as a violent vigilante group—and not without reason. Members of the Klan and affiliated organizations often carried out acts of lynching and “nightriding”—the physical harassment of bootleggers, union activists, civil rights workers, or any others deemed “immoral” (such as suspected adulterers) under the cover of darkness or while wearing their hoods and robes. In fact, Klan violence was extensive enough in Oklahoma that Governor John C. Walton placed the entire state under martial law in 1923. Witnesses testifying before the military court disclosed accounts of Klan violence ranging from the flogging of clandestine brewers to the disfiguring of a prominent black Tulsan for registering African Americans to vote. In Houston, Texas, the Klan maintained an extensive system of surveillance that included tapping telephone lines and putting spies into the local post office in order to root out “undesirables.” A mob organized and led by Klan members in Aiken, South Carolina, lynched Bertha Lowman and her two brothers in 1926, but no one was ever prosecuted: the sheriff, deputies, city attorney, and state representative all belonged to the Klan.

The Klan dwindled in the face of scandal and diminished energy over the last years of the 1920s. By 1930, the Klan had only about 30,000 members and was largely spent as a national force, though it would reappear, much diminished, during the civil rights movement of the 1950s and 1960s.

 

X. Conclusion

In his inaugural address in 1929, Herbert Hoover told Americans that the Republican Party had brought prosperity. Even setting aside stubbornly high rates of poverty and unparalleled levels of inequality, he could not see the weaknesses lurking behind the decade’s economy. Even as the new culture of consumption promoted new freedoms, it also promoted new insecurities. An economy built on credit exposed the nation to tremendous risk. Flailing European economies, high tariffs, wealth inequality, a construction bubble, and an ever-more saturated consumer market loomed dangerously until the Roaring Twenties ground to a halt. In a moment, the nation’s glitz and glamour seemed to give way to decay and despair. For farmers, racial minorities, unionized workers, and other populations that did not share in 1920s prosperity, the veneer of a Jazz Age and a booming economy had always been a fiction. But for them, as for millions of Americans, the end of an era was close. The Great Depression loomed.

 

This chapter was edited by Brandy Thomas Wells, with content contributions by Micah Childress, Mari Crabtree, Maggie Flamingo, Guy Lancaster, Emily Remus, Colin Reynolds, Kristopher Shields, and Brandy Thomas Wells.


21. World War I & Its Aftermath

Striking steel mill workers holding bulletins, Chicago, Illinois, September 22, 1919. ExplorePAhistory.com

*The American Yawp is currently in beta draft. Please click here to help improve this chapter*

I. Introduction

World War I (“The Great War”) toppled empires, created new nations, and sparked tensions that would explode across future years. On the battlefield, its gruesome modern weaponry wrecked an entire generation of young men. The United States entered the conflict in 1917 and was never the same. The war heralded to the world the United States’ potential as a global military power, and domestically it advanced but then beat back American progressivism before unleashing vicious waves of repression. The war simultaneously stoked national pride and fueled disenchantments that burst Progressive Era hopes for the modern world. And it laid the groundwork for a global depression, a second world war, and an entire history of national, religious, and cultural conflict around the globe.

 

II: Prelude to War

As the German empire rose in power and influence at the end of the nineteenth century, skilled diplomats maneuvered this disruption of the traditional balance of power into several decades of European peace. In Germany, however, a new, ambitious monarch would overshadow years of tactful diplomacy. Wilhelm II rose to the German throne in 1888. He admired the British Empire of his grandmother, Queen Victoria, and envied the Royal Navy of Great Britain so much that he attempted to build a rival German navy and plant colonies around the globe. The British viewed the prospect of a German navy as a strategic threat, but, jealous of what he perceived as a lack of prestige in the world, Wilhelm II pressed Germany’s case for access to colonies and symbols of status suitable for a world power. Wilhelm’s maneuvers and Germany’s rise spawned a new system of alliances as rival nations warily watched Germany’s expansion.

In 1892, German posturing worried the leaders of Russia and France and prompted a defensive alliance to counter the existing Triple Alliance of Germany, Austria-Hungary, and Italy. Great Britain remained unassociated with the alliances until a series of diplomatic crises and an emerging German naval threat led to British agreements with Czar Nicholas II and French President Emile Loubet in the early twentieth century. (The alliance between Great Britain, France, and Russia became known as the Triple Entente.)

The other great threat to European peace was the Ottoman Empire, in Turkey. While the leaders of the Austrian-Hungarian Empire showed little interest in colonies elsewhere, Turkish lands on its southern border appealed to their strategic goals. However, Austrian-Hungarian expansion in Europe worried Czar Nicholas II who saw Russia as both the historic guarantor of the Slavic nations in the Balkans and as the competitor for territories governed by the Ottoman Empire.

By 1914, the Austrian-Hungarian Empire had control of Bosnia and Herzegovina and viewed Slavic Serbia, a nation protected by Russia, as its next challenge. On June 28, 1914, after the Bosnian Serb nationalist Gavrilo Princip assassinated the heir to the Austrian-Hungarian throne, Archduke Franz Ferdinand, and his wife, Sophie, vengeful nationalist leaders believed the time had arrived to eliminate the rebellious ethnic Serbian threat.

On the other side of the Atlantic, the United States played an insignificant role in global diplomacy—it rarely forayed into internal European politics. The federal government did not participate in international diplomatic alliances but nevertheless championed and assisted with the expansion of the transatlantic economy. American businesses and consumers benefited from the trade generated as the result of the extended period of European peace.

Stated American attitudes toward international affairs followed the advice given by President George Washington in his 1796 Farewell Address, one-hundred and twenty years before America’s entry in World War I. He had recommended that his fellow countrymen avoid “foreign alliances, attachments, and intrigues” and “those overgrown military establishments which, under any form of government, are inauspicious to liberty, and which are to be regarded as particularly hostile to republican liberty.”

A national foreign policy of neutrality reflected America’s inward-looking focus on the construction and management of its powerful new industrial economy (built in large part with foreign capital). The federal government possessed limited diplomatic tools with which to engage in international struggles for world power. America’s small and increasingly antiquated military precluded forceful coercion and left American diplomats to persuade by reason, appeals to justice, or economic pressure. But in the 1880s, as Americans embarked upon empire, Congress authorized the construction of a modern Navy. The Army nevertheless remained small and underfunded compared to the armies of many industrializing nations.

After the turn of the century, the Army and Navy faced a great deal of organizational uncertainty. New technologies—airplanes, motor vehicles, submarines, modern artillery—stressed the capability of Army and Navy personnel to effectively procure and use them. The nation’s Army could police Native Americans in the West and garrison recent overseas acquisitions, but it could not sustain a full-blown conflict of any size. The Davis Act of 1908 and the National Defense Act of 1916 created the modern versions of the National Guard and military reserves: a system of state-administered units, available for local emergencies and supported by conditional federal funding for training, that could be activated for use in international wars. The National Guard program encompassed individual units separated by state borders, and it supplied summer training for college students as a reserve officer corps. This arrangement largely resolved the competing pressures of short-term state problems such as natural disasters, the federal government’s fear of too few or substandard soldiers, and state leaders’ expectation that their men would fill gaps in the national armed forces during international wars. Military leaders resisted similar efforts from allied nations to use American forces as fillers for depleted armies; the federal and state governments needed a long-term strategic reserve of trained soldiers and sailors. Meanwhile, for weapons and logistics, safe and reliable prototypes of new technologies capable of rapid deployment often ran into developmental and production delays.

Border troubles in Mexico served as an important field test for modern American military forces. Revolution and chaos threatened American business interests in Mexico. Mexican reformer Francisco Madero challenged Porfirio Diaz’s corrupt and unpopular conservative regime, was jailed, and fled to San Antonio, where he penned the Plan of San Luis Potosí, paving the way for the Mexican Revolution and the rise of armed revolutionaries across the country.

In April 1914, President Woodrow Wilson ordered Marines to accompany a naval escort to Veracruz on the lower eastern coast of Mexico. After a brief battle, the Marines supervised the city government and prevented shipments of German arms to Mexican leader Victoriano Huerta until they departed in November 1914. The raid emphasized the continued reliance on naval forces and the difficulty in modernizing the military during a period of European imperial influence in the Caribbean and elsewhere. The threat of war in Europe enabled passage of the Naval Act of 1916. President Wilson declared that the national goal was to build the Navy as “incomparably, the greatest…in the world.” And yet Mexico still beckoned. The Wilson administration had withdrawn its support of Diaz, but watched warily as the Revolution devolved into assassinations and deceit. In 1916, Pancho Villa, a popular revolutionary in Northern Mexico, spurned by American support for rival contenders, raided Columbus, New Mexico, killing seventeen Americans and burning down the town center before sustaining severe casualties from American soldiers and retreating. In response, President Wilson commissioned Army General John “Black Jack” Pershing to capture Villa and disperse his rebels. Motorized vehicles, reconnaissance aircraft, and the wireless telegraph aided in the pursuit of Villa. Motorized vehicles in particular allowed General Pershing to move supplies without relying on railroads controlled by the Mexican government. The aircraft assigned to the campaign crashed or were grounded due to mechanical malfunctions, but they provided invaluable lessons in their worth and use in war. Wilson used the powers of the new National Defense Act to mobilize over 100,000 National Guard troops across the country as a show of force in northern Mexico.

The conflict between the United States and Mexico might have escalated into full-scale war if the international crisis in Europe had not overwhelmed the public’s attention. After the outbreak of war in Europe in 1914, President Wilson declared American neutrality. He insisted from the start that the United States be neutral “in fact as well as in name,” a policy the majority of the American people enthusiastically endorsed. What exactly “neutrality” meant in a world of close economic connections, however, prompted immediate questions the United States was not yet adequately prepared to answer. Ties to the British and French proved strong, and those nations obtained far more loans and supplies than the Germans. In October 1914, President Wilson approved commercial credit loans to the combatants, which made it increasingly difficult for the nation to claim impartiality as war spread through Europe. Trade and trade-related financial relations with the combatant nations, together with previous agreements with the Allies, ultimately drew the nation further into the conflict. In spite of mutually declared blockades between Germany, Great Britain, and France, munitions makers and other war suppliers in the United States witnessed a brisk and booming increase in business. The British naval blockades that often stopped or seized ships proved annoying and costly, but the unrestricted and surprise torpedo attacks from German submarines were far more deadly. In May 1915, the sinking of the RMS Lusitania at the cost of over a hundred American lives, along with other German attacks on American and British shipping, raised the ire of the public and stoked the desire for war.

If American diplomatic tradition avoided alliances and the Army seemed inadequate for sustained overseas fighting, the United States outdistanced the nations of Europe in one important measure of world power: by 1914, the nation held the top position in the global industrial economy. The United States produced slightly more than one-third of the world’s manufactured goods, roughly equal to the combined output of France, Great Britain, and Germany.

 

III. War Spreads through Europe

After the assassination of Archduke Franz Ferdinand and his wife, Sophie, Austria secured the promise of aid from its German ally and issued an ultimatum containing a list of ten demands to Serbia. On July 28, 1914, Austria declared war on Serbia for failure to meet all of the demands. Russia, determined to protect Serbia, began to mobilize its armed forces. On August 1, 1914, Germany declared war on Russia to protect Austria after warnings directed at Czar Nicholas II failed to stop Russian preparations for war.

In spite of the central European focus of the initial crises, the first blow was struck against neutral Belgium in northwestern Europe. Germany planned to deal with the French and Russian threats by taking advantage of sluggish Russian mobilization and concentrating the German army first against France. Similar to the military operations of 1871, German military leaders activated the Schlieffen Plan, which directed the rapid shift of German armies by rail so that they could enter France quickly. The strategy deterred and confused Russian forces and ultimately carried German armies over Belgium and into France.

Belgium fell victim early to the invading German forces. Germany wanted to avoid the obvious avenue of advance across the French-German border, where it would encounter the French army units stationed there. German army commanders instead ordered a wide sweep around the French border forces, one that led straight through Belgium. This violation of Belgian neutrality, however, ensured that Great Britain entered the war against Germany. On August 4, 1914, Great Britain declared war on Germany for its failure to respect Belgium as a neutral nation.


By 1915, the European war had developed into a series of bloody trench stalemates that continued through the following year. Offensives, largely carried out by British and French armies, achieved nothing but huge numbers of casualties. Peripheral campaigns against the Ottoman Empire at Gallipoli, throughout the Middle East, and in various parts of Africa were either unsuccessful or had no real bearing on the European contest for victory. The third year of the war promised great German successes in eastern Europe after the regime of Czar Nicholas II collapsed in Russia in March 1917. At about the same time, the German general staff demanded the reimposition of unrestricted submarine warfare to deprive the Allies of replenishment supplies from the United States.

The Germans realized that submarine warfare would likely bring intervention by the United States. However, the Germans also believed the European war would be over before American soldiers could arrive in sufficient numbers to alter the balance of power. A German diplomat, Arthur Zimmermann, planned to complicate any potential American intervention by offering German support to the Mexican government in a bid to regain Texas, New Mexico, and Arizona. Mexican national leaders declined the offer, but the revelation of the Zimmermann Telegram helped to usher the United States into the war.

 

IV. America Enters the War

By the fall of 1916 and spring of 1917, President Wilson believed an imminent German victory would drastically and dangerously alter the balance of power in Europe. With a good deal of public support, inflamed by submarine warfare and revelations like the Zimmermann telegram (which exposed a German menace in Mexico), Congress declared war on Germany on April 6, 1917. Despite the National Defense Act of 1916 and the Naval Act of 1916, America faced a war three thousand miles away with a small and unprepared military. The United States was unprepared in nearly every respect for modern war. Considerable time elapsed before an effective Army and Navy could be assembled, trained, equipped, and deployed to the Western Front in Europe. The process of building the Army and Navy for the war proved to be different from previous American conflicts and counter to the European military experience. Unlike the largest European military powers—Germany, France, and Austria-Hungary—the United States had no tradition of maintaining large standing armed forces or trained military reserves during peacetime. Moreover, there was no American counterpart to the European practice of rapidly equipping, training, and mobilizing reservists and conscripts.

America had traditionally relied on volunteerism to fill the ranks of the armed forces. Notions of patriotic duty and adventure appealed to many young men who not only volunteered for wartime service but also sought and paid for their own training at Army camps before the war. American labor organizations favored voluntary service over conscription. Labor leader Samuel Gompers argued for volunteerism in letters to the Congressional committees considering the question. “The organized labor movement,” he wrote, “has always been fundamentally opposed to compulsion.” Referring to American values as a role model for others, he continued, “It is the hope of organized labor to demonstrate that under voluntary conditions and institutions the Republic of the United States can mobilize its greatest strength, resources and efficiency.”

Moments like this – with the Boy Scouts of America charging up Fifth Avenue in New York City with flags in their hands – signaled to Americans that it was time to wake up to the reality of war and support the effort in any way possible. “Wake Up, America” parades like this one were held throughout the country in support of recruitment. Nearly 60,000 people attended this single parade in New York City. Photograph from National Geographic Magazine, 1917. Wikimedia, http://commons.wikimedia.org/wiki/File:Boy_Scouts_NGM-v31-p359.jpg.

Though some observers believed that opposition to conscription might lead to civil disturbances, Congress quickly instituted a reasonably equitable and locally administered system to draft men for the military. On May 18, 1917, Congress approved the Selective Service Act, and President Wilson signed it into law a week later. The new legislation avoided the unpopular system of bonuses and substitutes used during the Civil War and was generally received without serious objection by the American people.

The conscription act initially required men from ages 21 to 30 to register for compulsory military service. The basic requirement for the military was to demonstrate a competitive level of physical fitness. These examinations offered the emerging fields of social science a range of data collection tools and new screening methods. The Army Medical Department examined the general condition of young American men selected for service from the population. The Surgeon General compiled his findings from draft records in the 1919 report, “Defects Found in Drafted Men,” a snapshot of the 2.5 million men examined for military service. Among that group, examiners recorded 1,533,937 physical defects (often more than one per individual). More than thirty-four percent of those examined were rejected for service or later discharged for neurological, psychiatric, or mental deficiencies.

To provide a basis for the neurological, psychiatric, or mental evaluations, the Army assessed eligibility for service and aptitude for advanced training through the use of cognitive skills tests to determine intelligence. About 1.9 million men were tested on intelligence. Soldiers who were literate took the Army Alpha test. Illiterates and non-English speaking immigrants took the non-verbal equivalent, the Army Beta test, which relied on visual testing procedures. Robert M. Yerkes, president of the American Psychological Association and chairman of the Committee on the Psychological Examination of Recruits, developed and analyzed the tests. His data suggested that the mental age of recruits, in particular immigrant recruits from southern and eastern Europe, averaged about thirteen years. As a eugenicist, he interpreted the results as roughly equivalent to a mild level of retardation and as an indication of racial deterioration. Many years later, experts agreed the results misrepresented the levels of education for the recruits and revealed defects in the design of the tests.

The experience of service in the Army expanded many individual social horizons as natives and immigrants joined the ranks. Immigrants had been welcomed into Union ranks during the Civil War, when large numbers of Irish and Germans joined and fought alongside native-born men; some Germans in the Civil War fought in units where German was the main language. Between 1917 and 1918, the Army accepted immigrants with some hesitancy because of the widespread public agitation against “hyphenated Americans,” which demanded they conform without delay or reservation. However, if the Army appeared concerned about the level of assimilation and loyalty of recent immigrants, some social mixtures simply could not be tolerated within the ranks.

Propagandistic images increased patriotism in a public relatively detached from events taking place overseas. This photograph, showing two United States soldiers sprinting past the bodies of two German soldiers toward a bunker, showed Americans the heroism evinced by their men in uniform. Likely a staged image taken after fighting had ended, it nonetheless played on the public’s patriotism, telling them to step up and support the troops. “At close grips with the Hun, we bomb the corkshaffer's, etc.,” c. 1922?. Library of Congress, http://www.loc.gov/pictures/item/91783839/.

Prevailing racial attitudes mandated the assignment of white and black soldiers to different units. Despite racial discrimination and Jim Crow, many black American leaders, such as W. E. B. DuBois, supported the war effort and sought a place at the front for black soldiers. Black leaders viewed military service as an opportunity to demonstrate to white society the willingness and ability of black men to assume all the duties and responsibilities of citizens, including wartime sacrifice. If black soldiers were drafted and fought and died on equal footing with white soldiers, they reasoned, white Americans would see that they deserved full citizenship. The War Department, however, barred black troops from combat specifically to avoid racial tensions. The military relegated black soldiers to segregated service units where they worked in logistics and supply and as general laborers.

In France, the experiences of black soldiers during training and periods of leave broadened their understanding of the Allies and life in Europe. The Army often restricted the privileges of black soldiers to ensure the conditions they encountered in Europe did not lead them to question their place in American society. However, black soldiers were not the only ones thought to be at risk from the temptations of European vice. To ensure that American “doughboys” did not compromise their special identity as men of the new world who arrived to save the old, several religious and progressive organizations created an extensive program designed to keep the men pure of heart, mind, and body. With assistance from the Young Men’s Christian Association (YMCA) and other temperance organizations, the War Department put together a program of schools, sightseeing tours, and recreational facilities to provide wholesome and educational outlets. The soldiers welcomed most of the activities from these groups, but many still managed to find and enjoy the traditional recreational venues of soldiers at war.

While the War and Navy Departments initiated recruitment and mobilization plans for millions of men, women reacted to the war preparations by joining several military and civilian organizations. Their enrollment and actions in these organizations proved to be a pioneering effort for American women in war. Military leaders authorized the permanent gender transition of several occupations, giving women opportunities to don uniforms where none had existed before. Civilian wartime organizations, although chaired by male members of the business elite, boasted all-female volunteer workforces. Women performed the bulk of volunteer charitable work during the war.

The military faced great upheaval with the admission of women into its ranks. The War and Navy Departments authorized the enlistment of women to fill positions in several established administrative occupations. The gendered transition of these jobs freed more men to join combat units. Army women served as telephone operators (“Hello Girls”) for the Signal Corps, Navy women enlisted as Yeomen (clerical workers), and the first groups of women joined the Marine Corps in July 1918. In the military medical professions, approximately 25,000 nurses served in the Army and Navy Nurse Corps for duty stateside and overseas, and about a hundred female physicians were contracted by the Army. Neither the female nurses nor the doctors served as commissioned officers in the military. The Army and Navy chose to appoint them instead, which left the status of professional medical women hovering somewhere between the enlisted and officer ranks. As a result, many female nurses and doctors suffered various physical and mental abuses at the hands of their male coworkers with no system of redress in place.

The experiences of women in civilian organizations proved to be less stressful than in the military. Millions of women volunteered with the American Red Cross, the Young Men’s and Women’s Christian Associations (YMCA/YWCA), and the Salvation Army. Most women performed their volunteer duties in communal spaces owned by the leaders of the municipal chapters of these organizations. Women met at designated times to roll bandages, prepare and serve meals and snacks, package and ship supplies, and organize community fundraisers. The variety of volunteer opportunities that existed gave women the ability to appear in public spaces and promote charitable activities for the war effort. Women volunteers encouraged entire communities, including children, to get involved in war work. While most of these efforts focused on support for the home front, a small percentage of women volunteers served with the American Expeditionary Force in France.

Jim Crow segregation in both the military and the civilian sector stood as a barrier for black women who wanted to give their time to the war effort. The military prohibited black women from serving as enlisted or appointed medical personnel. The only avenue for black women to wear a military uniform existed with the armies of the Allied nations. A few black female doctors and nurses joined the French Foreign Legion to escape the racism in the American Army. Black women volunteers faced the same discrimination in civilian wartime organizations. White leaders of the American Red Cross, YMCA/YWCA, and Salvation Army municipal chapters refused to admit them as equal participants. Black women were forced to charter auxiliary units as subsidiary divisions of the chapters and were given little guidance on how to organize fellow volunteers. They turned instead to the community for support and recruited millions of women for auxiliaries that supported the nearly 200,000 black soldiers and sailors serving in the military. While the majority of women volunteers labored to care for black families on the homefront, three YMCA secretaries received the opportunity of a lifetime to work with the black troops in France.

 

V. On the Homefront

In the early years of the war, Americans were generally detached from the events in Europe. They paired their horror at accounts of the war with gratitude for the economic opportunities the war provided and pride in a national tradition of non-involvement with the kind of entangling alliances that had caused the current conflict. Progressive Era reform politics dominated the political landscape, and Americans remained most concerned with domestic issues and the shifting role of government at home. However, the facts of the war could not be ignored by the public. The destruction taking place on European battlefields and the ensuing casualty rates indicated the unprecedented brutality of modern warfare. Increasingly, Americans sensed that the fate of the Western world lay in the victory or defeat of the Allies.

President Wilson, a committed progressive, had articulated a global vision of democracy even as he embraced neutrality. And as war continued to engulf Europe, it seemed apparent that the United States’ economic power would shape the outcome of the conflict regardless of any American military intervention. By 1916, American trade with the Allies had tripled, while trade with the Central Powers shrank to less than one percent of previous levels.

The large numbers of German immigrants living throughout the United States created suspicion within the federal government. The American Protective League, a group of private citizens, worked directly with the U.S. government during WWI to identify suspected German sympathizers. Additionally, they sought to eradicate all radical, anarchist, left-wing, and anti-war activities through surveillance and raids. Even J. Edgar Hoover, later the infamous head of the FBI, used the APL to gather intelligence. A membership card in the American Protective League, issued 28 May 1918. Wikimedia, http://commons.wikimedia.org/wiki/File:APL-Membership-Card.png.


The progression of the war in Europe generated fierce national debates about military preparedness. The Allies and the Central Powers had taken little time to raise and mobilize vast armies and navies. By comparison, the United States still fielded a minuscule army and had limited federal power to summon an adequate defense force before the enactment of conscription. When America entered the war, mobilization of military resources and the cultivation of popular support for the war consumed the country. Because the federal government had lacked the coercive force to mobilize before the war, the American war effort was marked by enormous publicity and propaganda campaigns. President Wilson went to extreme measures to push public opinion toward the war. Most notably, he created the Committee on Public Information, known as the “Creel Committee,” headed by Progressive George Creel, to inflame the patriotic mood of the country and generate support for military adventures abroad. Creel enlisted the help of Hollywood studios and other budding media outlets to cultivate a view of the war that pitted democracy against imperialism, that framed America as a crusading nation endeavoring to rescue Western civilization from medievalism and militarism. As war passions flared, challenges to the onrushing patriotic sentiment that America was making the world “safe for democracy” were labeled disloyal. Wilson signed the Espionage Act in 1917 and the Sedition Act in 1918, stripping dissenters and protestors of their rights to publicly resist the war. Critics and protestors were imprisoned. Immigrants, labor unions, and political radicals became targets of government investigations and an ever more hostile public culture. Meanwhile, the government insisted that individual financial contributions made a discernible difference for the men on the Western Front. Americans lent their financial support to the war effort by purchasing war bonds or supporting Liberty Loan drives. Many Americans, however, sacrificed much more than money.

 

VI. Before the Armistice

The brutality of war persevered as European powers struggled to adapt to modern warfare. Until the spring of 1917, the Allies possessed few effective defensive measures against submarine attacks. German submarines had sunk more than a thousand ships by the time America entered the war. The rapid addition of American naval escorts to the British surface fleet and the establishment of a convoy system countered much of the effect of German submarines. Shipping and military losses declined rapidly, just as the American Army arrived in Europe in large numbers. Although much of its equipment still needed to make the transatlantic passage, the physical presence of the Army proved to be a fatal blow to German war plans.

In July 1917, after one last disastrous offensive against the Germans, the Russian army disintegrated. The tsarist regime collapsed and in November 1917 Vladimir Lenin’s Bolshevik party came to power. Russia soon surrendered to German demands and exited the war, freeing Germany to finally fight the one-front war it had desired since 1914. The German general staff quickly shifted hundreds of thousands of soldiers from the eastern theater in preparation for a new series of offensives planned for the following year in France.

In March 1918, Germany launched the Kaiserschlacht (Spring Offensive), a series of five major attacks. By the middle of July 1918, each had failed to break through on the Western Front. A string of Allied offensives commenced on the Western Front on August 8, 1918. The two million men of the American Expeditionary Force joined British and French armies in a series of successful counteroffensives that pushed the disintegrating German front lines back across France. German General Erich Ludendorff referred to the launch of the counteroffensive as the “black day of the German army.” The German offensive gamble had exhausted Germany’s faltering military effort. Defeat was inevitable. Kaiser Wilhelm II abdicated at the request of the German general staff, and the new German democratic government agreed to an armistice (cease-fire) on November 11, 1918. German military forces withdrew from France and Belgium and returned to a Germany teetering on the brink of chaos.

By the end of the war, more than 4.7 million American men had served in all branches of the military: four million in the Army, six hundred thousand in the Navy, and about eighty thousand in the Marine Corps. The United States lost over 100,000 men (fifty-three thousand died in battle, and even more from disease). Their terrible sacrifice, however, paled before the Europeans’. After four years of brutal stalemate, France had suffered almost a million and a half military dead and Germany even more. Each nation lost about 4% of its population to the war. And death was not done.

 

VII. The War and the Influenza Pandemic

As the war still raged on the Western Front in the spring of 1918, a new threat appeared, one as deadly as the war itself. An influenza virus originated in the farm country of Haskell County, Kansas, and soon reached Camp Funston, one of the largest Army training camps in the nation. Labeled H1N1 by medical researchers working for the United States Public Health Service, the virus spread like wildfire as disparate populations were brought together and then returned home, from the heartland to the coasts and then in consecutive waves around the world. The second wave was a mutated strain of the virus even deadlier than the first. The new virus struck down those in the prime of their lives: a disproportionate number of influenza victims were between eighteen and thirty-five years old.

Between March and May 1918, fourteen of the largest American military training camps reported outbreaks of influenza. Some of the infected soldiers carried the virus on troop transports to France. By September 1918 influenza had spread to all training camps in the United States before mutating into its deadlier version. In Europe, influenza attacked soldiers on both sides of the Western Front. The “Spanish Influenza,” or the “Spanish Lady,” was misnamed because accounts of the disease first appeared in the newspapers of neutral and uncensored Spain; it resulted in the untimely deaths of an estimated fifty million people worldwide. Public health reports from the Surgeon General of the Army revealed that while 227,000 soldiers were hospitalized for wounds received in battle, almost half a million suffered from deadly influenza. The worst of the epidemic struck during the height of the Meuse-Argonne Offensive in the fall of 1918 and compromised the combat capabilities of the American and German armies. During the war more soldiers died from influenza than from combat. The pandemic continued to spread after the Armistice before finally fading in the early 1920s. To date, no cure exists for the H1N1 influenza virus.

 

VIII. The Fourteen Points and the League of Nations

As the flu virus wracked the world, Europe and America rejoiced at the end of hostilities. On December 4, 1918, President Wilson became the first American president to leave the country during his term. He intended to shape the peace. The war brought an abrupt end to four great European imperial powers. The German, Russian, Austro-Hungarian, and Ottoman empires evaporated, and the map of Europe was redrawn to accommodate new independent nations. As part of the terms of the Armistice, Allied forces followed the retreating Germans and occupied territories in the Rhineland to prevent Germany from reigniting war. As Germany disarmed, Wilson and the other Allied leaders gathered in France at Versailles for the Paris Peace Conference to dictate the terms of a settlement to the war.

Earlier that year, on January 8, 1918, before a joint session of Congress, President Wilson had offered an enlightened statement of war aims and peace terms known as the Fourteen Points. The plan not only dealt with territorial issues but offered principles upon which a long-term peace could be built, including the establishment of a League of Nations to guard against future wars. But in January 1918 Germany still anticipated a favorable verdict on the battlefield and did not seriously consider accepting the terms of the Fourteen Points. The Allies were scarcely more receptive: French Prime Minister Georges Clemenceau remarked, “The good Lord only had ten (points).”

President Wilson toiled for his vision of the post-war world. The United States had entered the fray, Wilson proclaimed, “to make the world safe for democracy.” At the center of the plan was a novel international organization–the League of Nations–charged with keeping a worldwide peace by preventing the kind of destruction that tore across Europe and “affording mutual guarantees of political independence and territorial integrity to great and small states alike.” This promise of collective security, that an attack on one sovereign member would be viewed as an attack on all, was a key component of the Fourteen Points.

But the fight for peace was daunting. While President Wilson was celebrated in Europe and welcomed as the “God of Peace,” his fellow statesmen were less enthusiastic about his plans for post-war Europe. America’s closest allies had little interest in the League of Nations. Allied leaders sought to guarantee the future safety of their own nations. Unlike the United States, the Allies had endured the horrors of the war firsthand. They refused to sacrifice further. The negotiations made clear that British Prime Minister David Lloyd George was more interested in preserving Britain’s imperial domain, while French Prime Minister Clemenceau sought a peace that recognized the Allies’ victory and the Central Powers’ culpability: he wanted reparations—severe financial penalties—and limits on Germany’s future ability to wage war. The fight for the League of Nations thus fell largely on the shoulders of President Wilson. By June 1919, the final version of the treaty was signed and President Wilson was able to return home. The treaty was a compromise that included demands for German reparations, provisions for the League of Nations, and the promise of collective security. For President Wilson, it was an imperfect peace, but better than no peace at all.

The real fight for the League of Nations was on the American homefront. Republican Senator Henry Cabot Lodge of Massachusetts stood as the most prominent opponent of the League of Nations. As chair of the Senate Foreign Relations Committee and an influential Republican Party leader, he could block ratification of the treaty. Lodge attacked the treaty for potentially robbing the United States of its sovereignty. Never an isolationist, Lodge demanded instead that the country deal with its own problems in its own way, free from the collective security—and oversight—offered by the League of Nations. Unable to match Lodge’s influence in the Senate, President Wilson took his case to the American people in the hopes that ordinary voters might be convinced that the only guarantee of future world peace was the League of Nations. During his grueling cross-country trip, however, President Wilson suffered an incapacitating stroke. His opponents had the upper hand.

President Wilson’s dream for the League of Nations died on the floor of the Senate. Lodge and his allies successfully blocked America’s entry into the League of Nations, an organization conceived and championed by the American president. The League of Nations operated with fifty-eight sovereign members, but the United States refused to join, refused to lend it American power, and refused to provide it with the strength needed to fulfill its purpose.

 

IX. Aftermath of World War I

The war transformed the world. It drastically changed the face of the Middle East, for instance. For centuries the Ottoman Empire had shaped life in the region. Before the war, the Middle East had three main centers of power: Egypt, the Ottoman Empire, and Iran. President Wilson’s call for self-determination appealed to many under the Ottoman Empire’s rule. In the aftermath of the war, Wilson sent a commission to investigate the region to determine the conditions and aspirations of the populace. The King-Crane Commission found that most of the inhabitants favored an independent state free of European control. However, these wishes were largely ignored, and the lands of the former Ottoman Empire were divided into mandates through the Treaty of Sèvres at the San Remo Conference in 1920. The Ottoman Empire disintegrated into several nations, many created by European powers with little regard for ethnic realities. These Arab provinces were ruled by Britain and France, and the new nation of Turkey emerged from the former heartland of Anatolia. According to the League of Nations, mandates “were inhabited by peoples not yet able to stand by themselves under the strenuous conditions of the modern world.” Though allegedly for the benefit of the people of the Middle East, the mandate system was essentially a reimagined form of nineteenth-century imperialism. France received Syria; Britain took control of Iraq, Palestine, and Transjordan (Jordan). The United States was asked to become a mandate power, but declined. The geographical realignment of the Middle East also included the formation of two new nations: the Kingdom of Hejaz and Yemen. (The Kingdom of Hejaz was ruled by Sharif Hussein and only lasted until the 1920s, when it became part of Saudi Arabia.)

The fates of Nicola Sacco and Bartolomeo Vanzetti, two Italian-born anarchists who were convicted of robbery and murder in 1920, reflected the Red Scare that swept American society after the Russian Revolution of 1917. Their arrest, trial, and execution inspired many leftists and dissenting artists to express their sympathy with the accused, as in Maxwell Anderson’s Gods of the Lightning or Upton Sinclair’s Boston. The Sacco-Vanzetti case demonstrated a newly exacerbated American nervousness about immigrants and the potential spread of radical ideas, especially those related to international communism after the Russian Revolution.

When in March 1918 the Bolsheviks signed a separate peace treaty with Germany, the Allies planned to send troops to northern Russia and Siberia to prevent German influence and fight the Bolshevik revolution. Wilson agreed, and, in a little-known foreign intervention, American troops remained in Russia as late as 1920. Although the Bolshevik rhetoric of self-determination followed many of the ideals of Wilson’s Fourteen Points—Vladimir Lenin supported revolutions against imperial rule across the world—imperialism and anti-communism could not be so easily undone by vague ideas of self-rule.

While still fighting in WWI, President Wilson sent American troops to Siberia during the Russian Civil War for reasons both diplomatic and military. This photograph shows American soldiers in Vladivostok parading before the building occupied by the staff of the Czecho-Slovaks (those opposing the Bolsheviks). To the left, Japanese marines stand to attention as the American troops march. Photograph, August 1, 1918. Wikimedia, http://commons.wikimedia.org/wiki/File:American_troops_in_Vladivostok_1918_HD-SN-99-02013.JPEG.


At home, the United States grappled with harsh postwar realities. Racial tensions culminated in the Red Summer of 1919, when violence broke out in at least twenty-five cities, including Chicago and Washington, D.C. The riots originated from wartime racial tensions. Industrial war production and massive wartime service created vast labor shortages, and thousands of southern blacks travelled to the North and Midwest to escape the traps of southern poverty. But the so-called Great Migration sparked significant racial conflict as local whites and returning veterans fought to reclaim their jobs and their neighborhoods from new black migrants.

But many American blacks, who had fled the Jim Crow South and traveled halfway around the world to fight for the United States, would not so easily accede to postwar racism. The overseas experience of black Americans and their return triggered a dramatic change in black communities. W.E.B. DuBois wrote boldly of returning soldiers: “We return. We return from fighting. We return fighting. Make way for Democracy!” But white Americans desired a return to the status quo, a world that did not include social, political, or economic equality for black people.

In 1919 America suffered through the “Red Summer.” Riots erupted across the country from April until October. The massive bloodshed included thousands of injuries, hundreds of deaths, and vast destruction of private and public property across the nation. The Chicago Riot, from July 27 to August 3, 1919, considered the summer’s worst, sparked a week of mob violence, murder, and arson. Race riots had rocked the nation before, but the Red Summer was something new. Recently empowered blacks actively defended their families and homes, often with militant force. This behavior galvanized many in black communities, but it also shocked white Americans, who interpreted black resistance either as a desire for total revolution or as a new positive step in the path toward black civil rights. In the riots’ aftermath, James Weldon Johnson wrote, “Can’t they understand that the more Negroes they outrage, the more determined the whole race becomes to secure the full rights and privileges of freemen?” Those six hot months in 1919 forever altered American society and roused and terrified those who experienced the sudden and devastating outbreaks of violence.

 

X. Conclusion

World War I killed millions and profoundly altered the course of world history. Postwar instabilities led directly toward a global depression and a second world war. The war sparked the Bolshevik Revolution, against which the United States would later wage the Cold War. It created Middle Eastern nations and aggravated ethnic tensions that the United States could never tackle. By fighting with and against European powers on the Western Front, the United States ensured that its place in the world would never be the same. The whipping up of nationalist passions poisoned American attitudes toward radicalism, dissent, and immigration. Postwar disillusionment shattered Americans’ hopes for the progress of the modern world. The war came and went, and left in its place the bloody wreckage of an old world through which the world travelled to a new and uncertain future.

 

This chapter was edited by Paula Fortier, with content contributions by Tizoc Chavez, Zachary W. Dresser, Blake Earle, Morgan Deane, Paula Fortier, Larry A. Grant, Mariah Hepworth, Jun Suk Hyun, and Leah Richier.
