April/May/June 2021 | Washington Monthly
How to Fight Authoritarianism
https://washingtonmonthly.com/2021/04/04/how-to-fight-authoritarianism/ | April 5, 2021

To manage the rise of China and other illiberal forces, the U.S. and Europe need a new kind of alliance.

[Illustration: Biden, Merkel, and Xi]

As he spoke from the sunny steps of the U.S. Capitol during his inauguration, Joe Biden acknowledged that this will be “a time of testing.” He enumerated the crises we face—“an attack on democracy and on truth, a raging virus, growing inequality, the sting of systemic racism, a climate in crisis, America’s role in the world.” He vowed to “repair our alliances and engage with the world once again.”

Despite the president’s strong vision, the events of January 6 remain at the forefront of American concerns: a successful assault on the halls of Congress; a Republican Party in thrall to a politically gifted, defeated, and vengeful demagogue and his supporters; and the painful implications for effective, democratic governance. What has emerged in the near term is a struggle to enforce accountability for the past four years. But most observers understand that it will be impossible to fix the country without addressing the underlying conditions that spawned the ugly events: racism; growing inequalities in wealth and income; and large segments of the American population left behind in an economy driven by financialization and high technology. 

In the midst of all this, we are also facing increasingly severe challenges abroad. Recognition is now dawning across America that these challenges include not just terrorists but also China and Russia. These problems are bound together as consequences of our own choices in foreign policy, politics, and the economy. 

We have to recalibrate our policies at home and abroad. Americans can no longer assume that we will be the indispensable power—nor can we simply turn to the private sector to lead the country. We need a new way forward that steels us against both internal and external challenges. That way involves a much deeper, more structured relationship with the European Union and the United Kingdom.

Why the EU and the UK? There are several reasons. Combined, the continent and Britain are home to more than 500 million people, enough to put our collective population on closer footing with that of China. Both the EU and the UK are advanced economies, capable of sustaining important investments in research and infrastructure. But above all else, we must partner with the European Union and the United Kingdom because of our long-standing, shared liberal values and a shared recognition that those values are at risk. These common commitments—to freedom, privacy, equal opportunity, fair labor practices, environmental stewardship, and respect for the rule of law—are threatened both by authoritarian leaders in China and Russia and by practices the U.S. and the EU have condoned or failed to rein in, from tax havens that shield ill-gotten gains to monopolies that undermine entrepreneurs. Recommitting to these principles will allow us to reinvigorate our democracies.

Forming a robust, values-based partnership with the EU and the UK will be difficult. The United States has not done the best job of living up to many of these principles, especially under the previous administration, which may make our partners across the Atlantic doubt our reliability. It will therefore require that we make progressive policy changes that cannot easily be undone. For that reason, it would be best if this deeper, more structured relationship were codified by an agreement binding all parties to a pro-democracy reform agenda. 

A formal treaty is ideal, and a sufficient number of Senate Republicans, whose concerns about China have been ramped up by the previous administration, might be motivated to support one. Even without Republican votes, however, there are ways Biden and congressional Democrats could get a binding agreement over the finish line. One way or another, we need to strengthen the Atlantic alliance. Authoritarian demagogues, both domestic and foreign, are testing American and European commitment to democracy. We need to partner with each other to save this system of government—and ourselves.

In the 1990s, the United States was the sole superpower. The American economy led the world by sheer strength. But in the 2000s, against the warning of some allies, we deployed our military power to the Middle East and Southwest Asia in response to the terrorist strikes orchestrated by Osama bin Laden. The invasion of Iraq was the most costly strategic blunder in U.S. history: We empowered Iran; unleashed a worldwide wave of terrorism and displacement; caused hundreds of thousands of deaths; wreaked quiet havoc at home; and distracted our country from its global responsibilities for the better part of two decades. As our economy fitfully recovered from the Great Recession, we continued to struggle with ISIS in Iraq and Syria, remained bogged down in Afghanistan, and contended with Iran.

At the same time, we came to see growing challenges from an increasingly authoritarian China and Russia. Today the American national security community is unified in understanding that China presents a long-term and increasingly profound threat to fundamental American values and interests. Russia is a dangerous spoiler, growing ever more closely aligned with China. 

With its 1.4 billion people, China has sustained unprecedented economic growth for more than three decades; its economy will soon overtake that of the U.S. in GDP. The Chinese middle class alone is larger than the population of the United States. The country’s production of steel is 10 times that of America; its automobile production twice as great; its intellectual property production—measured by patents filed—two and a half times as great. For 11 years, China has graduated more science and engineering students than the United States and the six largest EU nations combined. Chinese technology is on a par with the U.S. in many sectors, and perhaps more advanced in fields like artificial intelligence and quantum computing. 

China’s government maintains its legitimacy by promoting historic Chinese nationalism and delivering wealth and higher standards of living to its people, but its grip is enforced through surveillance and repression. President Xi Jinping has increasingly concentrated power personally, and he takes an assertive and expansionist view of China’s role and responsibilities. Under his leadership, the country has claimed and militarized the South China Sea and is pushing territorial claims against both India and Japan. China continues to strengthen its economy with vast flows of foreign exchange from exports and sales of debt, and also uses these funds to invest in infrastructure abroad through its Belt and Road Initiative and other programs. It is also a major investor in U.S. Treasury debt. China brings a so-called whole-of-society approach to its strategic ambitions: Every business, every student, and every investment is potentially in service to Xi’s dream of a greater China and is a means to collect information, exert pressure, or gain dominance. 

China has for decades drawn technology from the West through foreign investments, theft, and cybercrime. According to its most recent Five-Year Plan, China wants increased self-sufficiency from Western technology, and to further its own technological supremacy. Simultaneously, China is seeking freedom from the U.S. dollar in trade and investments, as well as the creation of a new set of global institutions that would replace those established by the United States at the end of World War II. Simply put, China wants to replace the U.S. in global leadership and impose a China-centric, power-dominated global order.

While China doesn’t seek war in its quest for global leadership, it is assembling the military power to protect and project power globally. China’s armed forces now possess stealth aircraft, by far the largest navy in the Pacific, two (soon to be three) aircraft carriers, and an exquisitely developed, missile-backed, anti-access/area denial strategy aimed at deterring the United States. In the event of a conflict in the western Pacific, China could possibly establish regional military dominance in the South China Sea and the western Pacific through the use of its aircraft carriers, land-based bombers and attack aircraft, long-range, maneuverable ballistic missiles, stealthy submarines, and excellent targeting capabilities. While Taiwan is not without defenses, the U.S. would be challenged to defend it effectively if China mounted a full-force attack on the island. Taiwan’s safety relies on deterring conflict, not winning it. China is establishing its “string of pearls” naval bases across the Indian Ocean and into Africa. China’s modernization is reaching into space, with lasers, rockets, and cyberweapons that have the potential to destroy satellites at all altitudes—critical to American military communications and navigation—and an ambitious manned space flight program. As China increases its pressure campaign against Taiwan, the threat of conflict is growing. 

For decades, China’s only ally was North Korea—now a nuclear power with the capacity to strike the United States. But today, China and Russia are aligning policies and actions more closely than ever. Russia, since its war with Georgia, has undertaken significant military modernization. Always strong in air defense, it has produced new hardware for air and ground warfare, new technologies, including hypersonic and nuclear weaponry, and effective expeditionary forces. Russia is now in Syria, Libya, and several locations in Africa with its uniformed forces or mercenary organizations, like the Wagner Group. It retains formidable nuclear power.

Like China, Russia has become an increasingly determined rival of the United States, contesting American activities abroad diplomatically and with military advice, assistance, and intervention. It is also undercutting the United States and NATO through a variety of soft-power means. Russian President Vladimir Putin aims to dominate western Europe through supplies of natural gas delivered via various pipelines, including its nearly completed Baltic pipeline, Nord Stream 2, plus reconstruction of Syria’s oil infrastructure, and efforts to control Libya’s oil fields. Russian money flows both legally and illegally into the West, where it buys friends and influence and corrupts democratic processes. Russian cyberactivity spreads disinformation, disrupts elections, and affords opportunities for destructive interference in Western economies and institutions. Russia’s white Christian supremacist doctrine is peddled throughout the West, where it has fueled nationalist sentiments in Hungary and Austria and has reached deeply into the United States. 

For more than a decade, U.S. national security strategists have recognized that these growing challenges abroad cannot be addressed without first fixing what’s wrong here at home. Foreign-backed disinformation campaigns thrive in a damaged polity, and 40 years of delegitimizing the federal government and increasing our reliance on the private sector have left the economy frayed and social structure weakened. While there has been significant GDP growth with the emergence of new technologies, especially in communications and information processing, most Americans sense that something is amiss—and they are right. During this time an increasingly fragmented and costly education system has been failing to prepare young people for the modern economy. Millions have been left without access to affordable health care despite our costly medical system. Trade agreements, outsourcing, automation, and information technology have gutted American industry, leaving deep pockets of unemployment and underemployment in a system that delivers an ever-increasing concentration of wealth and income at the top as the economy has become financialized. We are suffering a crisis tied to pain-relieving opioids, which saps the energy and economy of much of rural America.

Unfortunately, the American political system has not faced up to the challenges. Partisanship has steadily deepened throughout the country. Struggles over private school funding and alternatives to public schools and teachers’ unions still rage. The Affordable Care Act was never fully implemented, lacks a public option, and has been bitterly fought over for a decade. President Donald Trump started a trade war with China and promised to bring back the coal industry, but he failed to halt the economic hollowing out of American manufacturing, most recently exposed in the nation’s great dependence on foreign suppliers for critical personal protective equipment. Where manufacturers have invested, the increasing reliance on robotics and automation has prevented significant growth in manufacturing employment or the return of organized labor. Even a $15 minimum wage is highly contentious. Opioid manufacturers have been taken to court, but legal accountability has failed to stem the epidemic. No infrastructure bill has been passed despite more than a decade of promises and a very clear need. COVID-19 has killed more than 500,000 Americans while deepening the problems of unemployment, lost wages, and growing inequalities of income and wealth. When all of this is combined with a pervasive climate of disinformation on the political right, it’s little wonder that Trump and his allies were able to incite an angry mob to storm the Capitol.

But even as the Biden administration attempts to address these domestic issues and “build back better,” most strategists recognize that we cannot maintain our place in the world, and our security at home, without relying on our partners overseas. Allies can potentially offset the weight of China’s huge population and formidable economy, with everything that entails in terms of talent, attractive markets, and investible surpluses. Allies can reinforce America’s voice in international institutions, resist the blandishments of strategically significant Chinese and Russian investments, work against money laundering and state-sponsored corruption, and embargo the release or transfer of sensitive technologies to China and Russia. We can help encourage Xi and Putin to modify their aims and fold their countries peacefully into extant international institutions and the rules-based international order. 

China, of course, knows this, and as part of its long-range strategy it has worked steadily and systematically to invest abroad and, simultaneously, to undercut American influence with friends and allies. At first, the major area of contention was Africa. Stepping into the aftermath of decolonization and conflict, Chinese army engineers deployed to build roads, the China Rail Corporation and others sought to build railways to connect the continent, and the Chinese government began to offer very attractive loans to African nations looking to fund various endeavors. All of this both assured China access to Africa’s raw materials and also undercut Western influence. Central and South America have been open to Chinese interest and investments, including shipping interests in Panama and deep commercial and agricultural ties with Brazil, Peru, and others in South America. Southeast Asia is all too aware of China’s assertiveness and aims, but is nonetheless heavily reliant on China as an economic engine and is susceptible to Chinese pressures. However flawed the Trans-Pacific Partnership might have been, its cancellation opened the door to greater Chinese influence in the region and cast doubt on America’s long-term dependability as an ally. The historic enmity between China and Japan and the legacy of the Korean War have stymied Chinese expansionism into Northeast Asia, though both Korea and Japan are heavily engaged with China economically. 

This leaves Europe the remaining and most important area of contention. To manage China’s ascent, we need Europe’s help. Thankfully, we already have close relationships with almost all of its states. The U.S. is the largest investor in the European Union, and the EU nations are the largest source of our direct foreign investment. While its GDP has failed to grow much over the past decade, the EU has technologies and heavy industries either on a par with ours or superior. 

But although Europe has been deeply reliant on the U.S. for almost a century, the relationship has always been fraught with tension. Europeans resented American efforts to push decolonization in the 1940s, ’50s, and ’60s as well as our refusal to share nuclear technology. Germany always felt the magnetic pull of markets to the east, and a large measure of guilt and regret over the tragedy of World War II. France struggled to maintain its language, culture, and economy against what one prominent French politician and intellectual called “the American challenge.” From the beginning, Americans complained that Europe was not bearing a fair share of the defense burden within NATO. The formation of the European Union itself was an effort to preserve cultures and relative freedom of action against the American superpower. Most Europeans opposed the U.S. invasion of Iraq in 2003. When the U.S. pivoted to Asia in 2011, Europe felt a further sense of abandonment. And by early 2021, after four years of criticism and rebuke by Trump, America’s reputation for reliability and competence had been diminished. 

For China, this has been a tremendous opening. Europe has technologies China covets, is a growing market for Chinese exports as well as a source of capital, and is a general geostrategic target. China has assiduously courted European favor and recently surpassed the U.S. as the largest of Europe’s trading partners. China loves Italian fashions and owns the port of Piraeus in Greece. Rail shipments now connect China and Spain. In December 2020, China and the European Union finished negotiating the EU-China Comprehensive Agreement on Investment, a deliberate effort by some in Europe to gain greater distance from the U.S. For China, it was a successful wedge between the U.S. and Europe. In the U.S., it was received with shock. 

In view of the Russian challenge in Europe, providing our allies the security of the American nuclear umbrella and military deterrence is necessary—and we must support the military modernization and deployments necessary to maintain it. But especially when it comes to China, deterrence and conventional economic arrangements will no longer be sufficient to protect our interests—or theirs. We cannot simply appeal to fear alone. Regaining America’s global leadership requires renewed military commitments, as well as convincing Europe to view alignment with a democratic, fractious U.S. as more attractive than linkage with the rising power of the authoritarian but effective China.

Traditionally, when binding itself more closely to overseas partners, the United States has had separate alliances and trade agreements—broadly related but in two different channels. In the former case, the focus is on military risks, plans, and exercises. As the largest power, and with global interests, American capabilities dominate the alliances, like NATO, and focus on deterrence. In the latter, the U.S. seeks agreements that promote business opportunities abroad and serve consumers at home. These agreements are usually laboriously prepared to protect domestic interests, and there is the inevitable give-and-take for consensus and ratification with negotiating partners and at home. 

Facing the China challenge requires blending together our traditional security alliances like NATO with broader economic and social commitments that enable us and our allies to work together to contend with the challenges we face. This will not be easy. Europe has deep and understandable concerns about America’s reliability as an ally. While NATO survived the last administration, Trump’s many disparagements of the alliance rattled its foundations. Meanwhile, efforts begun under the Obama administration to negotiate the Transatlantic Trade and Investment Partnership, or TTIP, collapsed under Trump, leading the two continents to go their separate ways; the U.S.-China “Phase I” trade agreement was announced in early 2020, and a China-EU investment deal was announced in December. 

But there is a decisive opening for the United States: values. There is mutual recognition among most governments in Europe, and among our new government here, that we need to defend liberal democratic principles from sustained attack, and the U.S.-China and China-EU agreements do not effectively address that. (Nor, for that matter, did TTIP.) Instead, they are about economic interests and access to markets. The EU-China agreement contains only weak promises on transparency and labor standards, and even those are unlikely to be kept. 

It is through this shared goal that the U.S. and Europe can revive and deepen their relationship. Thankfully, the new administration is already taking steps in this direction. Biden has announced his intent to convene what he calls a “Summit for Democracy” with democratic countries from around the world. This provides the framework within which a variety of efforts, programs, and policies can be developed and sustained. Of course, the relationship with the EU will give the summit its scope and power, but we should also invite our Asian democratic allies. Through the summit, we should move forward with the EU on crafting shared policies along multiple tracks: security; finance and investment; the environment and climate change; labor policies and trade; and democratic values. 

In each track, progress will be driven by mutual interests, but the efforts in one track can be reinforced by work in another. For instance, efforts in the security track to identify and preclude sales of sensitive hardware to China and Russia might be allied with greater transatlantic research and development programs or some relief for Europe from our policy to “buy American” with defense procurements. Restricting Chinese and Russian investments in sensitive European infrastructure might need to be related to greater American and European Central Bank financing, perhaps with new programs for financing European infrastructure. Certain efforts, like achieving vehicle electrification and phasing out older power plants, might benefit from greater coordination in transatlantic and U.S.-Japanese incentive programs. The art of diplomacy is to manage and coordinate multiple channels and means of influence to achieve the desired result, and creating a new architecture for collaboration would begin with tying many issues together. 

We would ask Europe to restrict tech companies like Huawei that might compromise security, or Chinese and Russian infrastructure or other strategic investments that could further either country’s influence. We would also seek assurances that our allies would not export sensitive technologies and capital goods to these countries, or invest in military-related Chinese and Russian enterprises. We would seek an end to the threat of the Nord Stream 2 pipeline, a project that would increase Europe’s vulnerability to Russia. 

The Europeans will also have needs and interests. The American market must be able to welcome more German and Italian heavy industrial products that would otherwise seek outlets in Russia, as well as German and French high tech that might otherwise find a readier reception in China. To help eliminate the pipeline, the EU will want a more ready supply of natural gas from the United States. 

But perhaps the most significant demands will be more fundamental. As part of any deepened partnership, Europe will want American commitments to move toward a carbon-neutral economy by a fixed date. European politicians will also want to understand the mechanisms to make this happen, regardless of which political party is in power. They will also seek assurances on labor standards, data privacy, monopolistic practices, and the social safety net; this will require U.S. commitments to raise labor’s share of the GDP and to promote workforce development and safety net programs on both sides of the Atlantic, as well as tackling other tough issues at home.

In other words, bringing Europe successfully into alignment on U.S.-China policies will require a greater alignment of our economy with some “European” values. This is not far from what the Biden administration and the increasingly strong progressive voices here have been seeking: strengthening the social safety net; reducing gross inequalities in opportunity, income, and wealth; and trying to make the cost-price models of the private economy more inclusive of the externalities they generate, in everything from greenhouse gas generation to pushing millions of families into greater reliance on debt and government programs. These are some of the issues on which the American political system has faltered in recent decades. 

Still, seeking the best of both systems provides benefits on both sides of the Atlantic. Europe has consistently lagged behind the U.S. and China in economic growth and recovery and is, consequently, also struggling with resurgent nationalism. Europe should welcome deeper economic engagement with the United States. But in return, they will expect us to stop some of the vicious cost cutting at the expense of labor and its benefits that has come to define transatlantic industrial competition.

The design and process of negotiating the new agreement should be carefully considered. The discarded TTIP was to be detailed and prescriptive. There would have been winners on both sides of the Atlantic, but there would have been losers, too. It was crafted over three years of negotiations, and, like most trade talks, it was cloaked in secrecy. The same anxieties will be exposed in working on this agreement, but the issues—our national security, economic well-being, and the future of democracy—are even more pressing. A different design and process must be adopted. Instead of rolling out a finished agreement after detailed and meticulous negotiations, the process must begin with alternative structures in mind. 

This new agreement would ideally take the form of a treaty alliance. Discussions would follow the model of NATO; the talks that produced that treaty were concluded in little more than a year, following a coup in Czechoslovakia and the Soviet shutdown of ground access to Berlin. The alliance has survived because it was based on broad principles and worked as a living organization, evolving to face new challenges and to continuously meet the interests of member nations. As a treaty, it presented member states with legally binding obligations. 

Perhaps this new agreement could be called the “Democracy Partnership.” There could be a relatively rapid set of discussions that could lead to broad but binding commitments in principle, followed by phased implementation via treaty. Similar to NATO, it could have a secretariat and representatives from its member states. There could be periodic meetings at the ministerial and head-of-state levels, perhaps synchronized with the NATO schedule. States would make common commitments and then develop and implement them. The European Union’s role would be a matter of particular consideration. On which issues can it represent its member states? Are its decisions on particular issues recommendations or obligations? Can the EU be made into a stronger partner?

Efforts would likely begin with commitments on climate change and the protection of sensitive technologies. Then there would be agreements on how to consider and approve foreign infrastructure investments—the current EU and U.S. procedures need to be jointly reviewed. Preferential investment incentives in our respective critical infrastructure must also be developed. There are models and memories of such efforts within NATO and its infrastructure program, but now the needs are broader. Unlike TTIP, investment incentives would be geared not so much to the financial sector as to real investments, creating the modern and resilient communications, data, banking, transportation, and power systems that are the lifeblood of modern economies. This could reinforce the long-awaited American infrastructure program and link it to similar investments in the EU. 

Strong relationships already exist among central banks on both sides of the Atlantic, but these could be further structured to harmonize more effective fiscal and monetary policies within the EU. Chinese access to U.S. and European capital markets should be more finely regulated to prevent feeding the industrial and military machine that threatens democratic governance in the West. More broadly, Western financial institutions and corporations need guidelines that assure that their pursuit of profits is in line with the security of democratic values and governance, even at the expense of their fiduciary obligations to maximize returns to investors. Basic research and development will need greater investment, and intellectual property will need more rational protection. There would be new measures to harmonize antimonopoly practices and data and privacy considerations. Many of these issues are already actively under discussion or in the court system. What is new is seeking agreements on these issues in multiple forums consistent with broadly agreed transatlantic principles. 

Can such a framework be crafted and approved in the United States and Europe? There are serious obstacles. The GOP is hostile to progressive change, and it will be difficult to get 17 Senate Republicans to join Democrats in voting for a treaty that binds America to stronger regulatory standards.

But for three years, the Republican Party has worked against China with protective tariffs and weapon sales to Taiwan. It has staunchly criticized China on the issues of Taiwan, Hong Kong, and COVID-19, and focused on the threat of Huawei and other Chinese tech companies. China might be the single most powerful issue around which Democrats and Republicans could unite. Certainly the Republicans will favor stronger military and military-related efforts to constrain and contain China. Only private consultations with the Republican leadership can ascertain whether they realize that confrontation with China will not be successful if we go it alone, and determine how much support the party is willing to give. 

Europe’s deepening ties with China and its concerns about the U.S. also pose challenges. But the promise of economic assistance, such as more infrastructure financing, could help sluggish EU economies sign on. Progressive policy change in Washington, such as the kind Democrats are currently pushing, will also help our promises appear more credible.

If a formal treaty-based organization is not immediately feasible, then perhaps the Summit for Democracy could lead to the creation of a permanent secretariat to coordinate, raise, and steer issues. Such an arrangement would not legally bind member states, but there could be a preliminary agreement to cooperate, followed by more traditional diplomacy. 

One key criterion, however, is that the summit would need to generate prompt actions—if not a treaty, then promises to work on related issues and firm deadlines for agreement and implementation. The secretariat, for example, could condition participation in future summits on passing a nontraditional but still enforceable trade agreement requiring that states fight climate change, combat monopolization, and tighten investment rules to prevent China and Russia from gaining access to critical infrastructure. In the U.S., trade agreements can be ratified via the fast-track process, which is not subject to the filibuster, so Republican support would not be essential.

Even if we must start small, both the U.S. and Europe might find that momentum for more change will build once we begin to implement a values-based agreement. After all, this deal would help fix the institutional failures that have allowed isolationism and nationalism to thrive in the U.S. and Europe. For four decades, it has been a mantra in the U.S. that the private sector can do it better than government. Yet it is government that must provide the boundaries, priorities, resources, and incentives that enable the private sector and markets to work for the public good. In Europe, government action has been hobbled by the criteria laid out in the EU’s founding treaty, fears of inflation, and notions of austerity that hearken back to mistaken economic ideas of the 1930s. The resulting public suffering has led to declining respect for governmental and European-wide institutions. 

The question, then, is whether nations are willing and able to begin this hard work. Policymakers in the United States and Europe recognize the extraordinary systemic challenge posed by China. But dealing with economic and political interests is far more challenging for democracies than agreeing on military policies to deter or halt aggression. Diplomats and leaders must have the courage to take the necessary risks, because as China strengthens its global reach, such agreements will only become more difficult. 

China will, of course, work against such arrangements. For centuries, its emperors sowed division and conflict among the Mongol tribes north of the Great Wall in order to protect their country. China will use its traditional tools of blandishments, rumors, and innuendo. It will also use the magnetic appeal of its culture and rising wealth as well as its modern technologies. 

But it is to China’s benefit as well as our own that we create a stronger transatlantic linkage. The West must continue to trade with China and collaborate in areas of common interest, such as climate change; for China, a stronger U.S.-European alliance would provide a more reliable and capable partner in dealing with global issues. It would also give the country a more rational, predictable competitor, and a stronger reason to look for alternatives to coercion and the use of force. Ultimately, this agreement would not be an anti-China alliance; it would be, rather, an effort by a group of democracies to find support for their own values and interests as the emerging economies of the 21st century rightfully move into global prominence. 

Going forward with this new transatlantic agenda would be one of the most visible and effective means of restoring American global leadership, as well as dealing with the painful issues exposed on January 6. There is no time for delay.

America’s Next Insurgency https://washingtonmonthly.com/2021/04/04/americas-next-insurgency/ Mon, 05 Apr 2021 00:40:15 +0000 https://washingtonmonthly.com/?p=127559

The January 6 violence could signal the start of nationwide conflict not seen since the Civil War. Can we stop it?


Bleeding Kansas began with an eviction attempt. In late 1854, Jacob Branson, an abolitionist from Ohio, started trying to kick Franklin Coleman, a slavery proponent, off his property. Roughly a year later, Coleman ran into a friend of Branson’s at a local blacksmith’s shop. The friend berated Coleman for continuing to squat on the land and demanded that he desist. It’s not clear what, if anything, Coleman said in response. But it is clear what he did. As the friend walked away, Coleman took out a gun and killed him.

Fearing reprisal in what was a largely antislavery community, Coleman fled to a nearby town and turned himself in to a proslavery sheriff. That sheriff promptly freed him and then arrested Branson. Local abolitionists, many of whom were already furious about the murder, grew incensed. They intercepted the sheriff at gunpoint and liberated his prisoner.

News of the murder, arrest, and jailbreak spread rapidly across Kansas. Both proslavery and antislavery activists formed militias, and it wasn’t long before violence began to erupt. On May 21, 1856, 800 slavery supporters sacked the city of Lawrence—home to the state’s antislavery leaders—looting houses and murdering one resident. In response, a group of abolitionists led by John Brown killed five proslavery settlers in Franklin County. Hundreds of slavery supporters retaliated by attacking an antislavery settlement in the town of Osawatomie, murdering several locals and burning most of the village to the ground. Abolitionists then drove proslavery forces out of Linn County. Slavery proponents next pulled 11 antislavery settlers from their homes and shot them down.

Bleeding Kansas is, per its name, most famous for the bloodshed. But the clash went further than raids. The two camps established rival territorial administrations, each claiming to represent the entirety of the state. They drafted their own constitutions, passed their own laws, and egged on their side’s combatants. Both petitioned Washington for official recognition. But the U.S. capital, itself polarized by disputes over America’s original sin, was unable to decide which group ought to be in command. It was not until the South seceded that Kansas was finally admitted to the Union.

There are many critical differences between the 1850s and today. The government is now far more expansive and powerful than it was in the antebellum era. There is no modern problem as singular and overriding as slavery was; we are instead polarized over many issues. And while there are geographic dimensions to our divisions, they are not nearly as clean as those that once split the U.S. Much like territorial Kansas, almost every American state has its own union and its own confederacy.

But there are also clear parallels. The present United States may be more polarized than it has been at any time since the 1850s. Large swaths of the population simply refuse to accept the election of political opponents as legitimate. Many of the social issues that divide us, in particular questions of systemic discrimination, stem from slavery. 

Most frighteningly, research suggests that a growing number of Americans believe that political violence is acceptable. In a 2017 survey by the political scientists Lilliana Mason and Nathan Kalmoe, 18 percent of Democrats and 12 percent of Republicans said that violence would be at least a little justified if the opposing party won the presidency. In February 2021, those numbers increased to 20 percent and 28 percent, respectively. Other researchers have found an even bigger appetite for extreme activity. In a January poll conducted by the American Enterprise Institute, researchers asked respondents whether “the traditional American way of life is disappearing so fast that we may have to use force to save it.” Thirty-six percent of Americans, and an astounding 56 percent of Republicans, said yes.

All of this raises a serious question: Could the United States experience prolonged, acute civil violence? 

According to dozens of interviews with former and current government officials, counterterrorism researchers, and political scientists who study both the U.S. and other countries, the answer is yes. “I think that the conditions are pretty clearly headed in that direction,” says Katrina Mulligan, the managing director for national security and international policy at the Center for American Progress and the former director for preparedness and response in the national security division at the Department of Justice (DOJ). The insurrection on January 6 “was a canary in the coal mine in a way, precisely because it wasn’t a surprise to those of us who have been following this.”

“Unfortunately, I think it’s a heightened risk,” Janet Napolitano, the former secretary of homeland security, told me. As evidence, she cited the Capitol attack, as well as “the rhetoric that’s being exchanged on social media, and just the number of groups out there that are organized and don’t seem reticent about using violence.”

Scholars of conflict differed in their estimates of how much violence might erupt, from sporadic terrorist attacks to a sustained insurgency. Individual assaults could be successfully handled by local and state police, but they could also easily escalate into a broader conflagration requiring federal involvement and inspiring copycat attacks. Experts also listed a wide range of potential targets, from Democratic politicians and institutions affiliated with minority groups to city halls and state government buildings.

But officials and researchers overwhelmingly agreed on the main source of the threat: the radical right. Despite overwrought warnings of “antifa,” it has been extreme conservatives who have driven into crowds of protestors, killing liberal activists. No leftists have murdered police officers or security guards, as right-wing fanatics did last summer in California. Progressives have not called for a race war or the bloody overthrow of the federal government. “Primarily, this is a far-right problem,” Napolitano said. “We saw it pretty clearly expressed on January 6.”

That, however, could shift. The modern American left does have a violent tradition. During the 1960s and ’70s, groups including the Weather Underground bombed banks, statues, and major government buildings. The Black Lives Matter protests in the summer of 2020 were overwhelmingly peaceful, but some demonstrators looted stores and destroyed police vehicles. And when Donald Trump’s supporters protested in Portland wielding paintball guns, the far-left activist Michael Reinoehl shot and killed one of them. His justification—self-defense—is both inexcusable and telling. If right-wing agitators continue down an increasingly extreme trajectory, and if the state does not stop them, it is easy to imagine liberals growing less pacifistic.

Unfortunately, none of the officials I spoke with thought that any agency—from local and state law enforcement to the Department of Homeland Security (DHS) and the armed forces—is fully prepared for the challenges posed by domestic terrorism. At least for federal employees, this should be expected. After four years of working under an administration that courted extremism rather than combated it, many bureaucrats and officers are just getting up to speed.

“In my entire 40 years in the military, from Annapolis to supreme allied commander of NATO, I never gave a thought to these challenges,” says James Stavridis, a retired four-star Navy admiral and the former dean of the Fletcher School of Law and Diplomacy at Tufts University. “I suspect my successors in the Pentagon at the four-star level have spent a huge amount of time thinking them through over the past 12 months.”

These officials—and their peers in the DOJ and DHS—should be able to scale up fast. Because the federal government has such strong surveillance measures, it is very effective at penetrating and eliminating terrorist cells. It has powerful law enforcement agencies and the world’s most well-funded military, meaning that it retains an overwhelming force advantage. As a result, full-scale civil warfare is highly unlikely.

But as a tool of counterinsurgency, force has serious drawbacks. It is tricky to devise operations that target violent extremists without also targeting nonviolent ones. Sometimes, counterinsurgency measures sweep up random civilians. As a result, they can quickly generate backlash, further radicalizing both militant groups and the broader public. And even if every operation is perfectly precise, violence does not have a good track record of changing hearts and minds. 

To truly end an insurgency, the government must address the underlying social conditions that allow terrorism to thrive. It must build trust with alienated communities, which means finding partners that are welcomed by hostile populations even if the government itself is not. Part of why America failed in Iraq and Afghanistan is that in both places, wide swaths of the country simply rejected the authority of the U.S.-backed Iraqi and Afghan governments.

Unfortunately, the Biden administration might not have much more luck fighting insurgents on the home front. The economic dislocation and racism (and other misplaced cultural grievances) that are driving discontent are not easy to fix, especially with our knotty political system. And even if the president can tackle these challenges, the institutions that are trusted by the right—incendiary conservative politicians, Fox News, talk radio grifters, Facebook commentators obsessed with “owning the libs,” and, above all else, Donald Trump—have no incentive to stop peddling lies or to cool their tone. Hate works to their political and financial benefit.

“We can run around and do targeting operations. The FBI can sweep up dudes nonstop,” says Jason Dempsey, an adjunct senior fellow at the Center for a New American Security and a former special assistant to the chairman of the Joint Chiefs of Staff. But political violence is, ultimately, a political problem. So long as the GOP remains in thrall to the far right, attackers will have enough support to regenerate. “If you don’t address that,” Dempsey says, “then no amount of tactical action will ever get you ahead of the game.”


Does America have enough extremists to sustain an insurgency? Mason and Kalmoe’s research suggests that tens of millions of Americans view political violence as acceptable. This doesn’t mean that tens of millions of people are willing to commit violence themselves. But they don’t need to be. According to The New York Times, between 15,000 and 20,000 Americans belong to militias. If there’s at least tacit outside backing, that’s more than enough potential actors. “These groups are in the hundreds, and membership is in the five digits,” says Linda Robinson, a longtime foreign correspondent covering the Middle East and the director of the center for Middle East public policy at the RAND Corporation. “This puts it up at a parallel with some of the more significant armed insurgencies in other countries that many of us have spent years studying.”

The raw numbers, of course, do not tell us much about the precise nature of the threat. America’s militia scene may be large, but it is also diverse and chaotic. Some groups, like the Proud Boys, are avowed misogynists. Others, like the Three Percenters, welcome women members. The Oath Keepers recruit heavily from the police. The Boogaloo Boys, meanwhile, encourage violence against them. Without a clear hierarchy and leadership, America’s militias would find it impossible to wage organized warfare against the federal government. That is part of why a redux of the 1860s is currently unlikely.

But international experience suggests that disorganization among insurgents is no impediment to sustained violent activity. Indeed, for many states struggling with serious civil conflict, diffuse terrorist networks are the norm. Both Robinson and Mason, for example, told me that America’s budding domestic terrorism scene in some ways resembles the structure of al-Qaeda. While al-Qaeda may have slipped off many Americans’ radar screens since Osama bin Laden’s death, the movement has survived two decades of sustained assault from international militaries because its dispersed setup—multiple branches with minimal overlapping infrastructure—makes it very difficult to completely dismantle. It is alive and well in the Middle East and, especially, in Africa. Over the past two years, it carried out grisly attacks in Mali, Somalia, and Kenya. 

For counterinsurgency forces, fighting a dispersed network poses special challenges. A clearly defined enemy can certainly surprise, but if it has a stated agenda and staked territory, there are known battle lines. When the opponent is a collection of groups with differing aims, there is an especially wide range of targets. In the United States, those targets include some obvious marks—Democratic politicians, Republicans who won’t help steal elections, Black churches, synagogues, shopping centers popular with immigrants. But antigovernment extremists could also select targets that are more idiosyncratic. Investigators have speculated that the Nashville man who blew up his van near an AT&T facility in December 2020 may have been inspired by conspiracies about 5G technology. In February 2021, a group of anti-vaccine protestors temporarily shut down a mass vaccination facility. The next such demonstration might not be conducted peacefully.

Successful attacks against any of these people or places would have clear negative consequences for national security. But when it comes to our political stability, experts say, the most frightening targets are government properties. “I think what we should be most concerned about here are attempts to take over state institutions by force,” says Yuri Zhukov, an associate professor of political science at the University of Michigan who studies domestic and international political violence. Doing so, he told me, is usually a prerequisite for seizing and controlling territory: in other words, for starting a more serious civil conflict.

Violent militias are unlikely to successfully occupy the Capitol again, given the complex’s heightened security. They will also struggle to overtake state capitols, which have likewise grown more fortified after armed protests in Lansing and Richmond. But Zhukov and others are concerned about local government buildings, which are much easier for extremists to attack. Indeed, for some of the world’s most successful insurgent groups, municipal facilities were the first sites of violence. 

In Ukraine, for example, residents who opposed the 2014 Euromaidan revolution—in which pro-Western protestors forced the pro-Russian president out of office—began destabilizing the country’s more Russia-aligned eastern provinces by seizing city property. “Protestors come to the square, they get whipped up into a frenzy, then they’re told to march on the local administrative building, like the town council or police station,” Zhukov said, describing what happened. Fueled by regional grievances, opportunistic politicians, and Russian propaganda, they broke down doors, smashed through windows, and streamed into the facilities. Local police, either overwhelmed or sympathetic to the insurgents’ aims, failed to hold them off. The uprising was contagious, and town after town across the country’s eastern flank fell to insurgents. Eventually, it was impossible for the government to dislodge them without turning to its military. Later, once Russian troops got directly involved, it became impossible to dislodge them altogether.

It is hard to envision America’s messy militia scene destabilizing the United States in a similar manner. But it is possible that U.S. militias could quickly grow more organized, or at least more orderly, if even one group took control of a local government building. “Once that happens in one city, people are going to try and replicate it somewhere else,” Zhukov said. “That’s kind of the nightmare scenario here.”


The United States is not Ukraine. It is a far more powerful country with better-equipped and better-trained security forces. And we do not share a border with a hostile adversary that likes to intervene in our politics.

But there are enough parallels to cause concern. Like Ukraine, we are highly polarized. Russia is not as forceful a presence here, but it has been injecting disinformation into our political discourse and trying to destabilize our government. There are emerging linkages between America’s far right and Vladimir Putin’s regime. Certain American political elites are fueling extremism to gain more power. And there are those within our local police departments, federal security services, and the armed forces who sympathize with far-right groups.

“The two most common features of some of the worst civil conflicts are political elites instrumentally looking to use violence and mobilize people in pursuit of their own power ambitions, and divisions within the military,” says Michael Kofman, the director of the Russia studies program at CNA, a nonprofit research and analysis organization, and a fellow at the Woodrow Wilson International Center. To varying degrees, the U.S. has both.

There is a good chance that any sustained right-wing insurgency in America would follow the pattern Ukraine experienced. Militia groups, responding to some perceived tyranny by federal authorities—or perhaps to another liberal political victory—would attempt to seize city government buildings in a safely conservative state. If it worked, extremists elsewhere would attempt to replicate the attack.

But in some countries wracked by civil conflict, violence largely takes place in politically mixed communities. In Iraq, for example, attacks are generally concentrated in the same handful of provinces. According to Robinson, of RAND, these are places “where people have been in conflict for a very long time over basic issues, and where the government is either not addressing the violence or, in some cases, a shadow government is of like mind with the actors.” In 1855, Kansas bled for a similar reason: It was a seam territory, home to people with irreconcilable political differences who followed competing governments of uncertain legitimacy. A modern American insurgency could break out in a similarly split state. 

One candidate for that kind of violence might be Michigan. The swing state has a long history of militia activity, and it has a fiercely divided government. The governor, Gretchen Whitmer, is a Democrat, as are the attorney general and the secretary of state. But Republicans control the state legislature, and they have worked hard to undercut and delegitimize the executive branch. After the governor, attorney general, and secretary of state won their 2018 elections, the legislature spent a lame-duck session passing laws to limit the executive branch’s powers. Legislators have sued the governor repeatedly over COVID-19 regulations, have threatened to sue her if she allocates federal pandemic relief money without their approval, and have openly cultivated ties to antigovernment militias. 

This came to a head on October 6, 2020, when the FBI arrested six men for plotting to kidnap Whitmer. These men were radicals, but they were not isolated. At a May 2020 protest over COVID-19 rules, the state senate leader appeared on the same stage as one of the arrested men. It is not difficult to imagine a militia furious at Whitmer in particular, or at Democrats in general, trying to seize a state building with the tacit support of Republican legislators.

Whether the violence began in a safely red state or a contested purple one, the response to any insurrection would follow a similar chain of escalation. Under the U.S. law enforcement system, responsibility for fending off insurgents attacking a local or state government building would first fall to local and state police. In an ideal world, that is where it would end. 

Yet police officers are, on the whole, more conservative than the general population, and militia groups curry favor among law enforcement. Research suggests that well over 1,000 cops might actually belong to right-wing extremist organizations, a figure that does not capture the number that simply sympathize with radicals. Were a right-wing group to seize power, it is possible that the police would not work aggressively to arrest insurgents.

But even if the police fought the mob, they might struggle. Due to lax American gun laws, most militia groups are exceptionally well armed. The connections between law enforcement, the military, and extremist outfits mean that some militia members are also quite well trained. Dozens of the people involved in the January 6 riot were active or former members of the police and the military, and many employed technologies that are frequently used by those in uniform. Others used common military combat techniques. 

If local and state police were unable to stop extremists, or if insurgents targeted a federal building, the national government could take charge. There are a number of armed federal agencies that could potentially claim jurisdiction. In 1992, for example, the FBI commanded federal and local officers as they faced off against a white nationalist in Idaho accused of illegally selling weapons. The next year, agents from the Bureau of Alcohol, Tobacco, Firearms, and Explosives raided a compound in Waco, Texas, following reports that residents were stockpiling banned arms. When the raid devolved into a gunfight that killed four ATF agents, the FBI came in. The Department of Defense (DOD) even played a role, providing the FBI with specialized equipment.

In isolated instances, this is almost always enough. Sometimes, it can be too much. In its attempt to flush residents out of the Waco compound, the FBI accidentally set off a horrifying fire that killed almost everyone inside. The dead quickly became martyrs for antigovernment extremists. Timothy McVeigh said the disaster was one of the reasons he blew up a federal building in Oklahoma City in 1995, killing 168 people.

But in cases where disorder becomes widespread, federal agencies might not be the best fit. If a militia took control of more than one government building or started multiple violent riots, the governor might call on the National Guard. There is very recent precedent for this step, albeit not against the far right. In the rioting after George Floyd’s murder, Minnesota Governor Tim Walz deployed the National Guard in Minneapolis to try to stop the unrest. This is not a positive memory. It involved a justified protest being co-opted by violent actors, resulting in a forceful government response. But the Guard did eventually restore order. 

Yet if, as in Ukraine, a successful insurrection led to either copycats or a loss of territorial control, a state’s National Guard might not be enough. The governor might ask the president to invoke the Insurrection Act, federalizing the Guard and bringing in additional military forces. This, too, would have precedent. In 1992, President George H. W. Bush invoked the Insurrection Act at the request of California Governor Pete Wilson to try to stop the L.A. riots.

It’s also possible that the president would invoke the Insurrection Act without first receiving a governor’s request. If widespread violence broke out in a GOP-controlled state, the governor could refuse to fight it or, worse, could order state law enforcement to side with insurgents. With the state abdicating its responsibility, the White House could feel compelled to step in.

That might seem unlikely, but it has happened within the living memory of many Americans. In 1957, Arkansas Governor Orval Faubus deployed the state National Guard to prevent Black students from enrolling in Little Rock’s Central High School. In response, President Dwight Eisenhower federalized the Guard and sent in the 101st Airborne Division to protect the teenagers. On September 30, 1962, Mississippi Governor Ross Barnett ordered the state highway patrol to withdraw from the University of Mississippi so an angry mob of segregationists could stop James Meredith from enrolling in classes. The rioters shot at federal marshals sent to protect Meredith, attacked journalists, and beat up bystanders. Later that day, President John F. Kennedy federalized Mississippi’s National Guard and sent in U.S. Army units to quell the unrest. The military succeeded, but only after two people were killed and hundreds injured.

Ugly as these incidents were, they were discrete. There were no cross-state riots, no cascading acts of violence. But in the worst-case, Ukraine-style scenario—in which thousands of insurgents seized buildings, destroyed infrastructure, and simultaneously carried out other attacks across multiple parts of the country—the military would have to get more involved. In this situation, the president might direct the Joint Special Operations Command, the agency that chases terrorists in Iraq and Afghanistan, to look for domestic insurgents and put down attacks.

These troops would make fast work of any insurrectionist brushfire they were sent to contain. Tactically, special operations units are extraordinarily proficient. But as in the Middle East, that work would result in mass casualties and horrifying violations of human rights. In attempting to preserve the territorial integrity of the United States, the military might rip it apart. And maintaining the authority of the federal government could come at the expense of the rule of law.

Leaders of the U.S. Northern Command, the military branch that controls the DOD’s homeland defense efforts, did not respond to questions about planning for domestic conflict. Neither did the current assistant defense secretary for homeland defense—the DOD official who oversees domestic operations. In response to written questions about their concerns, plans, and capabilities, DHS and FBI spokespeople simply emphasized that they were paying close attention to domestic terrorism and would work collaboratively to combat it. In an interview, Mike Dugas, the provost marshal for the National Guard Bureau, told me the Guard has not done national-level planning for a counterinsurgency mission.

But Dugas said there was “heightened awareness” among the Guard about the threat of violent domestic extremists, and that it’s possible individual governors have taken steps with their own Guard to plan for an insurgency. Former DOD officials told me they considered ways in which to assist civilian law enforcement if domestic agencies needed specialized technology or expertise. And though these officials did not map out broad domestic tactical operations during their tenures, they suggested that the department’s thinking might be shifting.

“We did not, so far as I know, have a plan for an outbreak of a civil war,” said Tom Atkin, who served as acting assistant secretary of defense for homeland defense from September 2015 to January 2017. “And I guess that part of it is, prior to Trump’s presidency, who ever thought that would happen?”


The good news is that most of the people I interviewed thought that, while we must prepare for the worst, a nationwide insurgency remains unlikely. For starters, America’s robust intelligence capabilities make it difficult for would-be terrorists to carry out plans. The men who plotted to kidnap Gretchen Whitmer were caught before they could act. So were the Boogaloo Boys who sought to set off explosives at a Black Lives Matter protest. The January 6 insurrection was a high-profile national security failure. But even then, the government received warnings. With a newly attentive administration, it is less likely that future plotters will be able to act.

More importantly, the country has in recent memory faced domestic extremism and brought it to heel without spiraling violence. During the 1990s, militia groups and radicalized individuals in the Pacific Northwest, the Ozarks, and other locations emphatically rejected the authority of the federal government. They stockpiled weapons. They declared themselves “sovereign citizens” and refused to pay taxes. They set up kangaroo courts where they issued decrees and placed liens on the property of local authorities. None of these had the force of law, but they were intimidating. And some extremists were able to co-opt—at least to a limited degree—local police forces. 

But after the Oklahoma City bombing, the FBI began a sweeping crackdown. It infiltrated, broke up, and shrank many of the country's militias by arresting members who had engaged in criminal activity. Other members were merely taken in for questioning, but the experience was frightening enough to discourage further militia involvement. Many veterans of this fight are still in the government. The man who led the successful prosecution of Timothy McVeigh, Merrick Garland, is now the attorney general. He has made it clear that he will hold domestic terrorists to account.

Perhaps the most hopeful historical precedent, however, comes from the 1960s, the last time a presidential administration made fighting discrimination and expanding democracy central to its agenda. Much like today, these plans encountered fierce opposition. White people marched and rioted to stop integration. Right-wing politicians denounced the federal government as tyrannical and promised to fight to stop its plans. "Segregation now, segregation tomorrow, and segregation forever," Alabama Governor George Wallace declared in 1963. Later that year, he stood in the schoolhouse door at the University of Alabama to block Black students from entering, even after Kennedy federalized the Alabama National Guard. 

But eventually, and without the use of force, Wallace stepped out of the way. So did most other white southerners. Black people enrolled in previously all-white schools. They began sitting at the front of buses. Between 1960 and 1970, Black southerners went from being mostly disenfranchised to more than 65 percent registered to vote.

That’s not to say the end of Jim Crow was a peaceful affair. Black people and their allies were spat on and beaten up. Forty-one activists were killed. But given the rhetoric of the time, experts say it is remarkable that more people didn’t die. “If you took these yahoos at their word, and not just individuals but politicians, you’d say, ‘Oh God, I don’t know, 10,000 [killed] or something,’ ” said Robert Mickey, a professor of political science at the University of Michigan who studies the democratization of the American South. “But no, they were full of it.”

Yet in some ways, the present moment feels more frightening than the 1960s. Part of that is geographic. Today's conflict is national, rather than concentrated in the South. Part of it is polarization. The Democratic Party of the 1960s included both Wallace, southern apartheid's most vociferous defender, and Lyndon B. Johnson, its most powerful elected opponent. There were clearer lines of communication and a joint political tent. Now, the conflict over minority rights, the rule of law, and basic principles of democracy is mapped neatly onto parties. Unlike in the 1990s, when right-wing extremism was overwhelmingly disavowed by national Republicans, the modern GOP actively courts the far right.

And the 1960s and 1990s are not the only historical analogs. In 1876, an armed militia of Democrats successfully pressured Mississippi’s Republican governor to resign. In 1898, an angry mob of white supremacists in North Carolina seized Wilmington’s city hall and forced the Republican mayor out of office. These were not isolated incidents. The entire collapse of Reconstruction was an extraordinary victory for right-wing insurgents over fatigued federal forces and their regional allies.

We should learn from this failure. Coercion can be an essential tool in fighting insurrections, but Reconstruction’s demise shows that insurgencies cannot be permanently defeated by simply applying force. This is a lesson that the U.S. military has recently rediscovered while mired in the Middle East. American soldiers could quickly topple the Baathist and Taliban regimes, yet without continuous, widespread occupation, they could not stop Iraq from devolving into chaos and the Taliban from slowly regaining power. To defeat an insurgency, government actors must ultimately change the political beliefs of a hostile population. 

That means today’s democrats—lower- and upper-case d alike—must enact policies that win over voters, restore their trust in the government, and ultimately reduce discrimination. This will require legislation to mitigate economic inequality and improve living standards for everyone. The American Rescue Act—which is expected to cut poverty by a third, fight the pandemic, and stimulate the economy—is a positive first step. Raising the minimum wage and fixing our infrastructure would be a wonderful second and third. Actions that give local communities more power, like the freedom to set up municipal broadband networks (something prohibited in many states by laws passed at the behest of cable companies), would also prove valuable. The Biden administration could more fully unrig the economy by using competition policy to break up giant corporate conglomerates. And in the long run, regulations that further integration would help to lessen racial animus and the violence it inspires.

But to truly stop the spread of extremism, economic and social reform will not be enough. So long as the Republican Party continues to peddle lies and hate, extremist groups will find political cover. Unfortunately, the average Republican member of Congress has more to fear from primary challengers than from Democrats, and denouncing the far right opens officials up to accusations that they are not sufficiently supportive of the Trumpist cause. Embracing it, by contrast, helps them stay in office. The result is escalating rhetoric that further inflames the base.

Democrats are not powerless to stop the GOP's vicious cycle of radicalization. If the Senate can pass many of the critical, pro-democracy reforms contained in H.R. 1—the House's For the People Act—Democrats might be able to change the Republican Party's underlying electoral calculus. The gerrymandering ban would force more GOP politicians to compete in genuine swing districts. Provisions that expand access to the franchise could make them reach out to traditionally liberal demographics. Combined, these changes might just cause Republican politicians to moderate.

But without them, the odds of a shift are long. Republicans will have the upper hand in redistricting, leading to a new, disproportionately conservative congressional map. The party’s ongoing efforts to keep minorities from voting will move full steam ahead. Maybe a robust pandemic recovery and strong economy will be enough to lock the GOP out of federal power for years, forcing some kind of independent reckoning. But an unreconstructed Republican Party could capture the White House, again while losing the popular vote, and then use its national power to more thoroughly rig elections. 

If that happens, hell could really break loose. A few blue states might try to secede. Leftist militias, now a relative rarity, might expand. Indeed, even moderate liberals could embrace violence. It isn't hard to see why. If voting can no longer bring about political change, people tend to reach for alternatives.

Some progressive groups are already arming themselves. At least one of them, based in Washington State, has named itself after John Brown. The homage is instructive; Brown took to weapons after concluding that abolition would be impossible without them. He was tired, he explained, of the antislavery movement’s commitment to rhetoric in the face of slaveholder violence. “These men are all talk,” Brown said of his compatriots, shortly before moving to Kansas. “What we need is action—action!” 

The echoes of the 1850s are loud, and grim.

“There are not a lot of cases of highly socially and politically polarized countries that depolarize without something horrific happening,” said Mickey, of the University of Michigan. “We don’t have a playbook.”

The post America’s Next Insurgency appeared first on Washington Monthly.

Out of School, Out of Work | https://washingtonmonthly.com/2021/04/04/out-of-school-out-of-work/ | Mon, 05 Apr 2021

The pandemic has sidelined a generation of young adults. And the federal government’s only program to help them, Job Corps, sucks.


Even before the pandemic, America’s young adults were in crisis. 

In 2017, as many as 4.5 million young people—or 11.5 percent of young adults ages 16 to 24—were neither in school nor working, according to the nonprofit Measure of America. By the summer of 2020, the organization estimated, the ranks of these “disconnected” young adults had swelled to 6 million. 

The pandemic has taken an outsized economic toll on young workers, who disproportionately hold jobs in hard-hit sectors such as retail, hospitality, and food service. Unemployment among 16-to-24-year-olds soared from 10 percent in March 2020 to 26 percent in April, according to the Department of Health and Human Services, with the highest rates of joblessness among Black and Latino youth.

The damage could be long-lasting. The Millennials who graduated into the Great Recession still bear economic scars, with lower employment and earnings than peers who started out in better years. The same fate will befall the COVID-19 generation, unless effective interventions put them back on track.

In theory, the federal government has a program to help. 

Job Corps is the government’s largest program for disadvantaged young adults ages 16 to 24. It offers training and certification in more than 80 fields, including IT, construction, manufacturing, health care, and hospitality. The program’s website features smiling, clean-cut young people driving forklifts, cooking in chef’s whites, and fixing cars. “Job Corps provides everything you need to succeed in your education and career training,” the site promises. To be eligible for the program, students must not only qualify as low income but also have at least one barrier to education and employment, such as low literacy or homelessness. Job Corps centers provide housing and meals, along with a small allowance, a uniform, books, supplies, and dental and medical care. Tuition is free. 

But just as the need for Job Corps escalated, the program ground to a halt. After the pandemic shut down its physical campuses in March 2020, the program struggled to secure laptops and internet access for students. Enrollment shrank to one-fourth of what it was before the pandemic. 

The real scandal, though, is that Job Corps has performed poorly for decades—and the government has not invested in any large-scale alternative. Despite an annual budget of roughly $1.7 billion, Job Corps served barely 41,000 students in 2019. Evaluations have found that while the program helps some young adults, teenagers get no long-term benefits in earnings or employment. Government audits have been harsh, documenting mismanagement, safety problems, and persistent failures to place trainees into meaningful jobs. A scathing 2018 audit by the Department of Labor’s inspector general concluded that the program “could not demonstrate beneficial job training outcomes.” Another report, from the Government Accountability Office, noted more than 13,500 safety incidents from 2016 to 2017 at Job Corps centers, nearly half of them drug-related incidents or assaults. In 2015, two students were murdered in separate campus-related crimes. 

Even successful graduates call the program a last resort. “If you’re really desperate and ain’t got anywhere else to go, then I would do Job Corps,” says 24-year-old Earvin Rogers, who enrolled at a New Jersey Job Corps center in 2017 after dropping out of community college. Rogers landed a hotel maintenance job before the pandemic, but is currently unemployed. 

Rather than the young people it purports to serve, the program’s biggest beneficiaries may be a tight-knit coterie of for-profit government contractors who administer the program, some of whom have held on to multimillion-dollar contracts for decades. But in a testament to congressional inertia, the program lingers on, surviving threatened closures, resisting overhauls, and garnering enough political support to maintain its funding. 

Worst of all, inattention has led to decades of underinvestment in other solutions for young people. As a result, there’s no obvious large-scale alternative to Job Corps. Barring a change in course, the millions of young adults who saw opportunities evaporate during the pandemic may not get the help they need. 

Launched during President Lyndon B. Johnson’s War on Poverty, Job Corps now consists of 123 centers across the country, many of them in former military facilities and often in rural areas. A typical campus is the Woodstock Job Corps Center, which sits on 64 acres of tranquil woodland in rural Maryland, about an hour from Washington, D.C. A stately, H-shaped stone building that was once the oldest Jesuit seminary in America, the center can host up to 400 students. 

The purpose of this residential setting, as the program founder Sargent Shriver testified to Congress in 1964, was to “take young men from crippling environments and put them in centers where they will receive a blend of useful work, job training, and basic education.” Young people would get “a chance to escape from the cycle of poverty and to break out of the ruthless pattern of poor housing, poor homes, and poor education,” he argued. 

Over the years, Job Corps has unquestionably had its share of successes. The professional boxer George Foreman, who attended the program in the 1960s, was reportedly so grateful that he repaid the federal government the cost of his enrollment. 

Today, success stories include alumni like Shimira Mills. Now 28, Mills enrolled in the Pittsburgh Job Corps Center in 2017 on a cousin’s advice after a brief stint in culinary school did not launch her dream career of being a chef. Mills spent seven months living at the Pittsburgh campus, learning to be an HVAC technician. She found a job almost immediately at a small local business but has since been hired at a large residential heating and air conditioning company in Philadelphia with better pay and room for promotion. “Within the last two years, I have acquired two cars and an apartment where I’m living by myself,” Mills told me. The program, she said, gave her a second chance when she needed one. Without Job Corps, Mills continued, “If I’m going to be 100 percent honest, I would probably still be working dead-end jobs.” 

Job Corps was also a lifeline for Ricky Gass. Now 24, Gass was 18 when he got involved with drugs. One day he woke up in the back of a police car. “I was smoking some bad stuff one time—I don’t even remember what happened,” he told me. Gass wanted to turn his life around, especially when he learned that he had a daughter on the way. But with only a high school diploma, the best job he could get was at the local Family Dollar. “My check was $200 every two weeks,” he said. “It was horrible.” 

In 2019, Gass enrolled at New Jersey’s Edison Job Corps Center and threw himself into his classes, learning everything from putting in electrical wiring to hanging drywall. When one of his advisers offered him a training opportunity at a solar power company, Gass jumped at the chance and was hired soon thereafter. Today, he installs panels for Solar Landscape, a company that specializes in building large-scale solar projects in New Jersey. He is paid the industry’s prevailing wage, which is currently about $62 an hour, according to Solar Landscape’s director of community engagement, Katelyn Gold. “It’s a good thing I’m part of this company,” said Gass, who was expecting a second child when we spoke in September. “I’m pushed every day to be better, and it’s a perfect scenario where I’m at.” 

The defenders of Job Corps point to stories like those of Mills and Gass as proof of the program’s value. But then there are students like Julea Shannon, who spent seven months at the Joliet Job Center in Illinois and earned certificates in Microsoft Word and other office software. While she told me she enjoyed the experience of living on campus (“You get to see how it is to live without your parents. It teaches you to be a mature adult,” she said over chat on LinkedIn), the program didn’t help her land a job. She ultimately went back to community college and earned an associate’s degree in criminal justice in 2019. When I contacted her this March, however, she was still looking for work and considering more schooling. 

Formal evaluations of the program are similarly mixed. Evidence shows that for older students like Mills and Gass, Job Corps can be effective. In a rigorous series of evaluations published in 2018, the research organization Mathematica found that students between the ages of 20 and 24 at the time of enrollment were 4.2 percentage points more likely to be employed 20 years later than a comparable group that did not attend. They were also 1.4 percentage points more likely to be filing taxes and 3.6 percentage points less likely to be on disability. For these students, the net benefit of Job Corps to society was about $30,000 over the course of a participant’s lifetime (in inflation-adjusted dollars), taking into account the taxes they paid on earnings, as well as savings from reductions in crime and dependency on public benefits. 

But Mathematica’s study told a different story about the teenagers who enroll in Job Corps—and who make up the bulk of attendees. It found no long-term gains in earnings or employment for students who started the program when they were ages 16 to 19. The study concluded that overall, Job Corps was not cost-effective. In fact, Mathematica found that even taking into account the benefits from older students’ success, the net cost of Job Corps was about $17,800 per participant (again, in inflation-adjusted dollars). 

This research comes with one important caveat: Because it tracked the long-term fortunes of Job Corps students for up to 20 years after they left the program, its results reflect the program as it was in 1995, not as it is today. But little evidence indicates that Job Corps has improved dramatically since then. 

The program has been a frequent target of the Department of Labor’s inspector general, with dozens of audits over the years examining everything from student outcomes to contracting practices, center safety, reports of cheating, and inadequate financial oversight. 

Especially damning is a 2018 audit by the IG, which found that the program couldn’t demonstrate that it placed students into “meaningful jobs appropriate to their training.” Out of the 50 students for whom the IG was able to track down employment histories, more than half were placed in jobs similar to what they were doing before Job Corps. One student, who worked as a retail cashier before Job Corps, spent 310 days in bricklaying training only to return to the exact same store where they had worked previously. Job Corps reported this as a “successful” graduation and placement. Among 231 students for whom wage records were documented, the median annual income was just $12,105 in 2016—nearly $15,000 less than the median income for all workers without a high school diploma. 

Another audit, in 2011, found that Job Corps overstated the success of 42 percent of 17,787 job placements, with those students taking entry-level jobs unrelated to their training. Among the mismatches were “culinary students placed as pest control workers, funeral attendants, baggage porters, concierges, tour guides, telemarketers, cashiers, telephone operators . . . and file clerks.” 

Poor reporting and financial oversight are also consistent themes in these government investigations. In 2013, for example, the IG’s office concluded that Job Corps had improperly awarded $353 million in noncompetitive contracts to its contractors. A 2011 IG report determined that the program may have spent as much as $164.6 million in 2010 on training for students who were not eligible to enroll. In every year since 2006, the IG’s annual report has identified Job Corps safety and program effectiveness as among the Department of Labor’s “top management and performance challenges.” 

In fairness, the mission of Job Corps is a difficult one to make good on. 

“The labor market is not very hospitable to young people without high levels of post-secondary skills,” says Dan Bloom, senior vice president at MDRC, a nonpartisan policy research organization focused on social and education issues. “Put together with problems in the public schools, a harsh criminal justice [system], and a bunch of other contextual factors, and it’s very difficult to change those trajectories.” 

Job Corps is taking on a group of young adults who are tough to reach successfully. Of the nearly 50,000 young people enrolled in Job Corps in 2018, 60 percent did not have a high school diploma or GED when they entered the program, 20 percent were receiving public assistance, and 5 percent were homeless, runaways, or in foster care. About 80 percent of Job Corps students were teens and younger adults, ages 16 to 20. 

Job Corps graduate Malcolm Little, who served as student body president at the Woodstock Job Corps Center in Maryland in 2016, said many of his classmates were victims of crime or had witnessed violence. 

“I met girls who were pimped out, put on drugs, folks trying to kill them,” he told me. “There were other people, both male and female, who were sex trafficked and ex–drug dealers who were trying to get themselves together.” Little said one young man he met in the program didn’t go home to Atlanta for Christmas break because he feared for his life. “He told me the neighborhood he lived in was so bad that if he disappeared for three weeks and came back, people would assume he had been detained by the feds and snitched to get out, and that would have put a target on his back,” Little said. 

The prevalence of this kind of trauma means that Job Corps must serve as far more than a training program to its students if they are to succeed. “They need mental health, they need physical health, they need medications,” says Tony Staynings, who is now the director of the Potomac Job Corps Center in Washington, D.C., but was a consultant to the program when we spoke. “There’s a whole element to the delivery of service that the average person looking from the outside doesn’t understand.” 

On the one hand, the expectations placed on the Job Corps program are arguably unreasonable. No career and education program can, on its own, salvage the fortunes of young people whose lives have been shaped by deep-rooted systemic poverty. A young person’s disconnection from the economic mainstream is the end result of subpar schools, a dearth of jobs, and, often, neighborhoods beset by violence. 

On the other hand, it’s not unreasonable to question the efficacy of a more than 50-year-old approach to training and education that has so far delivered decidedly mixed results. Particularly deserving of scrutiny is Job Corps’ residential model. While it’s the program’s signature feature, it’s also its central weakness, especially during a pandemic. 

When schools and colleges across the country shut down in March 2020, Job Corps centers followed suit, sending home or finding other housing for 30,000 students, according to a July 2020 IG report on the program’s COVID-19 response. About 450 students with nowhere else to go stayed on campus. The program switched to remote learning in May and began working to supply students with laptops, tablets, and internet access. My repeated inquiries to the Department of Labor last fall on the status of the remote learning plans went unanswered, and there has been no data published publicly on the number of students who received a laptop or were accessing remote instruction. But as of the July IG report, the transition had not been completed. It still hadn’t been finished as of October 2020, according to a person I spoke with who has knowledge of the matter but was not authorized to talk. 

The Job Corps program came nearly to a standstill in the fall of 2020. Just 9,138 students enrolled nationwide in all of 2020. By the end of February 2021, however, Job Corps centers were in the process of returning students to campus. Though the question of whether the program ever successfully transitioned to online learning is now effectively moot, it is still unclear how—or whether—the program can recover the students it lost. 

Proponents of Job Corps’ residential model argue that living on campus can provide students with a clean break from negative influences in their lives. “It’s good for people who need a safe place,” Malcolm Little said. 

“The folks who come to Job Corps need to be there in order to succeed,” says Grace Kilbane, who served two separate stints as the national director of Job Corps, most recently under President Barack Obama. “I met students who were homeless or had aged out of foster care and had nowhere to go,” she told me. “That need has not gone away—if anything, it’s gotten greater.” 

Yet the residential model is a major driver of the program’s expense as well as its persistent problems with safety and security. “Centers do what they can to create a positive culture,” says Jeffrey Turgeon, who worked for nearly five years at a Job Corps center in Massachusetts and is now the executive director of the MassHire Central Region Workforce Board. “But any time you’ve got a bunch of young people, especially young people who are at risk and haven’t been successful in the past, they’re going to come with whatever baggage or drama they bring with them.” 

Shimira Mills, the Job Corps graduate who is now an HVAC technician, said discipline was a big part of life on campus, which she described as having a “boot camp type of vibe.” “We got graded on a day-to-day basis on how our rooms were and chores that we had to complete every morning and every evening,” she said. “If you’ve ever been to boot camp, they had the same system. Or jail—whichever one.” Unlike a college campus, days were strictly regimented. Students woke up at 6 a.m.; breakfast was at 7; classes began at 8. 

Many of her classmates did not make it through the program, Mills said. Her roommate was expelled after getting into a fight with another student. Other students were kicked out after failing drug tests, which are part of Job Corps’ “zero-tolerance” policy toward alcohol and drugs. Of the 45,173 “separations” (graduations or other departures) from Job Corps in 2018, 65 percent of students left for jobs or the military and around 7 percent went on to further education, implying that the remaining 28 percent either left without completing the program or graduated without a meaningful placement. 

Both Mills and Little said their rooms were searched from time to time. “Every once in a while I’d come back and all my stuff from my dresser and my closet would be on the bed,” Little said. 

The former student Ricky Gass was among a minority of students who commuted to his Job Corps center every day. Unlike most centers, which are located in rural or out-of-the-way areas, the Edison Job Corps center he attended in New Jersey was reachable by public transportation, and Gass had child care obligations for his daughter. “Sometimes I felt like I wanted to be on campus, but if I was staying there I would have possibly gotten into a lot of different situations—like females, the drama. There was a lot of testosterone,” he said. “Some situations I was shielded from by not being there on campus.” 

Phillip-Matthew Golden, who taught at the Woodstock Center in Maryland for four years, told me he witnessed a significant amount of turnover among the staff who managed the dorms. “They handled a lot of volatile situations,” he said. “They tried really hard, they really cared, but it was hard to keep people there.” 

On occasion, that volatility has spilled over into violence. In August 2015, four students at the Homestead Job Corps Center near Miami were arrested and charged with hacking to death a 17-year-old fellow student with a machete and then setting his body on fire. Earlier that year, a 20-year-old student at the St. Louis Job Corps Center was arrested for allegedly shooting and killing another student in the dorms. Between 2007 and 2016, the GAO found, Job Corps centers reported 49,836 safety and security incidents, including 6,541 incidents involving drugs and 9,299 assaults. 

The need to maintain safety is just one reason why the program’s residential model is expensive. There are also more prosaic concerns, such as meals, laundry, water and electricity bills, and building maintenance. 

Job Corps operates on what is essentially a franchise model—much like McDonald's—where a central office dictates the operation of individual centers in conformity with a single standard. Just as every McDonald's must make its fries in exactly the same way, Job Corps contractors are obligated to deliver education, training, and residential services that are standardized across the program's 123 centers. With the exception of roughly two dozen centers run by the U.S. Forest Service, private contractors are effectively Job Corps franchisees. 

Center specifications are spelled out in a more than 1,000-page “Policy and Requirements Handbook,” which governs every aspect of center operation, including recruitment and screening of prospective students; curriculum; discipline; placement services; and tracking of performance metrics. Contractors must, for instance, have written plans for “blood borne pathogens,” “respiratory protection,” and “hearing conservation” (to protect students’ hearing). Meal service is prescribed in exacting detail. “Meals shall be planned using a minimum of a 28-day cycle cafeteria menu,” the handbook dictates. Students must be offered, for example, “five choices of fresh or frozen vegetables and/or fruits,” and “low-fat and/or fat free milk and dairy alternatives and water.” 

The intent of this specificity is consistency across centers (although in truth, performance reports issued by the Department of Labor show that centers vary widely in quality). The downside is that few contractors can comply with the complexity and sheer scale of the Job Corps requirements. As a result, a relatively small number of companies, mostly for-profit, effectively hold a monopoly on the operation of centers.

Some of the biggest players in this fraternity are conglomerates that run Job Corps centers as one of several lines of business. Among these is Job Corps’ largest contractor, the Utah-based for-profit Management & Training Corporation, or MTC, which operates five detention centers for ICE, 21 correctional facilities, and 22 Job Corps centers, according to its website. For other companies, running Job Corps is their only business. Adams and Associates, Inc., for example, runs 14 centers, and the Career Systems Development Corporation runs 12 centers, according to their websites. Another private company, MINACT, runs centers in seven states. 

The universe of Job Corps contractors is strangely opaque. The National Job Corps Association, the trade association for Job Corps contractors, does not disclose its membership. When I reached out to MTC for an interview, I received this email reply from their managing director of corporate communications, Issa Arnita: 

Hi, 

We are not conducting interviews. I would recommend you contact the Department of Labor which administers the Job Corps program.

(The Labor Department, as noted above, did not respond to multiple requests for comment.) MTC’s website does not include a page with the names and bios of its executive team. The same is true of Adams and Associates. 

What is clear, however, is that Job Corps contracts involve big money. Career Systems Development, for instance, was awarded a $99.6 million contract in 2019 to run the San Diego Job Corps Center. MTC has won almost $305 million in contracts just since 2017. Among the contracts awarded to Adams and Associates is a deal to run the Grafton Job Corps Center in Massachusetts, worth nearly $53 million. Many of these companies have been reeling in lucrative Job Corps contracts for years, if not decades. MINACT, for example, opened shop in 1978 when its founder “successfully competed for the company’s first Job Corps Center contract,” according to its website. Career Systems Development notes on its site that it was one of Job Corps’ original contractors when the program launched in 1964. It still runs 13 sites today.

Unlike a college campus, days at Job Corps are strictly regimented. Students wake up at 6 a.m.; breakfast is at 7; classes begin at 8.

One result of having so few players is that much of Job Corps’ business gets awarded under noncompetitive contracts, which puts the government at risk of overpaying for services. In one 2019 report, the GAO found that Job Corps was running 68 of 97 centers in 2016 under “bridge contracts”—either noncompetitive extensions of expired contracts or short-term noncompetitive contracts. In 42 of those cases, contracts were awarded to companies that had already lost their Job Corps contracts and were formally protesting the decision. The result was that these companies were able to squeeze hundreds of thousands of contract dollars from the government while their protests were being resolved. 

But could the federal government do a better job if it ran the centers directly instead of relying on contractors? Likely not. While a few of the centers still run by the Forest Service are among the best performers, others have been among the worst. In 2014, the Labor Department shut down the Treasure Lake Job Corps center, located within a national wildlife refuge in Oklahoma and run by the Forest Service. The closure came after years of underperformance in which fewer than half of the enrolled students finished their training and barely half of graduates found jobs. 

Despite Job Corps’ flaws, politics, inertia, and the lack of scalable alternatives have all conspired to maintain the status quo. Every state has a center, which has helped the program maintain congressional favor. NJCA, the Job Corps contractors’ trade association, has a small political action committee that makes donations to congressional campaigns. It also maintains the Congressional Friends of Job Corps Caucus, which boasts 89 members from across the ideological spectrum, according to the NJCA’s currently available list. 

Part of the program’s political stickiness is that Job Corps centers are often an important source of local employment. “In rural areas or in a smaller town, [Job Corps] is a player in town,” said one former senior Labor Department official who asked not to be identified in order to preserve their current relationships with members of Congress. “They have jobs, and it’s something people are proud of in those communities.” 

As a result, members of Congress can be reluctant to challenge the program. For example, this former official told me of one instance where a member refused to entertain the idea of closing a poorly performing center. “We said, ‘Look, we’re going to have to close it. It’s not good for the students, and I wouldn’t send my own kid there,’ but the congressperson pushed really hard not to close it,” the official said. “They really loved having it in their community.” 

In 2019, the Trump administration proposed closing or privatizing the 25 centers run by the Forest Service, which would have resulted in the layoff of more than 1,100 federal employees in eight states. The plan, however, quickly ran into bipartisan opposition, and was scrapped in a matter of weeks. 

Progressive advocates who know that the Job Corps system is flawed are also loath to criticize it for fear of handing conservatives an excuse to kill the program and cut funding for an already neglected area of policy. Given that low-income young adults aren’t a powerful political constituency, advocates say it’s unlikely that money taken from Job Corps would be redirected toward other youth-serving programs. 

Nevertheless, the crisis young Americans are currently facing requires a better solution. Soaring youth unemployment rates should serve as an impetus for the program’s reform—and for substantial investment in new approaches. Sargent Shriver’s vision of a second chance for young people in poverty is more relevant today than it has ever been. But young Americans deserve more and better than Job Corps. 

There are some obvious short-term reforms to the program, such as targeting it to the older students who benefit the most. “They should tilt it more toward 20-to-24-year-olds, and if it’s effective for that group they should grow it,” says Harry Holzer, a labor economist at Georgetown University. Job Corps could also open more commuter campuses to benefit students like Ricky Gass, who have child care or other obligations that make living on campus impossible. 

The larger priority, however, is to dramatically expand investment in pilot programs and new approaches that show promise. 

More money, for instance, should be going to expand apprenticeships, which are effective and relatively inexpensive. “There aren’t a lot of substitutes for getting young people to work in real employment situations with real supervisors, real mentors, and a real occupational goal,” says Robert Lerman, an economist at the Urban Institute. A well-structured apprenticeship, Lerman says, could cost the federal government as little as $5,000 per participant, given that employers pick up much of the tab. 

More funding could also go to programs like Year Up, which trains students for careers in IT, sales, software development, and more, and places them in paid internships with mentoring and other support. Since its founding in 2000, the organization has enrolled more than 29,000 students, and reports an average starting salary of $42,000 for its graduates. Also promising is the National Guard Youth Challenge Program, a boot camp–style residential program that has achieved relatively strong results among 16-to-18-year-olds—the group that Job Corps has failed to benefit. Evaluations find that graduates earned, on average, 20 percent more than non-attendees and were 86 percent more likely to go on to college. 

None of these programs, however, is a panacea. In the post-pandemic economy, it’s unclear how many employers will be able to afford to hire apprentices. Apprenticeships also tend to be selective programs that favor job-ready applicants, as is also the case with Year Up. Only 28 states offer the National Guard Youth Challenge Program, and its military focus isn’t for everyone. 

In an ideal world, young adults would have a robust set of choices to help them reconnect to school and work. “Young people need a myriad of options, and certain strategies are going to work better for certain circumstances,” Chekemma Fulmore-Townsend, president and CEO of the Philadelphia Youth Network, told me. The problem today is that there are too few options, most of them underfunded or—like Job Corps—inadequate, while the demand for assistance has exploded. “Did the pandemic stress an already stressed system?” Fulmore-Townsend asked. “Absolutely.” 

Recently, the Reconnecting Youth Campaign, a coalition of youth advocacy organizations, called for a dramatic expansion of funding for youth programs, including $500 million for the Corporation for National and Community Service, which runs AmeriCorps, and $5.5 billion for summer jobs and paid work experience programs. 

These investments would be a good start, but they are long overdue. As Congress and the Biden administration work to rebuild the economy, they need to ensure that the nation’s young people aren’t left behind.

The post Out of School, Out of Work appeared first on Washington Monthly.

]]>
Why We Should Rethink Voting Rights from the Ground Up https://washingtonmonthly.com/2021/04/04/why-we-should-rethink-voting-rights-from-the-ground-up/ Mon, 05 Apr 2021 00:30:23 +0000 https://washingtonmonthly.com/?p=126995 Stacey Abrams

It's not enough just to fight discrimination. The government needs to take affirmative steps to make voting easier.

The post Why We Should Rethink Voting Rights from the Ground Up appeared first on Washington Monthly.

]]>

In the wake of the slow-moving but powerful blue tide that ushered in a Democratic-controlled Senate, all eyes were on Georgia and the unmatched skill of Stacey Abrams. Within hours, House Speaker Nancy Pelosi had vowed to put voting rights on the agenda.

Since then, congressional Democrats have introduced H.R. 1, the For the People Act, a sweeping reform bill meant to modernize and democratize our elections, with a vote on the bill planned for the first week of March. This sprawling 700-page document, covering everything from voter registration and vote-by-mail to ballot security and campaign finance reform, has drawn the ire of Republicans charging legislative overreach.

If it appears that way, it is only because we operate under a faulty understanding of voting rights, one that focuses on the prevention of discrimination rather than the promotion of an affirmative right to vote. It’s time to finally rethink that. If we do, H.R. 1 will seem more like a good start than a radical measure.

We tend to think of voting rights in terms that political philosophers would characterize as a negative liberty—as freedom from discrimination, from obstruction, from intimidation. This is no surprise given how voting rights have evolved in the United States. There is no constitutional guarantee of the right to vote, and the Bill of Rights is silent on the matter. This, of course, is by design.

The founders had no intention of extending the franchise beyond the wealthy white men who held power at the time. As increasing wealth and access to property made this restriction on the franchise moot, states began removing tax and property qualifications for white men, but made no substantive assurances of an affirmative right to vote. Subsequent acts enfranchised other segments of the population—Black men through the 15th Amendment and later white and Black women through the 19th—but again offered no guarantee of the right to vote, only the right to be free from discrimination in voting. This has now been entrenched through decades of litigation linking voting rights with the Equal Protection Clause of the Constitution. And this is effectively what the Voting Rights Act of 1965 and its subsequent amendments codified.

The problem with conceiving of voting rights in terms of negative liberty is that it limits us to a defensive strategy, litigating egregious cases in a piecemeal way and targeting only those things that would prevent individuals from voting. Understanding voting rights in terms of positive liberty would require attention to not just the things that prevent citizens from voting, but also to those that empower them to do so.

A truly progressive agenda would treat voting rights like economic rights, which also are not guaranteed in the Constitution, but have been earned through decades of struggle. Various groups mobilized throughout our history to demand that the government not just protect us from harm, but also provide the resources necessary to secure our well-being. (Think of the labor movement or the fight for entitlements such as Social Security and Medicare.) So, too, must we demand that our government support citizens in exercising the franchise, anticipating and meeting the needs of voters at every step from registration to casting a ballot.

This was the genius of Stacey Abrams and her network of advocacy organizations. Their work began with voter education and extended to mobilization, identifying the pitfalls that get between voters’ desire to vote and their ability to do so. In 2020, given the challenges of the pandemic and the subsequent and rapid shift to vote by mail, this was especially crucial. Ensuring that voters understood the different steps took outreach and education. Abrams’ success built on the work of previous generations who had laid a solid foundation for protecting voting rights in Georgia. But Abrams’ great achievement was understanding that it was not enough to guard against discrimination and that in fact our laissez-faire approach to elections disadvantages all but the most well informed and politically connected. Hers was a more proactive approach that addressed the obstacles, both malicious and benign, that stand in the way of the right to vote with a systematic strategy to help citizens overcome them.

This is what we need to demand now, not of voting rights activists, but of our government.

Today, the work of voter education and mobilization falls either to private advocacy organizations or to political campaigns. This presents serious challenges, since advocacy groups tend to be limited in their reach, and campaigns have incentives to focus narrowly on voters who they believe will be favorable to their candidate. Historically, many other organizations were involved in these efforts, including trade unions, civic organizations, and religious groups. But as these institutions have atrophied or withdrawn from their roles in civil society, they have left a tremendous gap in the support voters receive.

Those who have called for a constitutional amendment to secure the right to vote are on the right track. But the truth is, we do not need a constitutional amendment to make demands on our government. The most substantial advances of the 20th century, including welfare policy, education policy, and health care policy, have come not through the Constitution but through advocacy. This advocacy has to start with a shift in what we think is appropriate and worthwhile to demand.

H.R. 1 is a step in this direction. And messy as it may be, it brings us closer to achieving an affirmative right to vote in the United States. Many of its provisions still focus on the prevention of discrimination. But others move past this issue to introduce automatic voter registration, secure mail-in voting, and many election administration improvements that would streamline in-person voting. Such provisions remove known obstacles to participation and simplify what can often be a confusing, multi-stage, multi-deadline, time-consuming process for voters.

But more can still be done. Our government can and should require all employers to provide time off for employees to vote; provide sample ballots ahead of time, so that voters know what will be asked of them and have time to gather the necessary information; and establish an independent, nonpartisan agency to contact voters in advance to answer questions and provide support.

It is time to shift the frame. While the prevention of discrimination is worthy and noble, it cannot limit our political imagination, especially as those claiming irregularities and fraud mount ever more challenges to voting access. Right now, our best defense is a more robust offense.

]]>
What Joe Biden Can Learn from the Greek War of Independence https://washingtonmonthly.com/2021/04/04/what-joe-biden-can-learn-from-the-greek-war-of-independence/ Mon, 05 Apr 2021 00:25:30 +0000 https://washingtonmonthly.com/?p=127505 Delacroix - Episode from the Greek War of Independence, 1856

Two centuries ago, a fight for freedom in a distant land forced America to balance foreign policy idealism with realism. We need to find the right balance for today.

The post What Joe Biden Can Learn from the Greek War of Independence appeared first on Washington Monthly.

]]>

Few presidents have entered the White House with as much foreign policy experience as Joe Biden—30 years on the Senate Foreign Relations Committee, four as its chairman, eight as vice president during a time of war and global financial collapse. Yet he is already struggling to manage one of the central tensions of American statecraft—between the need to make cold-blooded decisions to protect U.S. interests and the belief, strongly held in many quarters, that the United States also should defend and advance democracy and human rights beyond its borders.

In a February 5 speech before State Department employees, Biden called for a diplomacy “rooted in America’s most cherished democratic values: defending freedom, championing opportunity, upholding universal rights, respecting the rule of law, and treating every person with dignity.” A few weeks later, his administration released a report confirming that Saudi Arabia’s Crown Prince Mohammed bin Salman approved the operation that led to the murder of the Washington Post columnist Jamal Khashoggi. The administration also imposed sanctions on 76 of the individuals involved and froze military sales to the kingdom. But Biden chose not to sanction MBS himself, out of fear of losing Saudi cooperation in countering terrorism and Iran. As a result, he was widely castigated in Congress and the press for being a hypocrite on human rights.

Donald Trump was the rare president who escaped this “idealism”-versus-“realism” quandary in international affairs, thanks largely to the incoherence of his own foreign policy views and the fact that he sincerely didn’t give a shit about democracy, human rights, and the rule of law—and convinced his followers not to care, either. But he was also aided by the blunders of his predecessors that gave democracy promotion a bad name. George W. Bush launched a catastrophic ground war in Iraq with hyperbolic statements about “ending tyranny in the world.” Barack Obama gave eloquent rhetorical support to Arab Spring uprisings but chose not to commit American might to defend them—except in the case of Libya, which didn’t turn out too well. For examples of presidents more successfully balancing morality and realpolitik one has to go back to Bill Clinton’s ending of the wars in Bosnia and Kosovo or Ronald Reagan’s brinkmanship with the Soviet Union.

But to fully appreciate how deeply rooted this tension in U.S. foreign policy is, it helps to look back even further, to when it first manifested itself two centuries ago during the presidency of James Monroe. Like Biden, Monroe governed during a time of rising autocracy. The European powers had recently come together at the Congress of Vienna to reestablish the monarchies Napoleon had overthrown. Their militaries were crushing democratic uprisings in Spain, Portugal, and Italy—and threatening to do so to independence movements in Latin America.

It was in this environment that Monroe articulated a foreign policy doctrine, mostly written by his secretary of state and White House successor John Quincy Adams, that today bears his name. The Monroe Doctrine declared that the United States would consider any attempt by a European state to oppress or control any country in the Western Hemisphere a hostile act. It was intended as a warning to the colonial powers not to restrict the potential spread of democracy in Central and South America nor press any claims on North American territory, thereby clearing the way for U.S. westward expansion. The doctrine also stated that, in return, the United States would not involve itself in the affairs of Europe—a vow meant to protect the ability of American merchants to trade freely on an equal footing without being caught up in Europe’s endless commercial intrigues.

But just as Monroe and Adams were formulating their new policy, an unexpected event occurred that complicated their plans. On March 25, 1821, Christians in southern Greece launched an insurrection against their Ottoman Turkish overlords and declared the creation of an independent democratic Greek state. News of this event captivated the Western public. Pressure quickly grew in the press and Congress for the United States to recognize the new Greek government and support its war of independence.

The public’s support for the Greek cause was partly out of religious prejudice. The Turks were Muslims, and their oppression of the Greeks, including massacres of the innocent, was widely reported (Greek slaughter of innocent Turks, less so). But it was also because the educated classes in the West had become obsessed with the glories of classical Greece as the result of the greater availability of ancient Greek texts in translation. The idea that the modern Greeks might throw off tyranny and rebuild the virtuous self-governing civilization of their ancestors fired the imaginations of Westerners with republican sympathies—most notably Lord Byron, the English Romantic whose wildly popular poetry gave voice to the idea. It acquired a name: philhellenism—a Greek word meaning “love of Greece.”

Philhellenism was especially strong in the United States. As citizens of the world’s lone republic, Americans had come to see themselves as the inheritors of Greek democracy. The spread of Greek revival architecture and the naming of American towns after ancient Greek ones—Syracuse, New York; Athens, Georgia—give you a sense of how culturally potent this sentiment was at the time.

Monroe himself had some sympathy for the idea of formally supporting the Greek revolution, a position he knew was popular with American voters. So too did his secretary of war, John C. Calhoun, and Speaker of the House Henry Clay, both of whom were eyeing runs for the presidency in the next election.

John Quincy Adams, however, who was also contemplating a White House bid, believed otherwise. Not only would formally siding with the Greeks contradict the promise of neutrality that was central to Adams’s strategy toward Europe, it would also undermine his efforts to secure a trade treaty with the Ottoman Empire. “Their enthusiasm for the Greeks is all sentiment,” Adams privately wrote of his philhellenic rivals.

The sentiment, however, was shared by many of the country’s most revered elder statesmen. Former President James Madison proposed that Monroe enlist England in a joint statement in support of the Greek war. Thomas Jefferson sent the leading Greek revolutionary thinker Adamantios Koraes, whom he had gotten to know in France when he was the U.S. ambassador, advice on how to structure the new Greek government on American principles—though cautioning his old friend against expecting U.S. support. Even John Quincy Adams’s own father, former President John Adams, confided to Jefferson that “my old imagination is kindling into a kind of missionary enthusiasm for the Greek cause.” But in the end, Monroe sided with the younger Adams that neutrality was the safer course. When Monroe finally delivered his doctrine publicly in the 1823 State of the Union address, he proclaimed his faith that the Greeks would free themselves but did not endorse formally recognizing the new Greek government.

Philhellenes in Congress led by Henry Clay and Daniel Webster countered with a proposal expressing disapproval of the president’s neutral position while encouraging him to send an agent to Greece to collect information. In the days-long debate that followed, Webster made what might be the earliest—and is certainly one of the most masterful—congressional speeches ever delivered on the need for the United States to stand for democracy and human rights in its foreign policy, asking, “Is it not a duty imposed on us, to give our weight to the side of liberty and justice?” He also argued that as a practical matter, the views of average citizens needed to be taken into account in charting America’s foreign policy, as “the public opinion of the civilized world is rapidly gaining an ascendency over brute force.” In the end, however, Congress adjourned without voting on the proposal.

But Webster turned out to be at least partially right about the power of public opinion. After the federal government decided not to get involved in the Greek War of Independence, the American people chose to do so directly. In cities and towns all over the country, citizens banded together to raise money for the Greek war effort, from fancy dress balls for the elite in Boston to collection plates at Methodist churches in the Midwest. Soon, ships laden with weapons and other supplies were leaving New York and Boston on their way to Piraeus, Nafplion, and other Greek ports. Some assistance was even more direct. Over the 10-year course of the war, a dozen Americans went to Greece to fight as volunteers.

American support, though helpful, was hardly decisive to the war’s outcome. The Greek people, despite their often fumbling leadership, won it with their own blood—though only after the European powers reluctantly got involved and, somewhat by accident, sank the Turkish navy at Navarino. The sizable quantities of humanitarian aid sent by average Americans, however, did save countless Greek lives. The U.S. effort is currently the subject of considerable media coverage in Greece as that country celebrates the bicentennial of the war.

While most Americans have no memory of this country’s involvement, the experience arguably had a more lasting impact here than in Greece. As the historian Maureen Connors Santelli details in her new book, The Greek Fire: American-Ottoman Relations and Democratic Fervor in the Age of Revolutions, the charitable drives for Greece provided American women their first-ever opportunity to be publicly involved in foreign affairs and thus fed a nascent women’s rights movement. Indeed, some early feminist leaders were prominent philhellenes, including the educator Emma Willard, who sent teachers to Greece to establish schools there after the war.

Santelli further documents how widespread public empathy for the Greeks living under Turkish oppression also led, over the course of the war, to more open questioning of American enslavement of African Americans. Indeed, the abolitionist William Lloyd Garrison originally contemplated going to fight in Greece before deciding to do battle against slavery in America as a journalist and newspaper publisher, writing at one point that rebellious southern slaves “deserve no more censure than the Greeks.” Some of the Americans who did actually journey abroad to fight returned home to become noted antislavery crusaders. The stirring of national conscience that resulted from debate over the war was so profound that the abolitionist Franklin Benjamin Sanborn would later say that the eventual ending of slavery in the U.S. had “begun in Greece.”

One could make the case, then, that Monroe and Adams managed America’s response to the Greek revolution masterfully. By insisting that the U.S. government remain neutral but tolerating and even enabling private involvement by the American public (U.S. naval forces actively protected the private cargo ships ferrying supplies to Greece), they found a clever way to balance the tension between realpolitik and idealism. It allowed them to minimize the risk of war while continuing to pursue a trade deal with the Turks.

It’s also arguable, however, that Monroe and Adams were too clever by half. The trade negotiations between Washington and Constantinople went nowhere until after the war, in part precisely because the Ottoman officials couldn’t make sense of the mixed public-private messages they were getting from the Americans regarding the Greek situation. It’s entirely understandable that the Monroe administration feared provoking the animus of the Great Powers, but in retrospect it’s clear that those powers were in no position to reassert their colonial authority or stop America’s westward expansion. By playing it safe, the administration missed an opportunity—one philhellenes like Daniel Webster advised they seize—to make an official statement to the world that the United States stood against the rising tyranny of the time. Such a statement might have proved helpful to the many independence movements in Europe and elsewhere that were to come.

The world is obviously vastly different than it was two centuries ago. America was then a young and rising free nation with growing prosperity and equality (for white men). Europe was in the grip of reactionary tyranny, from which the United States wisely sought to isolate itself. Today, both the U.S. and the nations of Europe are democracies enmeshed with each other in alliances like NATO that have kept the peace and protected their freedoms for nearly three-quarters of a century. But both are also suffering from growing inequality and downward mobility.

Still, the parallels between 1821 and 2021 are worth paying attention to. Now as then, authoritarianism is on the march. According to the latest assessment from Freedom House, democracy has been in worldwide decline for 15 straight years, including here in the United States over the past four, thanks in no small part to authoritarian meddling by Russia and China.

The need to balance the demands of principle and practicality in foreign affairs is as great now as it was 200 years ago, if not greater. Biden will have to find that balance as he navigates a host of individual foreign policy challenges, such as the military coup in Myanmar, China’s continuing genocide against the Uyghurs, and the increasing authoritarianism and provocations of Turkey’s neo-Ottoman leader Recep Tayyip Erdoğan.

What Biden needs is a doctrine of his own—a comprehensive and workable strategy that can both advance our economic and security interests and defend democracy against resurgent authoritarianism. In the forthcoming issue of the Washington Monthly, Wesley Clark, the former supreme allied commander of NATO who led the successful military intervention in Kosovo, proposes such a strategy. It would involve a new binding agreement among the United States, the European Union, and the United Kingdom on policies such as trade, antitrust, and technology transfer to counter the predations of tyrannical states like China while reversing the economic gutting of the middle and working classes that breeds right-wing populism here and around the world. Trumpian fears of China, Clark argues, might motivate some GOP support for such an agreement. If not, it could be negotiated as a trade deal, which Senate Democrats could pass on their own.

How much enthusiasm there would be among Democrats for such an agreement is another matter. Liberal-minded citizens have historically provided much of the energy behind demands for a more democracy- and human rights-based foreign policy. Today’s left, however, seems relatively quiescent on that front. This is partly out of disenchantment, especially among young people, with almost any application of U.S. global power. It is also based on the feeling that we have no business lecturing anyone overseas about democracy and human rights when we still have immense structural racism and sexism here at home.

On that point, the story of U.S. involvement with the Greek War of Independence is especially instructive. In ways no one could have foreseen, America’s engagement in that fight accelerated necessary confrontations with our own society’s wrongs. That same dynamic played out in later conflicts, too. The need for mass mobilization in World War II compelled the U.S. government to integrate white ethnic communities into the mainstream culture. The need to counter Soviet propaganda during the Cold War opened the way to civil rights advances for Black people. If progressives want to dismantle racist and sexist systems in America, they should set their sights higher by also demanding that their government advance policies that protect and defend democracy and equal rights around the world.

The post What Joe Biden Can Learn from the Greek War of Independence appeared first on Washington Monthly.

Sickness in Health https://washingtonmonthly.com/2021/04/04/sickness-in-health/ Mon, 05 Apr 2021 00:20:20 +0000

A journalist’s fly-on-the-wall coverage of one small Ohio hospital reveals the deeper story of America’s broken medical system—and the heartland’s decline.

The post Sickness in Health appeared first on Washington Monthly.


Hospitals are among the most opaque institutions in American life. Few allow their doctors to talk to the press except in the hovering presence of “handlers.” Though they often employ cadres of “communications specialists” who pitch reporters with puff pieces, most are obsessed with keeping their finances and internal operations secret. 

The Hospital: Life, Death, and Dollars in a Small American Town
by Brian Alexander
St. Martin’s Press, 310 pp.

The first remarkable thing about Brian Alexander’s new book, The Hospital, is that he managed to pull off an exception to this seeming iron law of U.S. health care. He never explains exactly how, but in early 2018 he persuaded the CEO and board of a small community hospital in rural Bryan, Ohio, to give him fly-on-the-wall access to their struggling institution—and complete freedom to write up what he witnessed. 

For the next year and a half, Alexander attended long rounds of anguished and divisive strategy sessions. Administrators and board members fought with consultants and each other over how to bring in enough revenue to avoid having to shut down or sell out to a big hospital chain. As Alexander became embedded in the hospital’s day-to-day operations, he gained the confidence of harried doctors and nurses, as well as patients and their loved ones. 

Alexander’s previous book, Glass House: The 1% Economy and the Shattering of the All-American Town, was about the economic decline of his hometown of Lancaster in central Ohio, so he had some advantages in penetrating Bryan’s tightly knit social networks. His career as a journeyman feature writer for publications like The Atlantic and Outside also left him practiced in capturing small but unforgettable details about the many different characters he met in Bryan. 

In one plot line, Alexander traces the tragic arc of a local man named Keith Swihart. In his early 20s, Swihart faints while working on an assembly line at an auto parts factory. Diagnosed with type 2 diabetes, he often goes without the insulin his doctor prescribes because even with company-sponsored insurance he can’t afford it. Then, in the 10 years following the Great Recession, Swihart faces a series of layoffs and crappy jobs with ever-lower wages and ever-higher-deductible insurance. The grinding downward mobility leaves him and his family exposed to such high medical costs that they can’t afford even routine health screening. Swihart loses his wife to cervical cancer, which wasn’t caught until it was too late. Then, as a grieving widower and single dad, he winds up losing his eyesight and enduring an amputation in the hospital after complications from his poorly managed diabetes land him in the ER. 

But Alexander does not let the sad story end there; he also shows how the amputation affected the local pathologist, who was left to deal with Swihart’s discarded body parts. “Shannon Keil opened the cottage cheese tub containing Keith’s toes and part of his foot,” he writes. “Corruption had invaded the tissue, seeped into the bones, oozed its way up the metatarsals until they became pliable, decayed, and no longer able to support a man.” Alexander uses this image as a symbol for the entire U.S. health care system. “What a fucking failure,” he later quotes the pathologist as saying, explaining that she didn’t mean Swihart or the medical procedure, but the malevolent “forces” within our health care system and beyond “that had swept his toes into that small tub.” 

Alexander’s understanding of just what those forces are and how they operate beneath the surface of events is the second most remarkable thing about this book. While it takes talent and enterprise to go out into the hinterlands and bring back stories full of pain and pathos, plenty of literary journalists have done that. But Alexander also brings to his writing a deep understanding of the larger economic, political, and social trends that are slowly crushing the lives of the people he met in Bryan, and of people like them all over this country. In his telling, Bryan becomes a microcosm of American sickness in all its dimensions.

Alexander begins his story at the beginning. The land on which Bryan sits was once part of the Great Black Swamp, as it was referred to in pioneer days. The landscape was bleak and boggy and barely inhabited, but it was flat, which meant that it would become a favored route for early railroads as they pushed westward. Industry soon followed the rails, and by the 1920s Bryan had become a small but prospering manufacturing center looking to become a modern, progressive metropolis. Reaching that goal would require a modern, progressive hospital, and the town’s boosters soon came together to advocate for one. “Theirs was the vision of a hospital as public good and community asset,” Alexander writes. “There was little mention of money, except that the lack of it would be no barrier to treatment.” 

But like their counterparts in many other small communities across the country, Bryan’s boosters soon found the cost of their dream prohibitive. Alexander does a masterful job of explaining the underlying reasons why. 
During this era, other countries, such as Germany, were crafting large social insurance programs. These systems spread the cost of health care across the whole population, thereby guaranteeing hospitals adequate revenue regardless of the income of their patients. But doctors in the United States, fearful of reductions to their autonomy and earning power, organized into a powerful lobby that opposed not only any kind of national health insurance program but even private plans that could make health care more affordable. In 1929, after two doctors in Los Angeles formed a practice that covered all health care costs for a fixed monthly subscription fee of $1.50, the wrath of other area medical professionals, on whom they depended for referrals, was so extreme that the Los Angeles Medical Association expelled the two doctors. Other physicians who experimented with similar payment plans were denied hospital privileges or were otherwise professionally shunned. 

Meanwhile, America’s dominant unions and large employers also opposed any role for government in providing health care to everyone. A spokesman for the National Association of Manufacturers said that “there is no greater reason for giving free medical service than free food.” Samuel Gompers, president of the American Federation of Labor, argued that if workers were just paid the wages they deserved they could afford to buy their own health care without relying on employers or the government. Both labor and capital joined with doctors in vilifying any effort to spread the cost of health care, using the enduring epithet “socialized medicine.”

In the absence of a national health care plan, the problem of how to provide care to people who couldn’t afford it had to be finessed. When Bryan finally got its own hospital in 1936, it was not a straightforward public institution like the town’s schools, library, or police department. Rather, like most hospitals in the country today, it was a weird, uniquely American hybrid: a nonprofit institution, incorporated as a charity, that provides some contingent care to the needy but effectively operates as a workshop for profit-seeking doctors and health care entrepreneurs. Bryan’s hospital would become an enormous source of local pride. But though it benefited from myriad forms of community support—from fund-raising drives to exemptions from local property taxes—it remained in essence a private corporation in the business of marketing medical services. 

As Bryan prospered, along with most middle-American towns and cities during the decades after World War II, this contradiction was problematic but workable. In the 1960s, millions of kids owned Etch A Sketches, which were all built in Bryan by the Ohio Art Company. Another local firm called ARO flourished by selling breathing masks for high-altitude pilots, and later for NASA astronauts. Mohawk Tools made industrial drill bits and other cutting devices used by carmakers and aircraft manufacturers. Good-paying factory jobs at these and other local firms typically came with generous employer- or union-sponsored health care plans that paid out whatever physicians and hospitals determined was a reasonable fee for their services. By charging newly flush doctors for office space, the hospital had little trouble maintaining its margins while still being able to treat charity cases. 

But by the end of the 1970s, the financing of independent, community-based hospitals like Bryan’s was becoming increasingly difficult, especially in the deindustrializing heartland. First, cheap imports hit the U.S. auto industry hard, including Bryan’s auto parts suppliers. Then, beginning in the Reagan era, financial deregulation and the federal government’s retreat from antitrust enforcement set off a frenzy of mergers, acquisitions, and leveraged buyouts that further decimated Bryan’s locally owned firms. ARO got bought out by Todd Shipyards, which raided the manufacturer’s pension funds and then sold the stripped-down firm to the corporate giant Ingersoll Rand, which in turn moved all of its remaining operations to North Carolina. A Canadian corporation acquired Etch A Sketch, reducing Ohio Art to a remnant of its former self. An Irish firm bought out Mohawk Tools.

“There was a time when on spring and summer Friday afternoons,” Alexander writes, “people in Bryan could stand outside, look toward the county airport, and watch the private planes streak overhead.” But, Alexander continues, as Bryan’s local firms failed or came under the control of distant corporations, “the planes stopped flying, the country clubs’ fairways grew over with weeds, and empty windows faced the town squares.”

As in so many other places, the gutting of Bryan’s economy meant that, by default, its hospital became the largest local employer, as well as virtually the only remaining source of upper-middle-class jobs. The town tried desperately to attract new businesses with tax giveaways and other subsidies. To persuade the midwestern home repair chain Menards to place a distribution center at the edge of town, Bryan built a new road. But even when public subsidies attracted new employers, the companies typically paid little more than the minimum wage and offered minimal health care benefits. The hospital’s CEO complained to Alexander that Menards was “a real problem for us” because three-quarters of Menards employees treated by the hospital were either on Medicaid, which pays hospitals low reimbursement rates, or lacked insurance altogether, which meant no reimbursement or lots of bad debt. 

Meanwhile, as more young and ambitious people left Bryan in search of better prospects, the remaining population served by the hospital grew older and sicker. This increased the demand for some profitable treatments, like heart surgery and chemotherapy, which helped the hospital’s margins. But the hospital could not find a way to expand the kind of medical services that Bryan’s downwardly mobile population most needed: services like helping patients to better manage their diabetes, overcome their alcohol and opioid addictions, or avoid falling victim to the growing epidemic of mental illness.

This remained true even after the expansion of Medicaid and private insurance under the Affordable Care Act. The ACA did bring more revenue to the hospital, but not enough to end the constant need to raise money. “We are seeing more bad debt than we were before [the passage of the Affordable Care Act and Medicaid expansion] from people who do have health insurance,” the hospital’s chief financial officer told Alexander. So the ACA wound up doing little to reorient the practice of medicine away from chasing dollars and toward providing the services most needed by the community. This is no doubt partly why Donald Trump’s attacks on the ACA and established politicians resonated deeply in Bryan. In Williams County, where Bryan is located, he took 69 percent of the vote in 2016. 

Alexander portrays everyone involved in trying to save Bryan’s hospital as largely motivated by a combination of local pride and idealistic dedication to the health care needs of their community. But given the perverse incentives under which the hospital was forced to operate, it became nearly impossible to reconcile its mission and its margins. Facing threats from bondholders as its reserves dwindled, the hospital was increasingly desperate to bring in more revenue. Yet all the most pressing medical needs identified by the local health department were ones that make little or no money. 
“Pediatrics, primary care, obesity, mental health, and dentistry all affected a lot more people,” Alexander writes, “but none of them were big moneymakers like a cath lab or radiation oncology were. They didn’t spin off lots of lab tests, either. Good dental care could prevent heart attacks, but treating heart attacks, and placing the stents used to prevent another one, made profits.” Following the advice of industry consultants, the hospital attempted to increase its “efficiency” by investing more in health care services that bring in the most money while neglecting those that create the most health. 

It gets worse. By the time Alexander arrived in Bryan in 2018, the hospital was also facing an increasingly mortal threat from two other failures of public policy. Deregulation of Wall Street and lax enforcement of antitrust laws had not only led to the loss of locally owned businesses; they had also made Bryan’s hospital vulnerable to takeover by giant health care conglomerates. 

As Alexander explains, two big hospital chains in the region, one expanding out of Toledo to the east and another expanding out of Fort Wayne, Indiana, to the west, had been gobbling up small hospitals like Bryan’s for more than a decade, “in a crazed rush to consolidate before they could be targeted themselves by even bigger predators.” And each time antitrust regulators failed to block another hospital merger, the power of the big over the small grew. 

Alexander provides a salient example of how this power imbalance compounds over time. When the hospital in Bryan needs to buy a stent for a patient’s clogged coronary artery, the best price it can get for the device is around $1,400. But bigger hospitals can obtain the same stent for as little as $750 because their market power gives them a stronger negotiating position with the manufacturers and distributors of stents and other medical supplies. 

The same raw power dynamics now set prices and allocate resources at every level of the health care sector. Drug companies, pharmacy chains, medical device makers, and insurance companies merge with each other to gain more leverage over hospitals and doctors’ practices. Hospitals and physicians in turn consolidate into still bigger, integrated health care oligopolies so they can push their consolidated suppliers around while also vanquishing their competitors. In the process, any hospital that remains small and independent becomes almost hopelessly vulnerable to the ones that have consolidated. Those of us who are mere consumers of health care bear the cost of consolidation in the form of rampant health care cost inflation, evaporating choice, and declining quality of service. 

When Alexander finished his reporting in Bryan just a few months into the COVID-19 crisis, conditions were deteriorating. Inflamed by Trump’s disinformation and a deepening distrust of government, much of the local population, along with large swaths of America, rejected mask wearing and flouted social distancing orders. Ohio’s chief public health official found gun-toting protestors on her lawn after a Republican state legislator called her “an unelected Globalist Health Director.” Meanwhile, Bryan’s hospital hemorrhaged money as it halted elective surgeries, lab tests, and imaging, and it more than doubled the number of hospital beds to accommodate COVID-19 patients. “Money bled out as if from a gushing wound,” Alexander writes. By August 2020, the hospital had lost $10 million, though reportedly it later recouped this loss with emergency government payments. 

Just as the coronavirus crisis was intensifying, a new CEO at the hospital decided it would be best for Alexander to leave, so that is where his fly-on-the-wall insights end. But the pandemic has likely increased the already huge financial pressure on Bryan’s hospital to close or sell out to a big corporate chain. Large, rich health care systems have the reserves to easily endure the financial strains brought on by the pandemic. But for small independent hospitals, like small independent businesses generally, surviving COVID-19 has become far less likely, so the monopolization of health care and the wider economy continues to accelerate. Upon completing this book, it occurred to me that it would be fitting to include it in a time capsule, so future generations might learn just how the promise of American life faded on our watch.

From Laundress to Business Mogul https://washingtonmonthly.com/2021/04/04/from-laundress-to-business-mogul/ Mon, 05 Apr 2021 00:15:58 +0000

A century after the Tulsa massacre, remembering Madam C. J. Walker, who relied on Black institutions and her own brazen determination to become one of the richest businesswomen in America.

The post From Laundress to Business Mogul appeared first on Washington Monthly.


One hundred years ago this May, a white mob massacred hundreds of Black people in the Greenwood neighborhood of Tulsa, Oklahoma. The 35-square-block district had been a thriving Black business center—so much so that it became known as Black Wall Street. Black entrepreneurs, locked out of other parts of Tulsa by Jim Crow laws, ran luxury hotels, insurance companies, grocery stores, transportation services, newspapers, and theaters in the community. A wealthy Black landowner, O. W. Gurley, gave loans to residents who wanted to start their own businesses. Black prosperity begat more Black prosperity.

Madam C. J. Walker: The Making of an American Icon
by Erica L. Ball
Rowman and Littlefield Publishers, 166 pp.

But it also led to white resentment. A false allegation that a Black man had raped a white woman incited white locals. They surged through the streets, shooting Black people on sight, looting Black homes, and bombing more than 600 Black-owned businesses. Over the course of two days, nearly the entire district was burned to the ground. 

Entrepreneurship has fueled progress for Black Americans, growing the middle class and funding the fight for racial equality. But it has also been met with waves of devastation. Black farming languished in the 20th century, in part because the U.S. Department of Agriculture discriminated against Black farmers when assessing loan applications. The construction of the interstate system in the 1950s and ’60s wiped out Black business districts in cities across America. The federal government’s retreat from enforcing antitrust laws starting in the late 1970s led to the collapse of small Black-owned firms across the country. The Great Recession in 2008 set back another generation of Black entrepreneurs. 

Now, Black business owners are being wiped out again, this time by a virus. Black entrepreneurs disproportionately run businesses in retail or hospitality, two sectors that immediately took a hit when states implemented social distancing measures. Between February and April 2020, 40 percent of Black-owned firms closed, according to analysis from Robert Fairlie of the University of California, Santa Cruz. Seventeen percent of white-owned companies closed during the same period. 

The current struggle for Black entrepreneurship makes a new book chronicling the life of Madam C. J. Walker especially relevant. Walker, a Black woman who built a beauty product empire in the early 1900s, became one of the richest businesswomen in America. The Madam C. J. Walker Manufacturing Company sold hair and skin products and, at its peak, employed nearly 25,000 agents to sell them throughout the Americas. 
The lingering question for scholars—and, more urgently, for Black entrepreneurs—is how Walker managed to do it. Her life is the subject of several biographies, academic lectures, children’s books, and even a Netflix miniseries. The most recent addition to that literature comes from Erica Ball, the department chair of Black studies at Occidental College, with Madam C. J. Walker, a deeply researched book that situates Walker’s story just one generation removed from chattel slavery, in turn-of-the-century America, when Black people sought to renegotiate their contract with society. It also illuminates her business strategies. Walker built her empire in coalition with other Black institutions and used her working-class background and her philanthropy to connect with the Black masses, not just the Black elite. What she may have lacked in pedigree, she made up for with brazen determination.

Walker was born Sarah Breedlove on December 23, 1867, a few years after the Emancipation Proclamation, in a one-room cabin on a cotton plantation in Delta, Louisiana. Both of her parents, who had been enslaved on the plantation, died before she was 10. As the Reconstruction era was ending and racial terror spiked, Sarah and her older sister moved to Vicksburg, Mississippi. She arrived without the extensive training that would have allowed her to work as a cook, and as a newcomer she lacked the referrals necessary to be a maid or a nurse for a white family. So she worked as a laundress—a position that was thought to be on the lower end of the socioeconomic ladder but enabled her to develop her own client base as an independent contractor. At the age of 17 she gave birth to her daughter, Leila, and in a few more years she moved on to St. Louis, where three of her brothers lived. (Much of what is known about Sarah’s early life is sourced from the subject herself.)

In St. Louis, Sarah began suffering from severe scalp ailments, including hair loss. This was common at the time, especially among working-class Black women, who often used harsh products like lye soap on their hair. In 1903, she was introduced to Annie Pope, a Black hair care entrepreneur. She gave Sarah a treatment of her “Wonderful Hair Grower,” and it impressed Sarah enough that she became one of Pope’s sales agents. 

Soon, Sarah moved to Denver, bringing with her a charming and entrepreneurial newspaperman named Charles J. Walker, whom she had begun dating. Perhaps, Ball speculates, Sarah saw opportunity in a market with few Black beauty professionals. She made connections quickly by joining the local African Methodist Episcopal church, securing work as a cook in a boarding house, and selling Pope’s products. She married Charles in January 1906, and within six months she had decided to become an independent beauty culturist, severing ties with Annie Pope. 

Sarah’s entry into the beauty industry came at an opportune time. For much of the 1800s, most women could only access a limited range of cosmetic products. But by the 1890s, stage actresses like Sarah Bernhardt were challenging long-held ideals of natural beauty and helping bring beauty culture into the mainstream. Soon, department stores and advertisers began targeting women as consumers. As demand rose and suppliers increased, Black women also began to embrace beauty culture. In 1893, Mary Church Terrell, a prominent Black activist, wrote in Ringwood’s Journal, a leading Black women’s publication, “Every woman, no matter what her circumstances, owes it to herself, her family, and her friends to look as well as her means will permit.” Opportunity was expanding for beauty entrepreneurs like Walker. Now 38, she started making, marketing, and selling her own product line in Denver. 

In early 1906, Sarah began referring to herself as Madam C. J. Walker. “Madam” sounded more European, and many white beauty culturists used the title in their marketing. Walker began traveling to other towns in Colorado to sell her products, and in the summer of 1906 she opened a salon in Denver. Leila, now 20, joined her mother to help expand the business. Once Leila was able to run the Denver salon on her own, Sarah and Charles set out on a seven-state tour of the Southeast. 

While traveling, Walker established several procedures and marketing strategies that would drive her success. The first was to rely heavily on local Black institutions. When she arrived in a new city, Walker would start by identifying a Black hotel or family that would take in travelers. In the Jim Crow South, this was both a practical necessity and a way to make local connections. She would also reach out to Black church and community leaders, who could introduce Walker to their constituencies. 

Her second strategy was to offer a demonstration of her hair care system to groups of local women. The demonstrations were intimate, step-by-step tutorials of the hair care process. They were also social events that drew together women who could readily identify with Walker’s life experience as a domestic worker. 

Her third strategy was to sell her own story, which made her both relatable and an aspirational figure. During the demonstrations, Walker forged trust with the audience by marketing herself as a hair “grower” and a healer of sorts, which linked her to a long tradition of Black women who served an essential function as natural healers. She would tell her listeners that when her hair began to fall out after years of working as a washerwoman, help came in the form of divine inspiration: A “big black man” or an “African” appeared in a dream and provided her with a list of ingredients, which she ordered and used to remedy her problem.

Whether it was true or not, the narrative was savvy. Many Black reformers and ministers at the time railed against the dangers of beauty culture and thought cosmetics gave credence to the misguided belief that Black women were hyper-sexual and immoral. Further, Walker’s system of hair care involved elements that could be characterized as hair straightening, a trend that was heavily criticized by reformers as an effort to imitate white people. Walker’s preferred origin story helped ameliorate those concerns.

As customers placed orders, Walker sent them back to Denver for fulfillment, where Leila mixed, packaged, and shipped the products. With a salon out west and a growing customer base in the South, Walker and her husband decided to relocate twice more, first to Pittsburgh, where she opened another salon, and then to Indianapolis. Her willingness to pick up and move, combined with her intuition about which cities had a business community and a customer base that would be receptive to her products, was key to her successful expansion. Indianapolis, for example, was a railway hub that served as a gateway to the Midwest, Northeast, and South. It had a Black population of more than 20,000 and two Black newspapers in which Walker could advertise her products. She purchased a 12-room house and soon began work on a new state-of-the-art factory steps away from her home. By 1910, Walker was earning the equivalent of more than $260,000 a year in today’s dollars. 

Although she had agents and customers scattered across the country, her company was not yet a household name. The beauty industry was becoming a strong force in America—product offerings were growing, as was consumer spending—but there was still a strong bias against it, especially among elite Black men. Booker T. Washington, a leading intellectual and champion of Black entrepreneurship, was generally opposed to Black women’s beauty culture. Winning his backing, Walker thought, was a prerequisite for growing a national brand. 

But when she reached out to Washington repeatedly for help in expanding her business, he showed no interest. In letters, Walker pressed Washington for an invitation to a 1912 farmers’ convention that his Tuskegee Institute was hosting, and got no response—but she decided to go anyway. After petitioning conference leaders, she was given permission to speak briefly at an evening chapel service, separate from the regular activities. She also gave demonstrations and treatments to more than 80 customers, including members of Washington’s own family. Walker secured so many contacts and customers during her time at Tuskegee that she decided to open an agency near the campus. 

Her mission to win Washington’s support didn’t stop there. She attended the 1912 National Negro Business League conference in Chicago, but Washington didn’t allow her to address the assembly. When another delegate requested that Walker be allowed to speak, Washington remained firm. Walker rose to her feet and said to him, “Surely you are not going to shut the door in my face!” She gave an impromptu speech defending her occupation, telling her now-perfected washtub-to-boardroom story, and tying her business to dignity and uplift for the race. With that speech, Walker won over the audience and, finally, Washington. He invited her to speak at the following year’s conference, and agreed to be her guest during his next visit to Indianapolis. As Walker had suspected it would, Washington’s official endorsement instantly elevated her national profile. Later that year, The Freeman, a Black newspaper, published a profile of Madam C. J. Walker and the Walker Manufacturing Company, describing her as “America’s Foremost Colored Business Woman.” 

As Walker’s business empire grew, she became more involved in Black politics, always prioritizing the perspective of the working class. Their views dictated many of Walker’s branding initiatives, including her charitable giving. She made a public pledge of $1,000 toward a new “colored branch” of the YMCA in Indianapolis, a sum that put Walker in league with the wealthy white men who had initially organized the fund-raiser. Walker’s immense contribution expanded her celebrity and earned her a good deal of press, including coverage in The Crisis, the official magazine of the NAACP. Walker also supported projects at Black colleges, and, in exchange, the schools were happy to teach the “Walker method” in their curriculums, creating a direct talent pipeline for the Walker Manufacturing Company. When non-Black companies began creeping into the Black hair care space, Walker convened Black beauty manufacturers, inviting many to her home to discuss the development, which led to the organization of the National Negro Cosmetic Manufacturers Association. Toward the end of her life, she would join the executive committee of the New York NAACP. 

By the early 1900s, Walker’s beauty product empire employed nearly 25,000 women as agents to sell hair and skin products throughout the Americas. She offered her agents dignity, flexibility, and help adjusting to working outside the domestic sphere.

Walker took that opportunity beyond American borders when she traveled to the Caribbean and Central America to expand her business and recruit new sales agents. By 1918, the Walker Manufacturing Company was a global enterprise, and that year Walker earned $275,000, or roughly $4.7 million in today’s dollars. By 1919, her net worth was $600,000, more than $9 million in today’s dollars. 

Unfortunately, as her empire rapidly grew, her health declined. She suffered from hypertension for years, which ultimately damaged her kidneys. On May 25, 1919, Walker died at her estate. The following Friday, Ball writes, 1,000 mourners came to her home to pay their respects, including officers from the NAACP, the National Association of Colored Women, and the National Negro Business League. 

Madam C. J. Walker would face a different set of challenges if she tried to launch her business today. One force crushing contemporary Black-owned businesses—the kind Walker relied on when growing her company—is economic consolidation. As the federal government retreated from enforcing antitrust and antimonopoly laws in the late 1970s, larger white-owned corporations began buying up successful, small Black-owned businesses. White-owned chain stores expanded, undercutting smaller Black-owned grocers and pharmacies wherever they went. Large banks acquired Black-owned community banks, replacing pillars of the Black business community with distant corporate entities that weren’t inclined to give loans to Black entrepreneurs. In its 1989 ruling in City of Richmond v. J. A. Croson Company, the Supreme Court essentially stalled any progress Black mayors had made in increasing Black entrepreneurs’ access to municipal contracts. These forces and others widened the racial wealth gap, and financial redlining compounded the problem. Black entrepreneurs trying to launch their businesses today face an extremely inhospitable landscape. 

Ball’s breadth of knowledge is abundantly clear. But writing a biography about the great Madam C. J. Walker posed some research challenges. As the author points out, scholars “rely heavily upon the tightly scripted narrative that Madam Walker created for herself as she built her business empire.” 

Ball has done a masterful job reconstructing the context in which Walker grew her company. That strength is also a liability. Rather than fueling the narrative of Walker’s life, Ball’s research on regional migration patterns and the individual personalities associated with various civic organizations generally comes across as the primary narrative. Often, the reader is left craving a return to the person of Madam Walker. 

But the book shines a light on the world Walker lived in, the structural barriers she overcame, and the barely traveled pathways she utilized to arrive at icon status. If one wishes to learn marketing strategies from a true pioneer, Ball meticulously documents Walker’s playbook—one that Black entrepreneurs would do well to read at this moment in history. If one chooses to draw inspiration from Madam Walker’s commitment to Black institutions, Ball provides plenty of examples of it. As so many Black-owned businesses close up shop, Walker’s story is evidence that triumphant success is possible—and a reminder to support the Madam C. J. Walkers of the future.

The post From Laundress to Business Mogul appeared first on Washington Monthly.

Madam C. J. Walker: The Making of an American Icon
by Erica L. Ball
Rowman and Littlefield Publishers, 166 pp.
Survival Instinct https://washingtonmonthly.com/2021/04/04/survival-instinct/ Mon, 05 Apr 2021 00:10:28 +0000

Conservation movements have won in the past. Can they do it again?

The post Survival Instinct appeared first on Washington Monthly.

Bald Eagle Perched on Stump - Alaska

One cold January afternoon, as I was hiking my regular trail loop on Theodore Roosevelt Island, I noticed a bald eagle perched in a large oak tree, about 30 feet away. The island is one of Washington, D.C.’s forest parks, a dollop of wilderness in the middle of the Potomac River. I’d seen eagles flying over the river before, their white heads unmistakable, but to see one perched so close felt uncanny, as if I’d stumbled into the presence of a visitor from another world. 

Beloved Beasts: Fighting for Life in an Age of Extinction
by Michelle Nijhuis
W. W. Norton and Company, 351 pp.

As the sun dipped below the horizon, the eagle lifted its wings and flew off. Twice it circled back, gliding with its outermost wing feathers spread out like long fingers. 

The cold set in and my fingers and toes started to grow numb, so I hurried home along the path before the light was entirely gone. The moment would have stuck with me in any year, but this was just three days after Donald Trump’s supporters stormed the U.S. Capitol, plunging the city into angst and anger. The sight of the eagle felt like a bit of grace.

Such an encounter would have been far less likely in this park 30 years ago, and almost certainly wouldn’t have happened in the early 1960s. At that time, there were fewer than 500 nesting pairs across the lower 48 states. Many naturalists worried that America’s national bird was hurtling toward extinction, another casualty of habitat loss, hunting, and pollution—especially the indiscriminate use of DDT, an insecticide that persists in the food chain and causes eagles to lay eggs with paper-thin shells, too fragile to protect the developing embryos inside. 

Today there are more than 71,400 nesting pairs of bald eagles in the country, and their return to the mid-Atlantic region has been particularly spectacular. This winter I’ve seen bald eagles cruising over the nearby Anacostia and Susquehanna Rivers. In 2014, one pair established a nest within the National Arboretum, and wildlife biologists set up a remote camera to watch the eagles raise chicks. 

At a time when climate change threatens entire ecosystems and many elements of human societies, it’s worth remembering the times that people have managed to undo some of the havoc they’ve wreaked upon the planet. That’s one of the implicit messages of the journalist Michelle Nijhuis’s new book, Beloved Beasts. The book is an ambitious effort to chronicle the development of the global conservation and environmental movements over three centuries. “The past accomplishments of conservation were not inevitable, and neither are its predicted failures,” she writes. “Fantasy and despair are tempting, but history can help us resist them.”

If there’s one overriding takeaway from Beloved Beasts, it’s that most conservation success stories aren’t the result of a single decisive act, but of many complementary policies working together. When I go hiking on Theodore Roosevelt Island and see bald eagles, I have at least four successive waves of conservation movements to thank—each with varied origins and coalitions behind them. 

The Migratory Bird Treaty Act of 1918 stopped the rampant commercial hunting and sale of birds and feathers. The movement to set aside and restore wilderness areas as parks began in the late 19th century and was expanded by President Theodore Roosevelt in the early 20th. (In 1931, the Roosevelt Memorial Association purchased the little island in the Potomac to make it a park.) The Bald Eagle Protection Act of 1940 and later the Endangered Species Act of 1973 safeguarded the eagle. Finally, DDT was banned in 1972. Within a decade, scientists estimated that the number of bald eagles in the lower 48 states had more than tripled, and it continued to rise.

By 2007, bald eagle numbers had rebounded enough that the bird was removed from the list of threatened and endangered species, although it is still federally protected. 

Beloved Beasts is organized through a series of profiles and interlocking vignettes about key leaders in what Nijhuis calls “the story of modern species conservation.” The subjects hail from different eras and varied political contexts, and include Carl Linnaeus, an 18th-century Swedish botanist who named thousands of species; William Temple Hornaday, the chief taxidermist at the predecessor to the Smithsonian Natural History Museum, who fought to save the American bison from extinction at the turn of the 20th century; Rosalie Barrow Edge, an American suffragist who advocated for wild bird protections in the early 20th century; Rachel Carson, the scientist who documented the detrimental effects of DDT on ecosystems in her 1962 book, Silent Spring; and many others. 

Notably, Beloved Beasts is not a book of nature writing—most glimpses of the natural world come through choice quotes from conservation leaders. As Nijhuis defines her mission, “This book is about the humans who have devoted their lives to these questions [of conservation]—the scientists, birdwatchers, hunters, self-taught philosophers, and others who have countered the power to destroy species with the whys and hows of providing sanctuary.” 

By and large, the people Nijhuis features come from fortunate backgrounds. “Most early conservationists were privileged North Americans and Europeans, and no wonder; location and education enabled them to recognize the effects of humans on other species, and money and status freed them to take controversial positions,” she writes. That means the heirs of the conservation movement today must reckon with questions about who was left out, denigrated, or displaced by early environmental campaigns. 

Nijhuis notes that environmentalism’s “early chapters are shadowed by racism, and some conservationists still hold blinkered views of their fellow humans, causing them to mislay blame for the damage they seek to contain.” After Yellowstone was named the U.S.’s first national park in 1872, the federal government forced the Native Americans who lived there to relocate outside park boundaries. 

Modern conservationists now strive to protect the rights of Indigenous groups, but there are still wounds to heal. In the spring 2021 issue of Audubon magazine, the Black ornithologist and birdwatcher J. Drew Lanham wrote an article titled “What Do We Do About John James Audubon?,” reflecting on the racism of the organization’s founder, and how it still hampers modern efforts to make birdwatching a more inclusive hobby. 

The contemporary conservation movement is more self-aware, but it is also operating in far more partisan times. Whether a politician aims to address climate change—or even accepts the underlying science—has become an ideological litmus test. Donald Trump called climate change science a hoax on the campaign trail, and then his administration rolled back more than 100 Obama-era environmental regulations and withdrew the United States from the Paris climate accord. Hours after Joe Biden was sworn in, he began the process of rejoining the agreement.

Support for environmental laws wasn’t always so divided along party lines. As Nijhuis writes, the Endangered Species Act of 1973 “attracted wide support, and few detractors. The National Rifle Association testified on its behalf, and some of the most conservative members of the House and Senate backed it with little hesitation.” The final vote in the House of Representatives was 355 to 4. 

Can we make that kind of progress on environmental policy again? Nijhuis isn’t making predictions. As she writes in the concluding chapter, “Like the human societies they work within, these movements must constantly weigh individual interests against the common good, and those decisions are only becoming more difficult.” But it’s also become clear that the fates of humans and the rest of the planet are intertwined—a quickly warming world threatens both coastal cities and coral reefs. It jeopardizes our agricultural systems as well as the survival of polar bears. Taking action now—by, say, choosing to protect rain forests as habitats for endangered species and as carbon stores for the planet—benefits many species, including our own.

Since this piece was published in print, the U.S. Fish and Wildlife Service released a report updating the total number of bald eagles in the lower 48 states. This piece has been changed to include that new number.

Can Amazon Be Stopped? https://washingtonmonthly.com/2021/04/04/can-amazon-be-stopped/ Mon, 05 Apr 2021 00:05:28 +0000

The story of the e-commerce giant is the story of America’s economic unraveling.

The post Can Amazon Be Stopped? appeared first on Washington Monthly.

Jeff Bezos painting

About two and a half years ago, as media speculation about where Amazon would locate its second headquarters reached a fever pitch, The Onion, a satirical website, decided to make its own projection. “‘You Are All Inside Amazon’s Second Headquarters,’ Jeff Bezos Announces to Horrified Americans as Massive Dome Envelops Nation,” the site declared. The story described a world in which Amazon divided the United States into segments of its supply chain. “The entire state of Texas will be replaced with a 269,000-square-mile facility used exclusively to house cardboard boxes, tape, and inflatable packaging materials,” the authors wrote. “A large swath of the Midwest will soon be razed to make way for a single enormous Amazon Fulfillment Center.”

Fulfillment: Winning and Losing in One-Click America
by Alec MacGillis
Macmillan, 400 pp.

It was, of course, a joke. But based on reporting from the veteran ProPublica journalist Alec MacGillis, it’s a joke with more than just a ring of truth. In Fulfillment: Winning and Losing in One-Click America, MacGillis argues that Amazon’s dramatic expansion is Exhibit A for America’s economic unraveling. Armed with stark statistics and moving anecdotes, MacGillis illustrates how the retail giant pushes regional stores out of business. He shows how the company extracts tax incentives from desperate local governments in exchange for poor-paying warehouse jobs. Amazon has “segmented the country into different sorts of places, each with their assigned rank, income, and purpose,” he writes. It has altered “the landscape of opportunity in America—the options that lay before people, what they could aspire to do with their lives.” 

It is a damning and powerful assessment. But Amazon isn’t MacGillis’s only, or even most fundamental, subject. Instead, he treats the company as both a cause and a symptom of a bigger problem: skyrocketing regional inequality in the United States. 

Over the past 40 years, certain parts of America—mostly along the coasts—have become far more prosperous than others. This trend has not received as much attention as rising income disparities, but its political consequences have been similarly grave. Regional inequality has fueled authoritarian nationalism in the U.S. It has concentrated well-educated liberals in economically vibrant, overwhelmingly Democratic states. It has left white working-class voters elsewhere embittered and detached from mainstream politics. After decades of job losses and wage stagnation, it’s not surprising that some people in struggling counties embraced a candidate who promised to restore a halcyon era (“Make America great again”) and blamed their challenges on groups many were already prejudiced against (minorities). Donald Trump’s path to the presidency was paved in part by declining economic opportunity in the Midwest.

MacGillis provides readers with a useful primer on how this happened. Beginning in the late 1970s, politicians gradually stopped enforcing fair competition policies: the many laws designed to create an even economic playing field for different businesses and different parts of the country. Regulators started neglecting antitrust statutes, allowing a few companies in each sector to expand rapidly by purchasing or crushing their competitors. They loosened restrictions that had prevented chain stores, like CVS and Walmart, from dramatically underselling smaller rivals. And they eliminated regulations that made it equally easy to transport goods to and from all parts of America. “Profits and growth opportunities once spread across the country,” MacGillis writes. Now, they cluster in places where the dominant companies are based.

These trends are all bigger than any one business. But it’s easy to see why MacGillis chose to focus on Amazon specifically. The company owns a third of the country’s data storage market. It controls roughly 40 to 50 percent of America’s e-commerce market, more than five times the share of its nearest rival. That makes Amazon both singularly powerful among U.S. businesses and representative of winner-takes-all corporate America at large. Together, Facebook and Google control more than 50 percent of the online advertising market; like Amazon and its neighbor Microsoft, they are headquartered only a few towns apart. Comcast and Charter, both located along the Acela corridor, collectively own a majority of the U.S. cable market.

These companies haven’t just survived the current recession. They’ve thrived. While the employment rate has gone down since COVID-19 arrived in America, the S&P 500 has gone up by more than 15 percent. All but one of the five richest companies have seen their value grow, including Amazon. Indeed, Amazon’s stock has increased by an astonishing 80 percent over the past 12 months. MacGillis writes that the company is reporting record profits.

The distribution of Amazon’s newfound wealth, however, has been deeply uneven. The company is hiring warehouse workers across America, but these low-paying jobs require famously long shifts, involve strenuous and monotonous work, and offer little autonomy. Meanwhile, Amazon is also expanding its Seattle and Washington, D.C., offices—adding well-paid, white-collar jobs in elaborately sculpted buildings with rooftop dog parks, onsite botanical gardens, and discounted child care.

Geographically, the United States was once an equitable place. Between the 1930s and the late 1970s, per capita earnings in almost every part of the country gradually converged. In 1933, the average income in the southwestern United States was roughly 60 percent of the national average. By 1979, it was approximately the same. During the same period, New England fell from being 1.4 times richer than the rest of the country to just above average. In 1978, the average income in the Detroit metro area was on par with that of New York City and its suburbs. Drawing on findings from this magazine, MacGillis notes that the 25 richest metropolitan areas in 1980 included Milwaukee, Des Moines, and Cleveland.

This equality was hard won. Starting at the turn of the twentieth century and accelerating during the New Deal, the federal government enacted antimonopoly laws to prevent extreme regional inequality. Throughout the middle of the twentieth century, it blocked mergers that today wouldn’t draw any attention—including one between two shoe companies that, together, controlled just over 2 percent of the nation’s footwear market—in order to keep chain stores from colonizing the country. It prohibited wholesalers and manufacturers from giving bulk discounts to these chains, which would put community retailers at a serious disadvantage. When national politicians spoke about the need to help small businesses, they meant it.

But like anything achieved through vigilant enforcement, this parity was easily erased. Beginning under Gerald Ford and Jimmy Carter and continuing under Ronald Reagan, the federal government started ignoring or outright repealing fair competition regulations. As a result, the fortunes of America’s regions diverged. The St. Louis metro area, for example, had 23 Fortune 500 companies in 1980, but in recent decades most have been acquired by larger corporations or otherwise pushed off the list. Today, it has only eight. By contrast, New York’s per capita income in 1980 was 80 percent higher than the national average. By 2013, after years of mergers in banking and finance, that figure was 172 percent. In 2018, 20 of the top 25 wealthiest cities were on the coasts.

As businesses departed from large swaths of the interior, many Americans were left without good economic options. Fewer businesses meant less competition among employers to drive up pay. The main employers that have moved in as most companies moved out—retailers like Walmart, Dollar Tree, Family Dollar, and Dollar General—are notorious for their low-wage business models. The bulk of the money each store makes flows out of the local community and into the company’s headquarters, almost always located far away.

But perhaps no growing employer is as notorious as Amazon. According to MacGillis, the company has hired hundreds of thousands of new warehouse workers in the past five years. It has added more than 175,000 during the pandemic alone, even as thousands of small retailers have shut down. The indignity of life in the company’s warehouses is well documented, but MacGillis makes space to describe the dangers. He recounts how one worker was killed after being crushed by a forklift, and how another was killed by a tractor. He covers various attempts by warehouse workers, some with past union experience, to organize for safer conditions. None of those attempts go well. (Hopefully, the ongoing union drive in Alabama will end with more success.)

Massive retailers with low prices, like Amazon, are not just a poor replacement for local employers. They are part of why local employers shut down. Inexpensive products are nice for customers, but they drive community stores straight out of business. And Amazon has tools beyond low pricing that it uses to squeeze competitors. The company is the main, and for many small businesses the only, way to sell products online. It capitalizes on this by forcing vendors on its platform to hand over a hefty percentage of their profits—usually 15 percent—for every sale, a transaction fee that MacGillis compares to a tax. Amazon also manufactures goods itself, often copying its vendors’ most popular products based on its privileged look at their sales data. Free of the same fee (the company doesn’t tax itself), Amazon sells these knockoffs at a lower price than the originals, driving the real creators into insolvency. As a result, money that would have gone to small businesses instead winds up with Amazon.

While consolidation has devastated most of America, it has been a boon for Amazon’s hometown. Once a manufacturing city as distressed as present-day Detroit, Seattle has become a rich tech mecca. The metro area has a median household income of $94,000, making it the ninth wealthiest in the country. Its population has roughly doubled since 1970. Not all of this can be attributed to Amazon; Boeing drew engineers to the area, and the city’s growth began in earnest when Bill Gates and Paul Allen set up shop to build Microsoft. But there’s no doubt that Amazon is now the city’s crowning jewel. It has accounted for 30 percent of all new jobs in Seattle over the past decade, most of which are well paid.

For Seattle’s boosters, this growth is a testament to the city’s inherent virtues. “From its beginning, Seattle showed a do-whatever-it-takes resilience,” the Seattle Times columnist Ron Judd wrote in 2016. He acknowledged that there was “a little bit of dumb luck” involved in the city’s success, but ultimately praised its location, lifestyle, and innovative spirit for attracting people like Jeff Bezos. The success of Seattle, Judd wrote, is heavily tied to “history, geography, education and, yes, some creative capitalizing on all the gifts the place was given.” 

MacGillis implicitly rebuts such arguments. Baltimore’s Atlantic waterfront, he points out, could not save it from postindustrial economic erosion, nor could St. Louis’s central location. The author outlines the careers of retailer entrepreneurs from overlooked metros—like El Paso, Texas—who clearly possess the enterprising talents that libertarians believe made Bezos a billionaire. They were eventually brought to heel, in no small part because of Amazon. 

The truth is that Seattle has prospered because Gates and Allen, both Seattle natives, set up their company in the area during the late 1970s, when it was still possible to build a successful corporation in most American cities. And once Seattle had Microsoft, it became easier to attract other tech companies. But now, without fair competition rules, other cities don’t have the same opportunities Seattle once had. It’s hard for new retailers to grow when they have to contend with Amazon.

Yet even Seattle is suffering. Its housing and living costs have skyrocketed, pushing many middle-class and working-class residents out. Soaring rents have pushed some of them onto the street. By the end of 2017, Seattle had the third-largest homeless population in the country, after Los Angeles and New York City. The changes have disproportionately harmed Black residents, whose median income has fallen since 2000 even without accounting for inflation. Seattle, MacGillis writes, is “proof that extreme regional inequality was unhealthy not only for places that were losing out in the winner-take-all economy, but also for those who were the runaway victors.”

The author chronicles the city’s hapless attempts to fix the nasty side effects of its hyper-prosperity. It passed a tiny income tax followed by a tiny tax on large businesses in order to address housing shortages and improve public transportation. The first was successfully challenged in court by coalitions representing the city’s rich. The latter was attacked by a collection of big companies, including Amazon, which threatened to cancel a planned Seattle building expansion if the law wasn’t repealed. Amazon also helped bankroll an expensive ballot initiative to eliminate the tax. Ultimately, the city council repealed it first.

The tax exemptions that less prosperous cities offer Amazon in exchange for becoming the site of a new fulfillment center are even more degrading. While local authorities view the subsidies as the price of keeping their economies alive, MacGillis suggests that the tax cuts may ultimately cost cities more than the new jobs are worth. Emergency service departments in two Ohio counties, for example, have had to contend with a steady stream of 911 calls for warehouse injuries, an expense that Amazon creates but does not pay for, because it is exempt from each county’s property taxes. The company’s distribution centers build up truck traffic on nearby highways, but Amazon is excused from paying the taxes that fund roadway maintenance.

That so many local governments kowtow to Amazon is a depressing statement about political power in America. But even if these places—be they Seattle or Dayton—could muster the political will to take on the company, they simply don’t have the tools to win. The affordability crisis threatening rich metros and the hollowing out of poor ones are both by-products of concentrating economic power in a handful of cities. And that’s something that only the federal government can fix.

The good news is that there are national policies that could rebalance our economy. By reviving underused competition policies, the Biden administration has the power to distribute wealth much more fairly. There’s no shortage of consolidation in the American economy, from gigantic agribusinesses to hospital chains. But for any would-be trustbuster, Amazon must be at the top of the list. At a minimum, the administration should fight to prohibit the company from both owning America’s dominant online marketplace and selling its own products in it. Better yet, it could spin off Amazon’s data storage business, its smart home business, and its many other non-retail components into independent companies. Better still, it could break up Amazon’s marketplace outright.

The government has many other tools it can use to better distribute opportunity. It could bring back regulations that made it impossible for big businesses to get better deals from suppliers than small ones could. It could reconstruct the dismantled Civil Aeronautics Board, an institution that kept airfare prices roughly the same on a per-mile basis wherever one went and made sure small and midsize cities received adequate service. It can re-create the Interstate Commerce Commission, which did the same thing for passenger trains and freight transportation. Those transit regulations enabled new small businesses to thrive in midsize heartland cities rather than just existing economic hubs. 

Some of these steps require new legislation. But many are possible through executive action under existing, if currently unenforced, competition statutes. Either way, there’s hope. Democrats have unified control of the government, and progressives are increasingly concerned about concentrated economic power. The U.S. House of Representatives, multiple state attorneys general, and the Department of Justice are all investigating anticompetitive practices by Amazon. The latter two are already suing Facebook and Google.

But the Democratic Party does not have the best recent track record when it comes to curbing corporate power. Democrats dominate Seattle’s government, and they ultimately killed a law that mildly inconvenienced Amazon. The Obama administration did virtually nothing to stop the mergers, acquisitions, and other actions that fueled retail consolidation and helped give rise to Big Tech. As MacGillis points out, many of Obama’s officials went on to prominent, powerful roles in major tech companies. Jay Carney, one of the former president’s press secretaries, now heads public policy for Amazon.

It is still too soon to say whether Joe Biden will take the aggressive antitrust positions favored by progressive activists or the lenient approach of the president he served beside. His first Federal Trade Commission nominee, Lina Khan, is an antitrust expert who advocates for curbing the power of large corporations. Her selection was a promising sign. So was choosing the Big Tech critic Tim Wu to work on technology and competition policy at the National Economic Council.

But the most important positions are yet to be filled, and progressives are worried that his early choices will soon be counterbalanced by monopoly-friendly personnel. Should Biden ultimately opt to follow Obama’s path, it might be because he simply does not recognize the economic damage oligopolistic companies have caused. If so, he would do well to read MacGillis’s book. But it is also possible that Biden and his team are aware, but their interest in fighting back will be tempered by fund-raising concerns, a belief that challenging monopolists would be too risky for the economy, or simply a desire to tackle other priorities.

If that is the case, the administration should consider the political consequences of continuing on our current path. The steady draining of wealth and opportunity from large parts of America is part of why many onetime Democratic strongholds, like Michigan, are now swing states, and why many onetime swing states, like Missouri, are now Republican strongholds. Well-educated liberals will not move to these places unless there are economic opportunities. The remaining white, non-college-educated residents will continue to feel economically embittered.

With thin congressional majorities, competition policy may be one of the few tools Biden can really wield to restructure America’s political economy. He must use it.

The post Can Amazon Be Stopped? appeared first on Washington Monthly.

Fulfillment: Winning and Losing in One-Click America by Alec MacGillis, Macmillan, 400 pp.
How to Write Like Your Parents Are Dead https://washingtonmonthly.com/2021/04/04/how-to-write-like-your-parents-are-dead/ Mon, 05 Apr 2021

Philip Roth had an idyllic childhood. How did he become America’s greatest chronicler of malcontents? 

The post How to Write Like Your Parents Are Dead appeared first on Washington Monthly.


In 1969, Philip Roth summoned his parents for lunch in New York City. His new novel, Portnoy’s Complaint, was soon to hit bookshelves, and he wanted to prepare them for what he predicted would be an onslaught of controversy and media attention. “You can politely or un-politely hang up,” he said. “They’re just journalists, you know.”

Philip Roth: The Biography
by Blake Bailey
W. W. Norton, 916 pp.

Roth had reason to be worried. He had already been labeled an anti-Semite because his earlier work contained unflattering Jewish American characters. Portnoy was likely to further fuel those charges and was bound to come with another dimension: Readers would unquestionably wonder whether Alexander Portnoy’s domineering and deranged parents were based on the writer’s own. 

After lunch, Roth sent his parents back home to New Jersey. Once the taxi left, his mother began to cry. “What’s wrong?” Philip’s father, Herman, asked her. “He has delusions of grandeur!” she told him.

Of course, Roth’s delusions turned out to be justified. The raunchy and emancipating novel about a guilt-stricken Jewish man’s obsession with sex and masturbation, told in the form of a rant to his psychotherapist, was an immediate best seller and a cultural landmark. Practically overnight, it made Roth into an international celebrity who could no longer dine in restaurants without someone heckling him over whether he was going to order liver (in the most infamous scene, Portnoy masturbates with a piece of liver his family will eat for dinner). 

Twenty-five books later—Roth wrote a total of 31—it remains the one for which he is probably best known.

In reality, Herman and Bess Roth were a devoted and doting set of parents who bore only mild resemblances to Jack and Sophie Portnoy, and Roth’s childhood was very unlike Alexander’s. Roth enjoyed an idyllic upbringing in the Weequahic section of Newark, New Jersey, where he suffered no significant trauma. 

Still, readers were right to wonder. As Blake Bailey observes in Philip Roth, his new, eagerly anticipated, and magisterial biography of the American master, Roth loved to play on the ambiguity. “The most cunning form of disguise,” Roth wrote in The Facts, “is to wear a mask that bears the image of one’s own face.” Many of his protagonists had similarities to the author (some were even named Philip Roth). The author’s rejection of Judaism (“I don’t have a religious bone in my body”) and his adventurous sex life (he was never quite a one-woman man) served as fodder for many of his greatest works, and when not writing about himself, Roth almost always drew on people he knew. He would bring a notebook with him at all times and take notes on what he picked up from interactions with acquaintances. The habit was so pervasive that a lawyer would have to pore through his final drafts to make sure no one could sue him for libel.

Roth’s early depictions of radical individualism made him an astute chronicler of the ethos of the 1950s and ’60s, when young people across the country were railing against authority and the strictures of their parents’ generation. His most pervasive theme was the struggle of the individual against the communal (the I versus the we)—and he always sided with the individual. 


But his rebellious characters, like Portnoy, were flawed and—more than occasionally—a tad perverted. So, too, ultimately, was the individualistic uprising he channeled. By the 1980s, individual rebellion against social authority had morphed into economic libertarianism, which led to the rise of Ronald Reagan and a conservative movement animated by the belief that one should seek autonomy from government. Roth, for his part, was keenly attuned to politics, and later in his career he would offer a prophetic vision of where this ideological crusade would lead. In his dystopian The Plot Against America, written in 2004, Roth imagined what would have happened if Charles Lindbergh had defeated Franklin D. Roosevelt in 1940. In the text, Lindbergh keeps America out of World War II, Nazism triumphs in Europe, and anti-Semitism spreads in the United States. The novel gained new currency when Donald Trump won the presidency in 2016. It was from Lindbergh, after all, that Trump stole his trademark mantra of “America First.” 

Roth’s body of work, however, was hardly focused just on politics or rebellion. His writing touched on an extraordinarily wide variety of subjects, from Jewish American life and the trap of the self to sex and love (and sex without love) and the coming of death and more. What underlay it all was his unflinching commitment to telling the unvarnished truth as he saw it. 

Sometimes, this tendency brought negative attention on himself or his loved ones—as happened with his parents after Portnoy (though it did not impact their relationship; his father, in fact, would offer strangers signed copies of the novel: “From Philip Roth’s father, Herman Roth”). But as Bailey explains, Roth would go wherever his imagination took him, and “the impact on family and friends was something he worried about later, if at all.” As the author once told a young Ian McEwan, who was seeking writerly advice, “You have to write as though your parents are dead.” 

Philip Roth first became interested in literature while earning his bachelor’s degree at Bucknell University. As a sophomore, he started a literary magazine but was barely able to keep up with the mechanics of managing a publication. After getting one of his favorite professors to write an essay on the misuse of the English language, he brought a finished copy of the magazine to the professor, who quickly realized that the paragraphs in his article were arranged out of order. Similarly, Roth’s early experiments with fiction went poorly. By trying too hard to imitate J. D. Salinger, Roth later said, he wrote “very bad, very sensitive stories.”

Roth’s writing talent became clear later, while he was a graduate English student at the University of Chicago and was encouraged to write about the people and places he knew. That led to his first great piece of fiction: his short story “The Conversion of the Jews,” published in The Paris Review in 1958, about a rebellious adolescent who forces his Hebrew school teacher to say he believes in Jesus. From that came other short treasures, such as “Defender of the Faith.” Two years later he wrote Goodbye, Columbus, which won him the National Book Award at the age of 27—the youngest-ever recipient of the prize. 

These works brought Roth literary fame, but they also brought intense criticism. In “Defender of the Faith,” Roth writes about a Jewish soldier who tries to use his religion to get special treatment. As a consequence, Roth was reviled from pulpits nationwide as a self-hating Jew who gave ammunition to anti-Semites. One of the most powerful rabbis in America, Emanuel Rackman, wrote to the Anti-Defamation League, “What is being done to silence this man? Medieval Jews would have known what to do with him.” 

In an exhibition of fortitude, Roth appeared alongside Ralph Ellison for a panel discussion at Yeshiva University in 1963. Making clear that he, as a writer, had no loyalties to his ethnic group, Roth made a more profound point: If writers were forbidden from depicting flawed characters because small-minded people would draw stereotypes from them, then that, in itself, would be submission to bigotry. 

Around the same time, Roth made what he would deem his worst mistake: his first marriage. He met Margaret “Maggie” Martinson Williams in Chicago in 1956 and married her three years later, but under dubious conditions. She told him she was pregnant with his child, and despite his misgivings about the relationship—by many indications, Williams was mentally unwell—Roth agreed to marry her as long as she got an abortion. He wanted first to take her to a doctor’s office for a pregnancy test, to which she agreed. Williams then found an obviously pregnant homeless woman, asked her to pee into a cup, and passed the sample off to a nurse as her own. Never actually pregnant, she later lied about having gotten the abortion. 

Years later, Williams admitted this to Roth. “I was completely stunned on learning of her deception,” he wrote in an affidavit for his divorce case. “Our marriage had been three years of constant nagging and irritation, and now I learned that the marriage itself was based on a grotesque lie.” Their relationship finally ended in May 1968, when Williams was killed in a car accident. 

But while Roth’s relationship with Maggie caused him an excruciating amount of turmoil, she also provided him with material for his fiction. His 1974 novel My Life as a Man was about a turbulent marriage based on his own; it included the urine episode pretty much exactly as it had happened in real life. On the day he finished the final draft, Bailey writes, he wept in the shower, having “turned the shit of that marriage into a book.”

It was not Roth’s last bad marriage. His second union, with the actress Claire Bloom, from 1990 to 1995, was another source of hardship, humiliation, and, eventually, literary inspiration. Her tell-all 1996 memoir Leaving the Doll’s House was one of the reasons Roth enlisted a biographer in the first place: to set, in his eyes, the record straight. Bloom alleged in the book that Roth was a misogynist and toxic bully who would not let her daughter, Anna Steiger, live with them. According to Bailey, Roth was disturbed by the friction Anna (then in her 30s) brought to their marriage and wrote Bloom a letter asking if Anna could live elsewhere, but he didn’t force her out; in fact, she continued to live with them for six months after he sent the request. That said, Bailey’s account still shows Roth to have been far from a perfect husband: he had multiple affairs during their marriage and could be callously indifferent to Bloom’s needs and struggles. 

Shortly after the release of Doll’s House, Roth checked into a residential treatment center, suffering from suicidal depression. The book opened a wound that would haunt him for the rest of his life: the fear that the unlitigated accusations would forever stain his reputation. For that reason, his next novel, I Married a Communist, in 1998, was about a teacher who becomes the target of a McCarthy-era witch hunt, precipitated by an anti-Semitic wife who destroys him.

Despite Roth’s firm commitment from the outset to going wherever his mind took him, his best writing came after both of his parents had died. For all his depictions of characters like himself, who revolt against the expectations of a good Jewish boy, Roth was himself as sensitive and loving toward his parents as one could be.

His most tender book, Patrimony, from 1991, is a memoir of his father dying from a brain tumor. It showcases, perhaps better than anything else Roth wrote, one of his major contradictions: his unmistakable filial piety and his commitment to the integrity of the story, no matter the costs. In the most memorable passage, Roth writes about a time his father, after a biopsy, lost control of his bowels at Roth’s Connecticut home. The son cleans up the shit with a monkish intensity—on the toilet, on the towels, in his hair. Roth gets on his knees and uses a toothbrush to get it out of the crevices between the tiles. Many readers at the time objected to Roth including the embarrassing episode; he had promised his father—in the text!—never to tell anyone. But for Roth, the moment was filled with a kind of spiritual transcendence. It was “one of the most extraordinary and wonderful things that has ever happened to me,” he wrote his friend and rival John Updike. 

The shit-cleaning scene sets up the book’s thesis. Earlier, Roth helps his father set up a will and forgoes his inheritance; he’s a rich man by this point, after all. He advises his father to leave his money to his nephews (the father’s grandsons), who could use it more. Instead, Roth says, he got the experience of washing up for his father. “So that was my patrimony. And not because cleaning it up was symbolic of something else but because it wasn’t, because it was nothing less or more than the lived reality of what it was.” To Roth, the moment is so intensely meaningful because it is stripped of any clichés about what a son “gets” from his father. 

Paradoxically, though, by making it a seminal scene in his book, Roth achieved the opposite effect. As the great Roth critic Mark Schechner once wrote, Roth’s memoir shows how his patrimony was, in fact, “nothing less than his own character: his humor, his stories, his own iron will, vernacular heart, and toughness of mind.” 

Four years later, Roth released his masterpiece, the death-obsessed Sabbath’s Theater, the first novel he wrote from start to finish after both of his parents had died. He was now at his most free to write about grotesque or harsh realities.


An incomparable literary experience, Sabbath moves from the profound to the profane with remarkable fluency, able to at once shock you and disgust you, and then move you. The novel begins with a disgraced former puppeteer and professor, Mickey Sabbath, in his 60s, who regularly masturbates on the grave of his dead lover. The rest follows Sabbath as he plans to commit suicide, with his mother’s voice in his head telling him that it is the proper end to his failed life. Meanwhile, his behavior pushes all boundaries—sexual, social, and moral. It was Roth at his most willing to probe a character who would “let the repellant in,” as he would say, and show no remorse. 

But the book, for all its dirtiness, contains some of the most touching passages of any novel written in the 20th century, such as when Sabbath goes looking for the graves of his grandparents, parents, and brother Morty, who died in World War II, as he reflects on his own looming death. It becomes an elegy for the family members we’ll never speak with again. (It was the passage Roth chose to read aloud when he celebrated his 80th birthday at the Newark Public Library in 2013.) 

More challenging to readers is a haunting passage toward the end, when Sabbath recollects visiting his mistress, Drenka Balich, before she died. (The character was based on one of Roth’s longtime lovers.) Together, they remember, with transcendent wonder, the joy of having once pissed on each other. It is, in a sense, classic Roth—the idea of sex as freedom and transgression, but also of sex as protest. In Sabbath’s Theater, it becomes a protest against mortality itself. 

Drenka’s death leaves Sabbath grief-stricken and feeling ready to die, but what’s noteworthy is that Sabbath lived as Portnoy, a New York City political staffer, wanted to but never dared: in Freudian terms, fully releasing the id and abandoning the ego and superego. As William Blake wrote, “The road of excess leads to the palace of wisdom.” Maybe not for Roth’s characters, or even Roth the man, but perhaps for Roth the writer. 

Sabbath’s Theater won the National Book Award—35 years after Roth won it for Goodbye, Columbus—and his next three books, his “American Trilogy,” would all earn major prizes, including the Pulitzer for American Pastoral and the PEN/Faulkner Award for The Human Stain.

Toward the end of his career, Roth wrote lesser novels—like The Humbling and Indignation—but he did not lose his ability to tap into, or even anticipate, the zeitgeist. His elegant final novel, Nemesis, was about the polio epidemic of the 1940s. It depicts the terror of living with the spread of an infectious disease that has paralyzed and tortured a community. All of us, unfortunately, know that phenomenon too well by now. 

It was shocking to many when, in 2012, he announced his retirement from writing. But (unbeknownst to the world at the time) he had been living with coronary heart disease for decades, and he had always disliked the way Saul Bellow churned out weaker work after his mental acuity started ebbing. It didn’t hurt that Roth already had cemented his status as one of the greats, whose name would be in the same category as William Faulkner, Ernest Hemingway, F. Scott Fitzgerald, and Toni Morrison. He would quote the heavyweight champion Joe Louis, “I did the best I could with what I had.” 

And, much as with Louis, no one could question Roth’s commitment to his craft. He was a prisoner to his writing routine and was resolutely dedicated to the cause of literature overall. During the 1970s and ’80s, in one of his greatest acts of service, he engineered the worldwide publication of dissident Czech writers living under totalitarian rule, such as Milan Kundera. 

Though he was no longer producing new work, Roth made the news each autumn when the Nobel Prize winners were set to be announced. It became a running joke that the Swedish Academy would snub America’s greatest living writer every year. “He’s not terribly politically correct, you know, and they are,” said Harold Bloom. By the time the prize for literature went to Bob Dylan in 2016, Roth had accepted the injustice. When asked what he thought of it, he said, “It’s okay, but next year I hope Peter, Paul, and Mary get it.” 

Still, he would win every other major literary prize and was highly honored in his final years. In 2011, President Barack Obama gave him a National Humanities Medal. “How many young people learned to think,” Obama said, pausing for dramatic effect, “by reading the exploits of Portnoy and his complaints?”

Note: After this review appeared in the Washington Monthly, W.W. Norton, the book’s publisher, paused sales of the biography while investigating claims of sexual assault leveled against author Blake Bailey.
